WorldWideScience

Sample records for reaction environments computational

  1. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

To examine the reaction time of human subjects processing information presented in the visual channel under both a direct vision and a virtual rehabilitation environment while walking. Visual stimuli consisted of eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and their reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program with a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments; reaction time changes between direct vision and virtual environments.

  2. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
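
    To make the logged-event/undo mechanism concrete, here is a minimal Python sketch of the pattern the abstract describes; the Logbook class, its method names, and the undo-callback design are illustrative assumptions, not the patented implementation.

      import time

      class Logbook:
          """Minimal sketch of an event logbook with search and undo."""
          def __init__(self):
              self._history = []  # list of (timestamp, description, undo_fn)

          def log(self, description, undo_fn):
              # Record an event together with a callable that reverses it.
              self._history.append((time.time(), description, undo_fn))

          def search(self, keyword):
              # Return all past events whose description mentions the keyword.
              return [e for e in self._history if keyword in e[1]]

          def undo(self, event):
              # Reverse a selected past event and drop it from the history.
              self._history.remove(event)
              event[2]()

      # Usage: log a file rename within the environment, then undo it.
      renames = {}
      log = Logbook()
      renames["a.txt"] = "b.txt"
      log.log("rename a.txt -> b.txt", lambda: renames.pop("a.txt"))
      for event in log.search("rename"):
          log.undo(event)
      print(renames)  # {} -- the rename has been undone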

  3. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

IHEP's computing environment consists of several different computing environments established on the IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) computing environment. The direction of development towards distributed computing for the IHEP computing environment, based on present trends in distributed computing, is presented.

  4. Modelling human behaviours and reactions under dangerous environment

    OpenAIRE

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers, in real time, in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions...

  5. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

Workflows have been utilized to characterize various forms of applications with high processing and storage space demands. To make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  6. Identifying Reaction Pathways and their Environments

    DEFF Research Database (Denmark)

    Maronsson, Jon Bergmann

Finding the mechanisms and estimating the rates of chemical reactions is an essential part of modern research on atomic-scale systems. In this thesis, the application of well-established methods for reaction rates and paths to systems important for hydrogen storage is considered before developing... extensions to further identify the reaction environment for a more accurate rate. Complex borohydrides are materials of high hydrogen storage capacity and high thermodynamic stability (too high for hydrogen storage). In an effort to gain insight into the structural transitions of two such materials, Ca(BH4... -interstitial defects. In good agreement with the experiments, C3-type rotations activate at lower temperature than C2-type rotations. In order to investigate the environment of reaction pathways, a method for finding the ridge between first-order saddle points on a multidimensional surface was developed...

  7. Printing in Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Karapantelakis, Athanasios; Delvic, Alisa; Zarifi Eslami, Mohammed; Khamit, Saltanat

    Document printing has long been considered an indispensable part of the workspace. While this process is considered trivial and simple for environments where resources are ample (e.g. desktop computers connected to printers within a corporate network), it becomes complicated when applied in a mobile

  8. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  9. Airborne Cloud Computing Environment (ACCE)

    Science.gov (United States)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in Earth observation.

  10. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    Zhang Bin; Zhu Jizhou

    2006-01-01

In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Due to sodium's high chemical reactivity, sodium would react with concrete violently, releasing large amounts of hydrogen gas and heat, which would threaten the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It gives the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat and so on. Concrete was considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced into the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and boundary depth was suitably transformed into that of penetration rate and boundary depth. The complex chemical kinetics equations were simplified under certain hypotheses. The techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are described in detail. Good agreement in overall transient behavior was obtained in a series of sodium-concrete reaction experiment analyses. The comparison between analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions, and could be used for nuclear safety judgement. (authors)

  11. Reaction-Diffusion Automata Phenomenology, Localisations, Computation

    CERN Document Server

    Adamatzky, Andrew

    2013-01-01

Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, the media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. These media are at the heart of most natural processes, including morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm of studying reaction-diffusion and excitable media using locally-connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential user-friendly tool for modelling natural systems and designing future and emergent computing arch...

  12. Computational Approach to Electron Charge Transfer Reactions

    DEFF Research Database (Denmark)

    Jónsson, Elvar Örn

...-molecular mechanics scheme, and tools to analyse statistical data and generate relative free energies and free energy surfaces. The methodology is applied to several charge transfer species and reactions in chemical environments - chemical in the sense that solvent, counter ions and substrate surfaces are taken... into account - which directly influence the reactants and resulting reaction through both physical and chemical interactions. All methods are, however, general and can be applied to different types of chemistry. First, the basis of the various theoretical tools is presented and applied to several test systems... and asymmetric charge transfer reactions between several first-row transition metals in water. The results are compared to experiments and rationalised with classical analytic expressions. Shortcomings of the methods are accounted for, with clear steps towards improved accuracy. Later the analysis is extended...

  13. Vanderbilt University: Campus Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    Despite the decentralized nature of computing at Vanderbilt, there is significant evidence of cooperation and use of each other's resources by the various computing entities. Planning for computing occurs in every school and department. Caravan, a campus-wide network, is described. (MLW)

  14. Modeling human behaviors and reactions under dangerous environment.

    Science.gov (United States)

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers, in real time, in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures with the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, decision making, movements, and interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, safety planning in chemical factories, and the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human vision, smell, touch and hearing, and the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  15. Plasmon-driven sequential chemical reactions in an aqueous environment.

    Science.gov (United States)

    Zhang, Xin; Wang, Peijie; Zhang, Zhenglong; Fang, Yurui; Sun, Mengtao

    2014-06-24

    Plasmon-driven sequential chemical reactions were successfully realized in an aqueous environment. In an electrochemical environment, sequential chemical reactions were driven by an applied potential and laser irradiation. Furthermore, the rate of the chemical reaction was controlled via pH, which provides indirect evidence that the hot electrons generated from plasmon decay play an important role in plasmon-driven chemical reactions. In acidic conditions, the hot electrons were captured by the abundant H(+) in the aqueous environment, which prevented the chemical reaction. The developed plasmon-driven chemical reactions in an aqueous environment will significantly expand the applications of plasmon chemistry and may provide a promising avenue for green chemistry using plasmon catalysis in aqueous environments under irradiation by sunlight.

  16. Embedding Moodle into Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus

    2010-01-01

    Glahn, C., & Specht, M. (2010). Embedding Moodle into Ubiquitous Computing Environments. In M. Montebello, et al. (Eds.), 9th World Conference on Mobile and Contextual Learning (MLearn2010) (pp. 100-107). October, 19-22, 2010, Valletta, Malta.

  17. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

In the cloud computing environment, the growing number of cloud virtual machines (VMs) makes VM security and management face giant challenges. In order to address security issues in the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, and studies the virtual machine security architecture based on AHP (Analytic Hierarchy Process) virtual machine de...

  18. Intelligent computing for sustainable energy and environment

    Energy Technology Data Exchange (ETDEWEB)

Li, Kang [Queen's Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Li, Shaoyuan; Li, Dewei [Shanghai Jiao Tong Univ., Shanghai (China). Dept. of Automation; Niu, Qun (eds.) [Shanghai Univ. (China). School of Mechatronic Engineering and Automation

    2013-07-01

This book constitutes the refereed proceedings of the Second International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2012, held in Shanghai, China, in September 2012. The 60 full papers presented were carefully reviewed and selected from numerous submissions; they present theories and methodologies as well as emerging applications of intelligent computing in sustainable energy and environment.

  1. Reaction Diffusion Voronoi Diagrams: From Sensors Data to Computing

    Directory of Open Access Journals (Sweden)

    Alejandro Vázquez-Otero

    2015-05-01

In this paper, a new method to solve computational problems using reaction diffusion (RD) systems is presented. The novelty relies on the use of a model configuration that tailors its spatiotemporal dynamics to develop Voronoi diagrams (VD) as a part of the system's natural evolution. The proposed framework is deployed in the solution of related robotic problems, where generalized VD are used to identify topological places in a grid map of the environment that is created from sensor measurements. The ability of RD-based computation to integrate external information, like a grid map representing the environment in the model computational grid, permits a direct integration of sensor data into the model dynamics. The experimental results indicate that this method exhibits significantly less sensitivity to noisy data than the standard algorithms for determining VD in a grid. In addition, previous drawbacks of the computational algorithms based on RD models, like the generation of volatile solutions by means of excitable waves, are now overcome by final stable states.
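
    The RD dynamics themselves are beyond a short example, but their end product - a Voronoi diagram grown by fronts propagating simultaneously from seed points over a grid map - can be sketched with a multi-source breadth-first search, a conventional stand-in (assumed here, not the authors' RD model) for the travelling wavefronts:

      from collections import deque

      def grid_voronoi(shape, seeds):
          """Label every cell of a grid with the index of its nearest seed by
          growing all fronts simultaneously (multi-source BFS) -- a discrete
          analogue of the wavefronts an RD system would propagate."""
          rows, cols = shape
          label = [[None] * cols for _ in range(rows)]
          queue = deque()
          for i, (r, c) in enumerate(seeds):
              label[r][c] = i
              queue.append((r, c))
          while queue:
              r, c = queue.popleft()
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols and label[nr][nc] is None:
                      label[nr][nc] = label[r][c]  # first-arriving front claims the cell
                      queue.append((nr, nc))
          return label

      # Cells where differently labelled fronts meet trace the Voronoi boundaries.
      print(grid_voronoi((5, 7), [(0, 0), (4, 6)]))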

  2. Computed potential energy surfaces for chemical reactions

    Science.gov (United States)

    Walch, Stephen P.

    1988-01-01

The minimum energy path for the addition of a hydrogen atom to N2 is characterized in CASSCF/CCI calculations using the (4s3p2d1f/3s2p1d) basis set, with additional single point calculations at the stationary points of the potential energy surface using the (5s4p3d2f/4s3p2d) basis set. These calculations represent the most extensive set of ab initio calculations completed to date, yielding a zero-point-corrected barrier for HN2 dissociation of approximately 8.5 kcal/mol. The lifetime of the HN2 species is estimated from the calculated geometries and energetics using both conventional Transition State Theory and a method which utilizes an Eckart barrier to compute one-dimensional quantum mechanical tunneling effects. It is concluded that the lifetime of the HN2 species is very short, greatly limiting its role in both termolecular recombination reactions and combustion processes.
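
    For reference, the conventional Transition State Theory estimate invoked above has the standard textbook form (quoted generically; the abstract does not give the authors' exact working equations):

      k(T) = \kappa(T)\,\frac{k_{B}T}{h}\,\frac{Q^{\ddagger}(T)}{Q_{\mathrm{HN}_2}(T)}\,\exp\!\left(-\frac{E_{0}}{k_{B}T}\right)

    where the Q are partition functions, E_0 is the zero-point-corrected barrier (about 8.5 kcal/mol here), and \kappa(T) is a transmission coefficient such as the one-dimensional Eckart tunneling factor mentioned in the abstract.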

  3. The reaction environment in a filter-press laboratory reactor: the FM01-LC flow cell

    International Nuclear Information System (INIS)

    Rivera, Fernando F.; León, Carlos Ponce de; Walsh, Frank C.; Nava, José L.

    2015-01-01

A parallel plate cell facilitating controlled flow in a rectangular channel and capable of incorporating a wide range of electrode materials is important in studies of electrode reactions prior to process development and scale-up. The FM01-LC is a versatile laboratory-scale, plane parallel, filter-press type electrochemical cell (having a projected electrode area of 64 cm²) based on the larger FM21-SP electrolyser (2100 cm² area). Many laboratories have used this type of reactor to quantify the importance of the reaction environment in fundamental studies and to prepare for industrial applications. A number of papers have concerned the experimental characterization and computational modelling of its reaction environment, but the experimental and computational data have become dispersed. The cell has been used in a diverse range of synthesis and processing applications which require controlled flow and a known reaction environment. In a previous review, the cell construction and reaction environment were summarised, followed by an illustration of its use in a range of applications that include organic and inorganic electrosynthesis, metal ion removal, energy storage, environmental remediation (e.g., metal recycling or anodic destruction of organics) and drinking water treatment. This complementary review considers the characteristics of the FM01-LC electrolyser as an example of a well-engineered flow cell facilitating cell scale-up and provides a rigorous analysis of its reaction environment. Particular aspects include the influence of electrolyte velocity on mass transport rates, flow dispersion and current distribution.

  4. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  5. Physiological environment induces quick response - slow exhaustion reactions

    Directory of Open Access Journals (Sweden)

Noriko Hiroi

    2011-09-01

In vivo environments are highly crowded and inhomogeneous, which may affect reaction processes in cells. In this study we examined the effects of intracellular crowding and inhomogeneity on the behavior of in vivo reactions by calculating the spectral dimension (ds), which can be translated into the reaction rate function. We compared estimates of anomaly parameters obtained from Fluorescence Correlation Spectroscopy (FCS) data with fractal dimensions derived from Transmission Electron Microscopy (TEM) image analysis. FCS analysis indicated that the anomalous property was linked to physiological structure. Subsequent TEM analysis provided an in vivo illustration: soluble molecules likely percolate between intracellular clusters, which are constructed in a self-organizing manner. We estimated the cytoplasmic spectral dimension ds to be 1.39 ± 0.084. This result suggests that in vivo reactions initially run faster than the same reactions in a homogeneous space; this conclusion is consistent with the anomalous character indicated by the FCS analysis. We further showed that these results were compatible with our Monte Carlo simulation, in which the anomalous behavior of mobile molecules correlates with the intracellular environment, described as a percolation cluster, as demonstrated using TEM analysis. We confirmed by simulation that the above-mentioned in vivo-like properties differ from those of homogeneously concentrated environments. Additionally, the simulation results indicated that the crowding level of an environment might affect the diffusion rate of reactants. Such knowledge of the spatial information enables us to construct realistic models for in vivo diffusion and reaction systems.
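
    The translation of the spectral dimension into a reaction rate function mentioned above is conventionally done through fractal-like (Kopelman) kinetics; assuming that standard form,

      k(t) = k_{0}\,t^{-h}, \qquad h = 1 - \frac{d_{s}}{2}

    the reported d_s = 1.39 ± 0.084 gives h ≈ 0.31: the effective rate coefficient is initially high and decays with time, matching the quick-response, slow-exhaustion behavior of the title.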

  6. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  7. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  8. Scheduling multimedia services in cloud computing environment

    Science.gov (United States)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model including both subjective trust and objective trust to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed, considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically hunts for reasonable resource allocations that satisfy the trust requirement and meet deadlines for the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
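
    A minimal sketch of the flavor of such a scheme follows: a Beta-posterior mean stands in for the Bayesian subjective trust, normalized QoS attributes average into objective trust, and candidates failing the trust, deadline, or cost requirements are filtered out. The weighting, field names, and selection rule are illustrative assumptions, not the paper's exact model.

      def subjective_trust(successes, failures):
          # Beta-posterior mean: a common Bayesian estimate from past interactions.
          return (successes + 1) / (successes + failures + 2)

      def objective_trust(qos):
          # Average of QoS attributes already normalized to [0, 1].
          return sum(qos.values()) / len(qos)

      def combined_trust(provider, weight=0.5):
          return (weight * subjective_trust(provider["ok"], provider["fail"])
                  + (1 - weight) * objective_trust(provider["qos"]))

      def schedule(task, providers, min_trust=0.6):
          # Keep providers meeting the trust, deadline and cost requirements,
          # then pick the most trusted of the feasible candidates.
          feasible = [p for p in providers
                      if combined_trust(p) >= min_trust
                      and p["latency"] <= task["deadline"]
                      and p["cost"] <= task["budget"]]
          return max(feasible, key=combined_trust, default=None)

      providers = [
          {"name": "A", "ok": 40, "fail": 2, "qos": {"avail": 0.99, "bw": 0.8},
           "latency": 5, "cost": 3},
          {"name": "B", "ok": 5, "fail": 5, "qos": {"avail": 0.90, "bw": 0.6},
           "latency": 2, "cost": 1},
      ]
      print(schedule({"deadline": 6, "budget": 4}, providers)["name"])  # "A"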

  9. Computational prediction of chemical reactions: current status and outlook.

    Science.gov (United States)

    Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A

    2018-06-01

    Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov (United States)

Research divisions include Biosciences (BIO), Computational Science (CPS), and Data Science and Learning (DSL), along with the Environmental Science Division, the Mathematics and Computer Science Division, and the Argonne Leadership Computing Facility.

  11. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
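
    To illustrate one of the listed components, a cumulative dose-volume histogram reduces a 3D dose grid plus a binary structure mask to the fraction of the structure receiving at least each dose level. A generic numpy sketch follows (CERR itself is MATLAB-based, so this is a re-expression of the idea, not CERR code):

      import numpy as np

      def cumulative_dvh(dose, mask, bins=100):
          """Fraction of the masked structure's voxels receiving >= each dose level."""
          d = dose[mask]                        # doses inside the structure
          levels = np.linspace(0, d.max(), bins)
          volume_fraction = np.array([(d >= lv).mean() for lv in levels])
          return levels, volume_fraction

      # Toy 3D dose grid (Gy) and a spherical "structure" mask.
      dose = np.random.gamma(shape=9.0, scale=7.0, size=(32, 32, 32))
      z, y, x = np.ogrid[:32, :32, :32]
      mask = (x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2 < 10 ** 2
      levels, vf = cumulative_dvh(dose, mask)
      print(f"V50 (fraction of structure receiving >= 50 Gy): {vf[levels >= 50][0]:.2f}")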

  12. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).

  13. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).

  14. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which deals with the administration of the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating on the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are matched by a two-level NFS-based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of the centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OS, and upgrading the existing TDAQ hardware components, authentication servers and the gateways.

  15. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt it to the ATLAS distributed computing and data environment, especially its data management system RUCIO, to help improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used with the HTTP protocol. We are optimizing Xcache for HPC environments, and adapting the HL-LHC Data Lakes design as its component for data delivery. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize the deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,

  16. Astrophysical Nuclear Reaction Rates in the Dense Metallic Environments

    Science.gov (United States)

    Kilic, Ali Ihsan

    2017-09-01

    Nuclear reaction rates can be enhanced by many orders of magnitude in dense and relatively cold astrophysical plasmas such as in white dwarfs, brown dwarfs, and giant planets. Similar conditions are also present in supernova explosions where the ignition conditions are vital for cosmological models. White dwarfs are compact objects that have both extremely high interior densities and very strong local magnetic fields. For the first time, a new formula has been developed to explain cross section and reaction rate quantities for light elements that includes not only the nuclear component but also the material dependence, magnetic field, and crystal structure dependency in dense metallic environments. I will present the impact of the developed formula on the cross section and reaction rates for light elements. This could have possible technological applications in energy production using nuclear fusion reactions.

  17. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  18. Printing in heterogeneous computer environment at DESY

    International Nuclear Information System (INIS)

    Jakubowski, Z.

    1996-01-01

The number of registered hosts at DESY reaches 3500, while the number of print queues approaches 150. The spectrum of computing environments in use is very wide: from Macs and PCs, through SUN, DEC and SGI machines, to the IBM mainframe. In 1994 we used 18 tons of paper. We present a solution for providing print services in such an environment for more than 3500 registered users. The availability of the print service is a serious issue. Centralized printing has a lot of advantages for software administration but creates a single point of failure. We solved this problem partially, without using expensive software and hardware. The talk provides information about the DESY central print spooler concept. None of the systems available on the market provides a ready-to-use, reliable solution for all platforms used at DESY. We discuss concepts for the installation, administration and monitoring of a large number of printers. We found a solution for printing both on central computing facilities and in support of stand-alone workstations. (author)

  19. Reach and get capability in a computing environment

    Science.gov (United States)

Bouchard, Ann M [Albuquerque, NM]; Osbourn, Gordon C [Albuquerque, NM]

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
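
    A toy Python sketch of the described sequence - mark a reach location, navigate elsewhere, then "get" copies the object back and returns there - with class and method names invented for illustration:

      class Environment:
          """Toy stand-in for a navigable computing environment."""
          def __init__(self):
              self.location = "home"     # where the user currently is
              self.contents = {}         # objects stored at each location
              self._reach_mark = None

          def navigate(self, place):
              self.location = place

          def reach(self):
              # Invoke the reach command: remember the current location.
              self._reach_mark = self.location

          def get(self, obj):
              # Copy the object into the reach location and navigate back to it.
              self.contents.setdefault(self._reach_mark, []).append(obj)
              self.location = self._reach_mark

      env = Environment()
      env.reach()                  # mark "home" as the reach location
      env.navigate("documents")    # browse elsewhere
      env.get("report.txt")        # report.txt is copied back to "home"
      print(env.location, env.contents)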

  20. Accurate atom-mapping computation for biochemical reactions.

    Science.gov (United States)

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to account for the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
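
    The full MWED model is a MILP over bond edit costs; a much-simplified relative of the idea - assign each reactant atom to a product atom of the same element at minimum total cost - can be sketched as a linear assignment problem. The toy distance cost below is a placeholder assumption, not the paper's bond-propensity weights.

      import numpy as np
      from scipy.optimize import linear_sum_assignment

      def map_atoms(reactant, product, mismatch_penalty=1e6):
          """Assign each reactant atom to a product atom of the same element,
          minimizing a toy distance cost (stand-in for edit-distance weights)."""
          cost = np.zeros((len(reactant), len(product)))
          for i, (el_i, x_i) in enumerate(reactant):
              for j, (el_j, x_j) in enumerate(product):
                  # Forbid element changes; otherwise prefer "nearby" atoms.
                  cost[i, j] = abs(x_i - x_j) if el_i == el_j else mismatch_penalty
          rows, cols = linear_sum_assignment(cost)
          return list(zip(rows, cols))

      # Toy reaction: atoms given as (element, 1-D coordinate).
      reactant = [("C", 0.0), ("C", 1.5), ("O", 3.0)]
      product = [("O", 2.9), ("C", 1.4), ("C", 0.1)]
      print(map_atoms(reactant, product))   # [(0, 2), (1, 1), (2, 0)]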

  1. Computer Applications and Virtual Environments (CAVE)

    Science.gov (United States)

    1993-01-01

Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) facility used Head Mounted Displays (HMDs) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine the functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup was to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  2. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturbing permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been performed. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to the model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PC analysis is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching). The delineation of domains requires calculation of the Hessian, which could be computationally costly and as well restricts the current approach to
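
    A numpy sketch of the delineation step as described: eigendecompose the (symmetric) Hessian, keep the most sensitive directions, and group grid blocks by the dominant eigenvector they load on. The matrix data, the number of domains, and the grouping rule are illustrative assumptions.

      import numpy as np

      def delineate_domains(hessian, n_domains=2):
          """Principal-component analysis of the sensitivity (Hessian) matrix:
          grid blocks loading on the same dominant eigenvector form one domain
          and share a single deformation parameter rD."""
          w, v = np.linalg.eigh(hessian)                 # ascending eigenvalues
          top = v[:, np.argsort(w)[::-1][:n_domains]]    # most sensitive directions
          # Assign each parameter (grid block) to its dominant component.
          return np.argmax(np.abs(top), axis=1)

      rng = np.random.default_rng(0)
      a = rng.normal(size=(6, 6))
      hessian = a @ a.T                  # symmetric positive semidefinite stand-in
      print(delineate_domains(hessian))  # domain index per grid block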

  3. Ubiquitous computing in shared-care environments.

    Science.gov (United States)

    Koch, S

    2006-07-01

    In light of future challenges, such as growing numbers of elderly, increase in chronic diseases, insufficient health care budgets and problems with staff recruitment for the health-care sector, information and communication technology (ICT) becomes a possible means to meet these challenges. Organizational changes such as the decentralization of the health-care system lead to a shift from in-hospital to both advanced and basic home health care. Advanced medical technologies provide solutions for distant home care in form of specialist consultations and home monitoring. Furthermore, the shift towards home health care will increase mobile work and the establishment of shared care teams which require ICT-based solutions that support ubiquitous information access and cooperative work. Clinical documentation and decision support systems are the main ICT-based solutions of interest in the context of ubiquitous computing for shared care environments. This paper therefore describes the prerequisites for clinical documentation and decision support at the point of care, the impact of mobility on the documentation process, and how the introduction of ICT-based solutions will influence organizations and people. Furthermore, the role of dentistry in shared-care environments is discussed and illustrated in the form of a future scenario.

  4. Computer-assisted mechanistic evaluation of organic reactions

    Energy Technology Data Exchange (ETDEWEB)

    Gushurst, A.J.

    1988-01-01

CAMEO, an interactive computer program which predicts the products of organic reactions given starting materials and conditions, has been refined and extended in the area of base-catalyzed and nucleophilic processes. The present capabilities of the program are outlined, including brief discussion of the major segments in CAMEO: graphics, perception, and reaction evaluation. The implementation of general algorithms for predicting the acidities of a vast number of organic compounds to within 2 pKa units in dimethylsulfoxide and water is then described, followed by a presentation of the reactivity rules used by the program to evaluate nucleophilic reactions. Finally, a treatment of sulfur and phosphorus ylides, iminophosphoranes, and P=X-activated anions is given, illuminating the various competitions available for these reagents, such as between proton transfer and addition, 1,2- and 1,4-addition, and the Peterson, Wittig, and Horner-Emmons olefination reactions.

  5. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to real needs. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of the cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  6. Specialized Computer Systems for Environment Visualization

    Science.gov (United States)

    Al-Oraiqat, Anas M.; Bashkov, Evgeniy A.; Zori, Sergii A.

    2018-06-01

The need for real time image generation of landscapes arises in various fields as part of tasks solved by virtual and augmented reality systems, as well as geographic information systems. Such systems provide opportunities for collecting, storing, analyzing and graphically visualizing geographic data. Algorithmic and hardware-software tools for increasing the realism and efficiency of environment visualization in 3D visualization systems are proposed. This paper discusses a modified path tracing algorithm with a two-level hierarchy of bounding volumes and finding intersections with Axis-Aligned Bounding Boxes. The proposed algorithm eliminates branching and hence is more suitable for implementation on multi-threaded CPUs and GPUs. A modified ROAM algorithm is used to solve the problems of qualitative visualization of reliefs and landscapes. The algorithm is implemented on parallel systems: clusters and Compute Unified Device Architecture networks. Results show that the implementation on MPI clusters is more efficient than on Graphics Processing Unit/Graphics Processing Clusters and allows real-time synthesis. The organization and algorithms of the parallel GPU system for 3D pseudo-stereo image/video synthesis are proposed. After analyzing the possibility of realizing each stage on a parallel GPU architecture, 3D pseudo-stereo synthesis is performed. An experimental prototype of a specialized hardware-software system for 3D pseudo-stereo imaging and video was developed on the CPU/GPU. The experimental results show that the proposed adaptation of 3D pseudo-stereo imaging to the architecture of GPU systems is efficient. It also accelerates the computational procedures of 3D pseudo-stereo synthesis for the anaglyph and anamorphic formats of the 3D stereo frame without performing optimization procedures. The acceleration is on average 11 and 54 times for the test GPUs.
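
    The branch elimination mentioned above can be illustrated with the standard slab method for ray/Axis-Aligned Bounding Box intersection, where min/max operations replace conditionals - a generic sketch, not the authors' implementation:

      import numpy as np

      def ray_aabb(origin, inv_dir, box_min, box_max):
          """Slab test: intersect a ray with an axis-aligned box without branching.
          inv_dir = 1 / direction, precomputed; IEEE infinities handle rays that
          are parallel to an axis."""
          t1 = (box_min - origin) * inv_dir
          t2 = (box_max - origin) * inv_dir
          t_near = np.max(np.minimum(t1, t2))   # latest entry over the three slabs
          t_far = np.min(np.maximum(t1, t2))    # earliest exit
          return t_near <= t_far and t_far >= 0.0

      origin = np.array([0.0, 0.0, -5.0])
      direction = np.array([0.0, 0.0, 1.0])
      with np.errstate(divide="ignore"):
          inv_dir = 1.0 / direction
      print(ray_aabb(origin, inv_dir, np.array([-1.0, -1.0, -1.0]),
                     np.array([1.0, 1.0, 1.0])))   # True: the ray hits the unit box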

  7. A computational glance at organometallic cyclizations and coupling reactions

    OpenAIRE

    Fiser, Béla

    2016-01-01

210 p. Organometallic chemistry is one of the main research topics in chemical science. Nowadays, organometallic reactions are the subject of intensive theoretical investigations. However, in many cases, only joint experimental and theoretical efforts could reveal the answers we are looking for. The fruits of such experimental and theoretical co-operation are presented here. In this work, we deal with homogeneous organometallic catalysis using computational chemical tools....

  8. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  9. The sociability of computer-supported collaborative learning environments

    NARCIS (Netherlands)

    Kreijns, C.J.; Kirschner, P.A.; Jochems, W.M.G.

    2002-01-01

    There is much positive research on computer-supported collaborative learning (CSCL) environments in asynchronous distributed learning groups (DLGs). There is also research that shows that contemporary CSCL environments do not completely fulfil expectations on supporting interactive group learning,

  10. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

In this article the authors describe the possibility of applying artificial immune systems for the protection of distributed computing environments from certain types of malicious impacts.

  11. InSAR Scientific Computing Environment

    Science.gov (United States)

Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.

    2011-01-01

This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and
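
    The basic repeat-pass InSAR measurement that such a system reduces can be stated in one line: the interferogram is the per-pixel phase of one single-look complex image multiplied by the conjugate of the other. A toy numpy sketch (illustrative only, not code from this framework):

      import numpy as np

      def interferogram(slc1, slc2):
          """Per-pixel interferometric phase of two co-registered single-look
          complex (SLC) images; the phase is proportional to path-length change."""
          return np.angle(slc1 * np.conj(slc2))

      # Two toy SLCs differing by a smooth deformation-like phase ramp.
      rng = np.random.default_rng(1)
      amp = rng.rayleigh(size=(4, 4))
      ramp = np.linspace(0, np.pi, 16).reshape(4, 4)
      slc1 = amp * np.exp(1j * 0.0)
      slc2 = amp * np.exp(-1j * ramp)
      print(interferogram(slc1, slc2))   # recovers the phase ramp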

  12. Understanding organometallic reaction mechanisms and catalysis: experimental and computational tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    Exploring and highlighting the new horizons in the studies of reaction mechanisms that open joint application of experimental studies and theoretical calculations is the goal of this book. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  13. High performance computing network for cloud environment using simulators

    OpenAIRE

    Singh, N. Ajith; Hemalatha, M.

    2012-01-01

    Cloud computing is the next generation of computing. Adopting cloud computing is like signing up for a new kind of website: the GUI that fronts the cloud lets users control hardware resources and applications directly. The difficult part of cloud computing is deploying it in a real environment. It is hard to know the exact cost and resource requirements until the service is actually purchased, and hard to know whether it will support the existing applications available on traditional...

  14. Radiolytic oxidation of propane: computer modeling of the reaction scheme

    International Nuclear Information System (INIS)

    Gupta, A.K.; Hanrahan, R.J.

    1991-01-01

    The oxidation of gaseous propane under gamma radiolysis was studied at 100 torr pressure and 25 °C, at oxygen pressures from 1 to 15 torr. Major oxygen-containing products and their G-values with 10% added oxygen are as follows: acetone, 0.98; i-propyl alcohol, 0.86; propionaldehyde, 0.43; n-propyl alcohol, 0.11; acrolein, 0.14; and allyl alcohol, 0.038. The formation of major oxygen-containing products was explained on the basis that the alkyl radicals combine with molecular oxygen to give peroxyl radicals; the peroxyl radicals react with one another to give alkoxyl radicals, which in turn react with one another to form carbonyl compounds and alcohols. The reaction scheme for the formation of major products was examined using computer modeling based on a mechanism involving 28 reactions. Yields could be brought into agreement with the data within experimental error in nearly all cases. (author)

  15. Computers and the Environment: Minimizing the Carbon Footprint

    Science.gov (United States)

    Kaestner, Rich

    2009-01-01

    Computers can be good and bad for the environment; one can maximize the good and minimize the bad. When dealing with environmental issues, it's difficult to ignore the computing infrastructure. With an operations carbon footprint equal to the airline industry's, computer energy use is only part of the problem; everyone is also dealing with the use…

  16. Ubiquitous Computing in Physico-Spatial Environments

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Eriksson, Eva

    2007-01-01

    Interaction design of pervasive and ubiquitous computing (UC) systems must take into account physico-spatial issues as technology is implemented into our physical surroundings. In this paper we discuss how one conceptual framework for understanding interaction in context, Activity Theory (AT...

  17. THE VALUE OF CLOUD COMPUTING IN THE BUSINESS ENVIRONMENT

    OpenAIRE

    Mircea GEORGESCU; Marian MATEI

    2013-01-01

    Without any doubt, cloud computing has become one of the most significant trends in any enterprise, not only for IT businesses. Besides the fact that the cloud can offer access to low cost, considerably flexible computing resources, cloud computing also provides the capacity to create a new relationship between business entities and corporate IT departments. The value added to the business environment is given by the balanced use of resources, offered by cloud computing. The cloud mentality i...

  18. Research computing in a distributed cloud environment

    International Nuclear Information System (INIS)

    Fransham, K; Agarwal, A; Armstrong, P; Bishop, A; Charbonneau, A; Desmarais, R; Hill, N; Gable, I; Gaudet, S; Goliath, S; Impey, R; Leavett-Brown, C; Ouellete, J; Paterson, M; Pritchet, C; Penfold-Brown, D; Podaima, W; Schade, D; Sobie, R J

    2010-01-01

    The recent increase in availability of Infrastructure-as-a-Service (IaaS) computing clouds provides a new way for researchers to run complex scientific applications. However, using cloud resources for a large number of research jobs requires significant effort and expertise. Furthermore, running jobs on many different clouds presents even more difficulty. In order to make it easy for researchers to deploy scientific applications across many cloud resources, we have developed a virtual machine resource manager (Cloud Scheduler) for distributed compute clouds. In response to a user's job submission to a batch system, the Cloud Scheduler manages the distribution and deployment of user-customized virtual machines across multiple clouds. We describe the motivation for and implementation of a distributed cloud using the Cloud Scheduler that is spread across both commercial and dedicated private sites, and present some early results of scientific data analysis using the system.

  19. Security in cloud computing and virtual environments

    OpenAIRE

    Aarseth, Raymond

    2015-01-01

    Cloud computing is a big buzzword today. Just watch the commercials on TV and you will hear the words cloud service at least once. With the growth of cloud technology steadily rising, and everything from cellphones to cars connected to the cloud, how secure is cloud technology? What are the caveats of using cloud technology? And how does it all work? This thesis discusses cloud security and the underlying technology called virtualization to ...

  20. Distributed computing environment for Mine Warfare Command

    OpenAIRE

    Pritchard, Lane L.

    1993-01-01

    Approved for public release; distribution is unlimited. The Mine Warfare Command in Charleston, South Carolina has been converting its information systems architecture from a centralized mainframe-based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of that evolution as of May 1992. The building blocks of a distributed architecture are discussed in relation to the choices the Mine Warfare Command has made to date. Ar...

  1. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE (TM), that allows scientists and engineers to literally walk into their data...

  2. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  3. Radiolytic oxidation of propane: Computer modeling of the reaction scheme

    Science.gov (United States)

    Gupta, Avinash K.; Hanrahan, Robert J.

    The oxidation of gaseous propane under gamma radiolysis was studied at 100 torr pressure and 25°C, at oxygen pressures from 1 to 15 torr. Major oxygen-containing products and their G-values with 10% added oxygen are as follows: acetone, 0.98; i-propyl alcohol, 0.86; propionaldehyde, 0.43; n-propyl alcohol, 0.11; acrolein, 0.14; and allyl alcohol, 0.038. Minor products include i-butyl alcohol, t-amyl alcohol, n-butyl alcohol, n-amyl alcohol, and i-amyl alcohol. Small yields of i-hexyl alcohol and n-hexyl alcohol were also observed. There was no apparent difference in the G-values at pressures of 50, 100 and 150 torr. When the oxygen concentration was decreased below 5%, the yields of acetone, i-propyl alcohol, and n-propyl alcohol increased, the propionaldehyde yield decreased, and the yields of other products remained constant. The formation of major oxygen-containing products was explained on the basis that the alkyl radicals combine with molecular oxygen to give peroxyl radicals; the peroxyl radicals react with one another to give alkoxyl radicals, which in turn react with one another to form carbonyl compounds and alcohols. The reaction scheme for the formation of major products was examined using computer modeling based on a mechanism involving 28 reactions. Yields could be brought into agreement with the data within experimental error in nearly all cases.
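
    The 28-reaction mechanism itself is not reproduced in the record; the sketch below integrates a three-step peroxyl/alkoxyl toy scheme of the same general shape, with placeholder rate constants, to show how such kinetic modeling is typically set up:

```python
# Toy kinetic model in the spirit of the 28-reaction scheme described above.
# The three reactions and all rate constants are hypothetical placeholders,
# not the published mechanism:
#   R. + O2    -> RO2.                 (k1)
#   RO2. + RO2. -> 2 RO.               (k2)
#   RO. + RO.  -> carbonyl + alcohol   (k3, counted as one "product" event)
from scipy.integrate import solve_ivp

k1, k2, k3 = 1e-12, 1e-13, 1e-13  # cm^3 molecule^-1 s^-1, illustrative only

def rates(t, y):
    r, o2, ro2, ro, prod = y
    v1 = k1 * r * o2
    v2 = k2 * ro2 * ro2
    v3 = k3 * ro * ro
    return [-v1, -v1, v1 - 2 * v2, 2 * v2 - 2 * v3, v3]

y0 = [1e12, 1e16, 0.0, 0.0, 0.0]  # initial number densities, molecule/cm^3
sol = solve_ivp(rates, (0.0, 10.0), y0, method="LSODA", rtol=1e-8)
print("final product event density:", sol.y[4, -1])
```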

  4. Ozone initiated reactions and human comfort in indoor environments

    DEFF Research Database (Denmark)

    Tamas, Gyöngyi

    2006-01-01

    Chemical reactions between ozone and pollutants commonly found indoors have been suggested to cause adverse health and comfort effects among building occupants. Of special interest are reactions with terpenes and other pollutants containing unsaturated carbon-carbon bonds that are fast enough to occur under normal conditions in various indoor settings. These reactions are known to occur both in the gas phase (homogeneous reactions) and on the surfaces of building materials (heterogeneous reactions), producing a number of compounds that can be orders of magnitude more odorous and irritating than their precursors. The present thesis investigates the effects of ozone-initiated reactions with limonene and with various interior surfaces, including those associated with people, on short-term sensory responses. The evaluations were conducted using a perceived air quality (PAQ) method introduced by Fanger (1988...

  5. Fostering computational thinking skills with a tangible blocks programming environment

    OpenAIRE

    Turchi, T; Malizia, A

    2016-01-01

    Computational Thinking has recently returned into the limelight as an essential skill to have for both the general public and disciplines outside Computer Science. It encapsulates those thinking skills integral to solving complex problems using a computer, thus widely applicable in our technological society. Several public initiatives such as the Hour of Code successfully introduced it to millions of people of different ages and backgrounds, mostly using Blocks Programming Environments like S...

  6. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems

  7. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications for remote ship scenarios and the automation of ship operations.

  8. Micro-computer cards for hard industrial environment

    Energy Technology Data Exchange (ETDEWEB)

    Breton, J M

    1984-03-15

    Approximately 60% of present or future distributed systems have, or will have, operational units installed in hard environments. In these applications, which include canalization and industrial motor control, robotics, and process control, systems must be easy to deploy in environments not designed for electronics. The development of card systems for the hard industrial environments found in the petrochemical industry and in mines is described. National Semiconductor's CIM card system, based on CMOS technology, allows real-time microcomputer applications to run efficiently and reliably in hard industrial environments.

  9. A computational study of pyrolysis reactions of lignin model compounds

    Science.gov (United States)

    Thomas Elder

    2010-01-01

    Enthalpies of reaction for the initial steps in the pyrolysis of lignin have been evaluated at the CBS-4m level of theory using fully substituted β-O-4 dilignols. Values for competing unimolecular decomposition reactions are consistent with results previously published for phenethyl phenyl ether models, but with lowered selectivity. Chain-propagating reactions of free...

  10. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work, thus it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  11. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
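
    As a rough illustration of the graph-programming idea (nodes as subroutines, arcs as dependencies), here is a minimal Python analogue that executes an invented three-node graph in dependency order. It stands in for, and is far simpler than, HeNCE's actual Fortran/C tooling:

```python
# Minimal analogue of HeNCE-style graph programming: nodes are functions,
# arcs are data dependencies, and execution follows a topological order.
# The graph and the node functions are invented for illustration.
from graphlib import TopologicalSorter

def load(): return 21
def double(x): return 2 * x
def report(x): print("result:", x)

# arcs: load -> double -> report (each node maps to its predecessors)
deps = {"double": {"load"}, "report": {"double"}}
funcs = {"load": load, "double": double, "report": report}

results = {}
for node in TopologicalSorter(deps).static_order():
    inputs = [results[d] for d in sorted(deps.get(node, ()))]
    results[node] = funcs[node](*inputs)
```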

  12. Investigation of Coal-biomass Catalytic Gasification using Experiments, Reaction Kinetics and Computational Fluid Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, Francine [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Agblevor, Foster [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Klein, Michael [Univ. of Delaware, Newark, DE (United States); Sheikhi, Reza [Northeastern Univ., Boston, MA (United States)

    2015-12-31

    A collaborative effort involving experiments, kinetic modeling, and computational fluid dynamics (CFD) was used to understand co-gasification of coal-biomass mixtures. The overall goal of the work was to determine the key reactive properties for coal-biomass mixed fuels. Sub-bituminous coal was mixed with biomass feedstocks to determine the fluidization and gasification characteristics of hybrid poplar wood, switchgrass and corn stover. It was found that corn stover and poplar wood were the best feedstocks to use with coal. The novel approach of this project was the use of a red mud catalyst to improve gasification and lower gasification temperatures. An important result was the reduction of agglomeration of the biomass using the catalyst. An outcome of this work was the characterization of the chemical kinetics and reaction mechanisms of the co-gasification fuels, and the development of a set of models that can be integrated into other modeling environments. The multiphase flow code, MFIX, was used to simulate and predict the hydrodynamics and co-gasification, and results were validated with the experiments. The reaction kinetics modeling was used to develop a smaller set of reactions, representative of the experiments, for tractable CFD calculations. Finally, an efficient tool, MCHARS, was developed and coupled with MFIX to efficiently simulate the complex reaction kinetics.

  13. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory, and supports processor-to-processor data exchange over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions are the main focus here. A programming environment for ABC95 assembly language is designed, and a VC++-based programming environment for the ABC95 array computer is presented. It includes functions to load ABC95 programs and data, store results, run programs, and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies support effective programming of the ABC95 array computer.

  14. Collaborative virtual reality environments for computational science and design

    International Nuclear Information System (INIS)

    Papka, M. E.

    1998-01-01

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner

  15. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing has become an important means for companies to build and deploy their infrastructure and applications. Data storage services in the cloud are easy to use compared with other data storage services. At the same time, security in the cloud environment is a challenging task. Security issues range from missing system configuration and lack of proper updates to unwise user actions on remote data storage. These can expose users' private data and information to unwanted access. i...

  16. Chemistry in interstellar space. [environment characteristics influencing reaction dynamics

    Science.gov (United States)

    Donn, B.

    1973-01-01

    The particular characteristics of chemistry in interstellar space are determined by the unique environmental conditions involved. Interstellar matter is present at extremely low densities. Large deviations from thermodynamic equilibrium are, therefore, to be expected. A relatively intense ultraviolet radiation is present in many regions. The temperatures are in the range from 5 to 200 K. Data concerning the inhibiting effect of small activation energies in interstellar clouds are presented in a table. A summary of measured activation energies or barrier heights for exothermic exchange reactions is also provided. Problems of molecule formation are discussed, taking into account gas phase reactions and surface catalyzed processes.

  17. Computational Analyses of Complex Flows with Chemical Reactions

    Science.gov (United States)

    Bae, Kang-Sik

    The heat and mass transfer phenomena at the micro-scale for drug transport in a cylindrical matrix system, the simulation of oxygen/drug diffusion in a three-dimensional capillary network, and reduced chemical kinetic modeling of gas turbine combustion for Jet Propellant-10 have been studied numerically. For the numerical analysis of drug mass transfer in a cylindrical matrix system, the governing equations are derived from the Krogh cylinder model, which comprises a capillary and a surrounding cylinder of tissue along the arterial-to-venous distance. An ADI (Alternating Direction Implicit) scheme and the Thomas algorithm are applied to solve the nonlinear partial differential equations (PDEs). This study shows that the important factors affecting the drug penetration depth into the tissue are the mass diffusivity and the consumption of relevant species during the time allowed for diffusion to the brain tissue. Also, a computational fluid dynamics (CFD) model has been developed to simulate blood flow and oxygen/drug diffusion in a three-dimensional capillary network within the physiological range of a typical capillary. A three-dimensional geometry has been constructed to replicate the one studied by Secomb et al. (2000), and the computational framework features a non-Newtonian viscosity model for blood, an oxygen transport model including oxygen-hemoglobin dissociation and wall flux due to tissue absorption, as well as the ability to study the diffusion of drugs and other materials in the capillary streams. Finally, a chemical kinetic mechanism for JP-10 has been compiled and validated for a wide range of combustion regimes, covering pressures of 1 atm to 40 atm with temperatures of 1,200 K to 1,700 K; JP-10 is being studied as a possible jet propellant for the Pulse Detonation Engine (PDE) and other high-speed flight applications such as hypersonic
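
    The record leans on the Thomas algorithm inside the ADI sweeps; below is a sketch of that tridiagonal solver in its generic textbook form (not the authors' code), with a small diffusion-like test system:

```python
# Thomas algorithm: O(n) solver for tridiagonal systems, the inner kernel of
# ADI schemes. a = sub-diagonal, b = diagonal, c = super-diagonal, d = RHS
# (all 1D arrays of length n; a[0] and c[-1] are unused).
import numpy as np

def thomas(a, b, c, d):
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: a small symmetric diffusion-like system; exact solution is all ones.
a = np.array([0.0, -1.0, -1.0, -1.0])
b = np.array([2.0, 2.0, 2.0, 2.0])
c = np.array([-1.0, -1.0, -1.0, 0.0])
d = np.array([1.0, 0.0, 0.0, 1.0])
print(thomas(a, b, c, d))   # -> [1. 1. 1. 1.]
```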

  18. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  19. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    The paper describes the practical implementation of a system protecting distributed computing in a heterogeneous environment from malicious code in task assignments. The choice of technologies, the development of data structures, and a performance evaluation of the implemented security system are presented.

  20. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    Science.gov (United States)

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discoveries in the form of two empirical laws. The student was interviewed, and the interview data were used to map out a possible path of his visual reasoning. Critical…

  1. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  2. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Today cloud computing has become a key technology for the online allotment of computing resources and the online storage of user data at a lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there has been a growing need for resource management strategies in cloud computing environments that encompass both end-user satisfaction and high job-submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between incoming requests and the various resources in the cloud environment, in order to satisfy user requirements and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner, so as to achieve better performance in parameters such as response time, processing time, and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results show that the proposed algorithm dramatically improves response time, data processing time, and resource utilization compared with the Active Monitor and VM-assign algorithms.
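
    The DWAM algorithm itself is not spelled out in this record, so the following sketch shows only the general idea of dynamically weighted matchmaking; the VM class, capacities, and weighting rule are invented for illustration and are not the published algorithm:

```python
# Generic weighted matchmaking sketch in the spirit of the load-balancing
# strategy described above. All attributes and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    cpu_capacity: float          # abstract compute units
    active_requests: int = 0

    def load(self) -> float:
        # Dynamic weight: current requests normalized by capacity, so
        # larger machines absorb proportionally more work.
        return self.active_requests / self.cpu_capacity

def assign(request_id: str, vms: list) -> VM:
    target = min(vms, key=VM.load)   # least weighted load wins
    target.active_requests += 1
    print(f"{request_id} -> {target.name}")
    return target

vms = [VM("vm-small", 1.0), VM("vm-large", 4.0)]
for i in range(6):
    assign(f"req-{i}", vms)          # vm-large receives most of the work
```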

  3. The Computer Revolution in Science: Steps towards the realization of computer-supported discovery environments

    NARCIS (Netherlands)

    de Jong, Hidde; Rip, Arie

    1997-01-01

    The tools that scientists use in their search processes together form so-called discovery environments. The promise of artificial intelligence and other branches of computer science is to radically transform conventional discovery environments by equipping scientists with a range of powerful

  4. Modeling chemical reactions in the indoor environment by CFD

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Weschler, Charles J.

    2002-01-01

    The concentrations of ozone and a terpene that react in the gas-phase to produce a hypothetical product were investigated by computational fluid dynamics (CFD) for two different air exchange rates. Ozone entered the room with the ventilation air. The terpenes were introduced as a localized source...
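
    As a back-of-the-envelope companion to such CFD studies, a well-mixed box model already captures how the air exchange rate competes with the ozone-terpene reaction. Every number below (rate constant, ozone supply, terpene emission) is an assumed placeholder, not a value from the study:

```python
# Well-mixed box model for an indoor ozone + terpene system, integrated with
# forward Euler at two air exchange rates. All parameter values are
# illustrative assumptions.
k = 0.0041                              # ppb^-1 h^-1, assumed rate constant
for ach in (0.5, 2.0):                  # air exchange rates, h^-1
    o3 = terp = prod = 0.0              # indoor concentrations, ppb
    o3_outdoor, terp_emission = 20.0, 30.0   # ppb outdoors, ppb/h source
    dt, t_end, t = 0.001, 24.0, 0.0     # hours
    while t < t_end:
        rxn = k * o3 * terp                          # bimolecular reaction
        d_o3 = ach * (o3_outdoor - o3) - rxn         # ventilation in/out
        d_terp = terp_emission - ach * terp - rxn
        d_prod = rxn - ach * prod
        o3 += dt * d_o3; terp += dt * d_terp; prod += dt * d_prod
        t += dt
    print(f"ACH={ach}: O3={o3:.1f} ppb, terpene={terp:.1f} ppb, "
          f"product={prod:.2f} ppb")
```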

  5. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm that runs multiple interconnected nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications in which a large number of computing nodes are running. We show that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
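
    For orientation, here is a toy single-threaded simulation of a marker-based snapshot in the classic Chandy-Lamport style that such protocols build on. The two-node topology, payloads, and state encoding are invented, and the record's own protocol differs in detail:

```python
# Toy simulation of a marker-based global snapshot over FIFO channels.
from collections import deque

class Node:
    def __init__(self, name, peers):
        self.name, self.peers = name, peers
        self.state = 0                 # local state, e.g. an iteration counter
        self.snapshot = None           # recorded local state
        self.open_channels = set()     # incoming channels still being recorded
        self.channel_log = {}          # src -> payloads recorded in transit

    def initiate(self, net):
        # Record own state, then flood markers on all outgoing channels.
        self.snapshot = self.state
        self.open_channels = set(self.peers)
        self.channel_log = {p: [] for p in self.peers}
        for p in self.peers:
            net[(self.name, p)].append(("MARKER", None))

    def on_message(self, src, kind, payload, net):
        if kind == "MARKER":
            if self.snapshot is None:          # first marker: record, forward
                self.initiate(net)
            self.open_channels.discard(src)    # channel from src is done
        else:
            self.state += payload
            if self.snapshot is not None and src in self.open_channels:
                self.channel_log[src].append(payload)  # in flight at snapshot

def deliver_all(net, nodes):
    moved = True
    while moved:
        moved = False
        for (src, dst), chan in net.items():
            if chan:
                kind, payload = chan.popleft()
                nodes[dst].on_message(src, kind, payload, net)
                moved = True

nodes = {"A": Node("A", ["B"]), "B": Node("B", ["A"])}
net = {("A", "B"): deque(), ("B", "A"): deque()}
net[("A", "B")].append(("DATA", 5))   # message already in flight
nodes["A"].initiate(net)              # A starts the snapshot
net[("B", "A")].append(("DATA", 7))   # B sends before it sees a marker
deliver_all(net, nodes)
for n in nodes.values():
    print(n.name, "state:", n.snapshot, "channels:", n.channel_log)
# Consistent cut: A=0, B=5, plus DATA 7 recorded on the B->A channel.
```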

  6. Complex Reaction Environments and Competing Reaction Mechanisms in Zeolite Catalysis: Insights from Advanced Molecular Dynamics

    NARCIS (Netherlands)

    De Wispelaere, K.; Ensing, B.; Ghysels, A.; Meijer, E.J.; van Van Speybroeck, V.

    2015-01-01

    The methanol-to-olefin process is a showcase example of complex zeolite-catalyzed chemistry. At real operating conditions, many factors affect the reactivity, such as framework flexibility, adsorption of various guest molecules, and competitive reaction pathways. In this study, the strength of first

  7. Operational computer graphics in the flight dynamics environment

    Science.gov (United States)

    Jeletic, James F.

    1989-01-01

    Over the past five years, the Flight Dynamics Division of the National Aeronautics and Space Administration's (NASA's) Goddard Space Flight Center has incorporated computer graphics technology into its operational environment. In an attempt to increase the effectiveness and productivity of the Division, computer graphics software systems have been developed that display spacecraft tracking and telemetry data in 2-d and 3-d graphic formats that are more comprehensible than the alphanumeric tables of the past. These systems vary in functionality from real-time mission monitoring system, to mission planning utilities, to system development tools. Here, the capabilities and architecture of these systems are discussed.

  8. Computation of 3D form factors in complex environments

    International Nuclear Information System (INIS)

    Coulon, N.

    1989-01-01

    The calculation of radiant interchange among opaque surfaces in a complex environment poses the general problem of determining the visible and hidden parts of the environment. In many thermal engineering applications, surfaces are separated by radiatively non-participating media and may be idealized as diffuse emitters and reflectors. Consequently, the net radiant energy fluxes are intimately related to purely geometrical quantities called form factors that take hidden parts into account: the problem is reduced to form factor evaluation. This paper presents the method developed for the computation of 3D form factors in the finite-element module of the system TRIO, which is a general computer code for thermal and fluid flow analysis. The method is derived from an algorithm devised for synthetic image generation. A comparison is performed with the standard contour integration method, also implemented and suited to convex geometries. Several illustrative examples of finite-element thermal calculations in radiating enclosures are given
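
    A form factor is a purely geometric double integral. As a concrete occluder-free example (hypothetical geometry, not the TRIO implementation), a Monte Carlo estimate for two directly facing unit squares:

```python
# Monte Carlo estimate of the form factor between two directly facing unit
# squares separated by distance h, with no occluders. The paper's method
# additionally handles hidden-surface removal, which this sketch omits.
import numpy as np

def form_factor_mc(h=1.0, n=200_000, seed=1):
    rng = np.random.default_rng(seed)
    # Sample point pairs on emitter (z = 0) and receiver (z = h).
    p1 = rng.random((n, 2))
    p2 = rng.random((n, 2))
    d2 = ((p1 - p2) ** 2).sum(axis=1) + h * h    # squared distance
    # For parallel patches, cos(theta1) = cos(theta2) = h / sqrt(d2), so the
    # kernel cos1*cos2 / (pi r^2) becomes h^2 / (pi d2^2).
    integrand = (h * h) / (np.pi * d2 * d2)
    return integrand.mean()                      # both areas are 1

print("F12 ~", form_factor_mc())   # analytic value for h = 1 is about 0.1998
```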

  9. A Novel Biometric Approach for Authentication In Pervasive Computing Environments

    OpenAIRE

    Rachappa,; Divyajyothi M G; D H Rao

    2016-01-01

    The paradigm of embedding computing devices in our surrounding environment has gained more interest in recent days. Along with contemporary technology come challenges, the most important being security and privacy. Keeping in mind the compactness and memory constraints of pervasive devices, the biometric techniques proposed for identification should be robust and dynamic. In this work, we propose an emerging scheme that is based on a few exclusive human traits and characte...

  10. Quality control of computational fluid dynamics in indoor environments

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Nielsen, P. V.

    2003-01-01

    Computational fluid dynamics (CFD) is used routinely to predict air movement and distributions of temperature and concentrations in indoor environments. Modelling and numerical errors are inherent in such studies and must be considered when the results are presented. Here, we discuss modelling as… the quality of CFD calculations, as well as guidelines for the minimum information that should accompany all CFD-related publications to enable a scientific judgment of the quality of the study.

  11. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.

  12. The reaction of slag in cement, theory and computer modelling

    NARCIS (Netherlands)

    Chen, Wei; Brouwers, H.J.H.; Fischer, H.B

    2006-01-01

    For a better understanding of the performance of slag in concrete, for evaluating the feasibility of using a certain type of slag, and for possibly improving its use in practice, fundamental knowledge about its reaction and interaction with the other constituents is important. While research on

  13. Illustration of reaction mechanism in polyatomic systems via computer movies

    International Nuclear Information System (INIS)

    Raff, L.M.

    1974-01-01

    The CD4 + T* system is well suited for classroom illustration of reaction dynamics. Questions about the system can be illustrated by reducing selected many-body trajectories to a 16 mm color movie that represents the six-body motion in projected coordinates. Such a movie has been produced for this system. The production procedure used is reported, and a detailed description of the contents of the movie is given. (U.S.)

  14. Reaction diffusion voronoi diagrams: from sensors data to computing

    Czech Academy of Sciences Publication Activity Database

    Vázquez-Otero, Alejandro (ed.); Faigl, J.; Dormido, R.; Duro, N.

    2015-01-01

    Roč. 15, č. 6 (2015), s. 12736-12764 ISSN 1424-8220 R&D Projects: GA MŠk ED1.1.00/02.0061 Grant - others:ELI Beamlines(XE) CZ.1.05/1.1.00/02.0061 Institutional support: RVO:68378271 Keywords : reaction diffusion * FitzHugh–Nagumo * path planning * navigation * exploration Subject RIV: BD - Theory of Information Impact factor: 2.033, year: 2015

  15. An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment

    Directory of Open Access Journals (Sweden)

    Henry B. Wolfe

    2001-01-01

    Full Text Available Business has become increasingly dependent on the Internet and computing to operate. It has become apparent that there are issues of evidence gathering in a computing environment, which by their nature are technical and different to other forms of evidence gathering, that must be addressed. This paper offers an introduction to some of the technical issues surrounding this new and specialized field of Computer Forensics. It attempts to identify and describe sources of evidence that can be found on disk data storage devices in the course of an investigation. It also considers sources of copies of email, which can be used in evidence, as well as case building.

  16. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  17. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  18. Acoustic radiosity for computation of sound fields in diffuse environments

    Science.gov (United States)

    Muehleisen, Ralph T.; Beamer, C. Walter

    2002-05-01

    The use of image and ray tracing methods (and variations thereof) for the computation of sound fields in rooms is relatively well developed. In their regime of validity, both methods work well for prediction in rooms with small amounts of diffraction and mostly specular reflection at the walls. While extensions to the methods to include diffuse reflections and diffraction have been made, they are limited at best. In the fields of illumination and computer graphics, the ray tracing and image methods are joined by another method called luminous radiative transfer, or radiosity. In radiosity, an energy balance between surfaces is computed assuming diffuse reflection at the reflective surfaces. Because the interaction between surfaces is constant, much of the computation required for sound field prediction with multiple or moving source and receiver positions can be reduced. In acoustics the radiosity method has had little attention because of the problems of diffraction and specular reflection. The utility of radiosity in acoustics and an approach to a useful development of the method for acoustics will be presented. The method looks especially useful for sound level prediction in industrial and office environments. [Work supported by NSF.]
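
    The radiosity idea in one equation: with diffuse reflection coefficients R and form factors F, the surface energies satisfy B = E + RFB, so once the geometry (F) is fixed, any new source only costs one linear solve. A sketch with invented three-surface values:

```python
# Radiosity energy-balance sketch: solve (I - R F) B = E for the energy B
# leaving each surface. The form factors, reflection coefficients, and
# source vector below are illustrative, not from the paper.
import numpy as np

F = np.array([[0.0, 0.6, 0.4],     # form factors between 3 surfaces
              [0.6, 0.0, 0.4],     # (each row sums to <= 1)
              [0.5, 0.5, 0.0]])
R = np.diag([0.8, 0.8, 0.3])       # diffuse reflection coefficients
E = np.array([1.0, 0.0, 0.0])      # a source emits from surface 0 only

B = np.linalg.solve(np.eye(3) - R @ F, E)
print("radiosities:", B)           # re-solve with a new E for a moved source
```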

  19. Glider-based computing in reaction-diffusion hexagonal cellular automata

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Wuensche, Andrew; De Lacy Costello, Benjamin

    2006-01-01

    A three-state hexagonal cellular automaton, discovered in [Wuensche A. Glider dynamics in 3-value hexagonal cellular automata: the beehive rule. Int J Unconvention Comput, in press], presents a conceptual discrete model of a reaction-diffusion system with inhibitor and activator reagents. The automaton model of reaction-diffusion exhibits mobile localized patterns (gliders) in its space-time dynamics. We show how to implement the basic computational operations with these mobile localizations, and thus demonstrate collision-based logical universality of the hexagonal reaction-diffusion cellular automaton
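
    The beehive rule table itself is given in the cited reference rather than in this record, so the sketch below only scaffolds a generic three-state hexagonal cellular automaton on axial coordinates, with a placeholder (random) totalistic transition table rather than the actual rule:

```python
# Scaffold for a three-state hexagonal CA on axial coordinates with periodic
# boundaries. The transition table is a random placeholder, not the beehive
# rule referenced above.
import numpy as np

NEIGHBORS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]  # hex axial

def step(grid, rule):
    n, m = grid.shape
    new = np.empty_like(grid)
    for q in range(n):
        for r in range(m):
            counts = [0, 0, 0]                      # neighbors per state
            for dq, dr in NEIGHBORS:
                counts[grid[(q + dq) % n, (r + dr) % m]] += 1
            new[q, r] = rule[(grid[q, r], counts[1], counts[2])]
    return new

rng = np.random.default_rng(0)
# rule: (own state, # neighbors in state 1, # in state 2) -> next state
rule = {(s, c1, c2): rng.integers(0, 3)
        for s in range(3) for c1 in range(7) for c2 in range(7 - c1)}
grid = rng.integers(0, 3, size=(16, 16))
for _ in range(5):
    grid = step(grid, rule)
print(grid)
```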

  20. Preserving access to ALEPH computing environment via virtual machines

    International Nuclear Information System (INIS)

    Coscetti, Simone; Boccali, Tommaso; Arezzini, Silvia; Maggi, Marcello

    2014-01-01

    The ALEPH Collaboration [1] took data at the LEP (CERN) electron-positron collider in the period 1989-2000, producing more than 300 scientific papers. While most of the Collaboration's activities stopped in recent years, the data collected still have physics potential, with new theoretical models emerging that call for checks against data at the Z and WW production energies. An attempt to revive and preserve the ALEPH computing environment is presented; the aim is not only the preservation of the data files (usually called bit preservation), but of the full environment a physicist would need to perform brand new analyses. Technically, a Virtual Machine approach has been chosen, using the VirtualBox platform. Concerning simulated events, the full chain from event generators to physics plots is possible, and reprocessing of data events is also functioning. Interactive tools like the DALI event display can be used on both data and simulated events. The Virtual Machine approach is suited both for interactive usage and for massive computing using Cloud-like approaches.

  1. Computed Potential Energy Surfaces and Minimum Energy Pathway for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for computation of such observables as rate constants as a function of temperature, product branching ratios, and other detailed properties. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method with the Dunning correlation consistent basis sets to obtain accurate energetics, gives useful results for a number of chemically important systems. Applications to complex reactions leading to NO and soot formation in hydrocarbon combustion are discussed.

  2. Enabling Computational Dynamics in Distributed Computing Environments Using a Heterogeneous Computing Template

    Science.gov (United States)

    2011-08-09

    heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale flop rates by the end of the decade. In this framework…

  3. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure over traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of traditional HPC cluster.
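
    The de Bruijn construction at the heart of such assemblers, in serial miniature (toy reads and k; the real system shards this work across Hadoop/Giraph workers):

```python
# Minimal serial de Bruijn graph construction from reads: nodes are
# (k-1)-mers, and each k-mer contributes an edge between its prefix and
# suffix. Reads and k are toy values for illustration.
from collections import defaultdict

def de_bruijn(reads, k):
    graph = defaultdict(set)   # (k-1)-mer -> set of successor (k-1)-mers
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])
    return graph

reads = ["ACGTACG", "GTACGTT"]
for node, succs in sorted(de_bruijn(reads, 4).items()):
    print(node, "->", sorted(succs))
```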

  4. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  5. Analysis of reaction cross-section production in neutron induced fission reactions on uranium isotope using computer code COMPLET.

    Science.gov (United States)

    Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso

    2018-04-22

    This study provides current evidence about cross-section production processes in the theoretical and experimental results of neutron-induced reactions on a uranium isotope over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In such fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help satisfy worldwide energy needs without the polluting processes of other sources. The main objective of this work is to convey the relevant knowledge of neutron-induced fission reactions on 235U by describing, analyzing, and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with the experimental data obtained from EXFOR. The cross-section values for 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U, and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from EXFOR (IAEA). The theoretical results are compared with the experimental data taken from the EXFOR Data Bank. COMPLET was run with the same set of input parameters for all reactions, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data from EXFOR, and good agreement was found between the experimental and theoretical data. This comparison was analyzed and interpreted with tabulated and graphical descriptions, and the results are briefly discussed within the text of this research work.

  6. Individual Differences in Behavioural Reaction to a Changing Environment in Mice and Rats

    NARCIS (Netherlands)

    Benus, R.F.; Koolhaas, J.M.; Oortmerssen, G.A. van

    1987-01-01

    Aggressive and non-aggressive male mice differ in their reaction to a changing social environment. In order to investigate whether this differentiation also holds for non-social situations, male mice were trained in a standard maze task, after which a change (extramaze and intramaze, respectively) is

  7. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    International Nuclear Information System (INIS)

    Bancroft, G.V.; Merritt, F.J.; Plessel, T.C.; Kelaita, P.G.; Mccabe, R.K.

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST including the minimization of the data path in the computational fluid-dynamics (CFD) process, consistent user interface, extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package making it possible for FAST to operate as a modular environment where resources could be shared among different machines as well as a single host is discussed. 20 refs

  8. Towards reaction-diffusion computing devices based on minority-carrier transport in semiconductors

    International Nuclear Information System (INIS)

    Asai, Tetsuya; Adamatzky, Andrew; Amemiya, Yoshihito

    2004-01-01

    Reaction-diffusion (RD) chemical systems are known to realize sensible computation when both the data and the results of the computation are encoded in concentration profiles of chemical species; the computation is implemented via the spreading and interaction of either diffusive or phase waves. Thin-layer chemical systems are therefore thought of as massively parallel, locally connected computing devices, where a micro-volume of the medium is analogous to an elementary processor. Practical applications of RD chemical systems are limited, however, by the very low speed of travelling waves, which makes real-time computation impractical. To overcome the speed limitations while preserving the unique features of RD computers, we propose a semiconductor RD computing device where minority carriers diffuse as chemical species and reaction elements are represented by p-n-p-n diodes. We offer blueprints of the RD semiconductor devices, and study in computer simulation the propagation phenomena of the density wave of minority carriers. We then demonstrate what computational problems can be solved in RD semiconductor devices and evaluate the space-time complexity of computation in the devices
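
    The computational behaviour described here, travelling activation waves in an excitable medium, can be previewed with a generic FitzHugh-Nagumo reaction-diffusion integration; the parameters below are textbook-style placeholders, not device values:

```python
# Generic two-variable reaction-diffusion (FitzHugh-Nagumo) integration on a
# periodic 1D grid, illustrating the travelling activation waves such media
# compute with. Explicit Euler; dt satisfies the diffusion stability limit.
import numpy as np

n, dx, dt = 400, 1.0, 0.05
Du, a, eps, beta = 1.0, 0.1, 0.01, 0.5
u = np.zeros(n)                   # activator (here: carrier density analogue)
v = np.zeros(n)                   # slow recovery variable
u[:20] = 1.0                      # local stimulus launches a wave

for _ in range(4000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    du = Du * lap + u * (1 - u) * (u - a) - v    # excitable kinetics
    dv = eps * (beta * u - v)
    u += dt * du
    v += dt * dv

print("pulse peak at index:", int(np.argmax(u)))
```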

  9. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    This paper proposes a new user-friendly application that enhances and expands the current advising services of Gradesfirst, currently used for advising and retention by the Athletic Department of UMES, with a view to adding new performance activities such as mentoring, tutoring, scheduling, and study hall hours to the existing tools. The application includes various measurements that can be used to monitor and improve student performance in the Athletic Department of UMES by tracking students' weekly study hall hours and tutoring schedules. It also supervises tutors' login and logout activity in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server providing these services will be developed at the local site. The work was carried out in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance-measurement activities, and automated support of performance measures such as advising, mentoring, monitoring, and tutoring. The second step involves the creation of an application known as Student Advising and Retention (SAR) in a cloud computing environment. This application has been designed as a comprehensive database management system containing relevant data on student academic development, supporting various strategic advising and monitoring of students. The third step involves the creation of systematic advising charts and frameworks that help advisors. The paper shows ways of creating the most appropriate advising technique based on the student's academic needs. The proposed application runs on a Windows-based system. As stated above, it is expected to enhance and expand the current advising services of the Gradesfirst tool. A brief

  10. Effects of alpha and gamma radiation on glass reaction in an unsaturated environment

    International Nuclear Information System (INIS)

    Wronkiewicz, D.J.; Young, J.E.; Bates, J.K.

    1990-01-01

    Radiation may affect the long-term performance of glass in an unsaturated repository site by interacting with air, water vapor, or liquid water. The present study examines (1) the effects of alpha or gamma irradiation in a water vapor environment, and (2) the influence of radiolytic products on glass reaction. Results indicate that nitric and organic acids form in an irradiated water vapor environment and are dissolved in thin films of condensed water. Glass samples exposed to these conditions react faster and have a different assemblage of secondary phases than glasses exposed to nonirradiated water vapor environments. 23 refs., 4 figs., 2 tabs

  11. Computational organic chemistry: bridging theory and experiment in establishing the mechanisms of chemical reactions.

    Science.gov (United States)

    Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong

    2015-02-11

    Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.

  12. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployments of distributed applications in the wireless domain lack the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM), based on intelligent agents within a multiple-client, multiple-server scheme, was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  13. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  14. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  15. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

    Full Text Available While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al., 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that subset appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.

  16. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require considerable forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and, finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable, task-parallel environment (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.

  17. Environment Modules on the Peregrine System

    Science.gov (United States)

    Peregrine uses environment modules to easily manage software environments. The module commands set up a basic environment for the default compilers, tools and libraries.

  18. Computed Potential Energy Surfaces and Minimum Energy Pathways for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for the computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. For some dynamics methods, global potential energy surfaces are required. In this case, it is necessary to obtain the energy at a complete sampling of all the energetically accessible arrangements of the nuclei, and then a fitting function must be obtained to interpolate between the computed points. In other cases, characterization of the stationary points and the reaction pathway connecting them is sufficient. These properties may be readily obtained using analytical derivative methods. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications, including global potential energy surfaces, H + O₂, H + N₂, O(³P) + H₂, and reaction pathways for complex reactions, including reactions leading to NO and soot formation in hydrocarbon combustion.
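
    The fitting step can be pictured in one dimension: interpolate energies computed on a grid of geometries. A minimal sketch, with a mock Morse-like curve standing in for ab initio points (all values are assumed for illustration):

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Interpolate "computed" energies sampled on a grid of geometries.
    r = np.linspace(0.8, 4.0, 12)                    # bond lengths (angstrom)
    E = (1.0 - np.exp(-1.5 * (r - 1.2)))**2 - 1.0    # mock Morse-like energies (eV)

    fit = CubicSpline(r, E)                          # the fitting function
    print(np.round(fit(np.linspace(0.9, 3.9, 5)), 4))
    ```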

  19. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Hamm, R.N.; Waker, A.J.; Prestwich, W.V.

    1988-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and a scavenger capacity mimicking the cellular environment. An atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of the species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling, and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or the time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using the exponential and encounter-controlled methods with the reaction rate set at 3×10⁹ dm³ mol⁻¹ s⁻¹. Diffusion lengths and strand break yields predicted by these two methods for the same scavenger molarity differed by 20%-30%. The method based on limiting the time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to the 0.5 M scavenger concentration in the other two methods. The difference observed in the predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (author)
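
    The exponential-survival approximation referred to above has the closed form S(t) = exp(-kCt). A small sketch evaluating it with the rate constant and molarities quoted in the abstract (the code itself is only illustrative):

    ```python
    import math

    # Survival of OH radicals under scavenging: S(t) = exp(-k*C*t).
    k = 3.0e9                        # dm^3 mol^-1 s^-1, quoted rate constant
    for C in (0.5, 0.15):            # scavenger molarities from the study
        mean_life = 1.0 / (k * C)              # mean radical lifetime
        s_1ns = math.exp(-k * C * 1e-9)        # survival at the 1 ns cutoff
        print(f"C = {C} M: mean lifetime {mean_life:.2e} s, "
              f"survival at 1 ns {s_1ns:.3f}")
    ```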

  20. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Waker, A.J.; Prestwich, W.V.

    1998-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and a scavenger capacity mimicking the cellular environment. An atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of the species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling, and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or the time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using the exponential and encounter-controlled methods with the reaction rate set at 3×10⁹ dm³ mol⁻¹ s⁻¹. Diffusion lengths and strand break yields predicted by these two methods for the same scavenger molarity differed by 20%-30%. The method based on limiting the time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to the 0.5 M scavenger concentration in the other two methods. The difference observed in the predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (orig.)

  1. A prospective survey of delayed adverse reactions to iohexol in urography and computed tomography

    International Nuclear Information System (INIS)

    Munechika, Hirotsugu; Hiramatsu, Yoshihiro; Kudo, Sho; Sugimura, Kazuro; Hamada, Chikuma; Yamaguchi, Koichi; Katayama, Hitoshi

    2003-01-01

    We investigated 7505 inpatients who underwent intravenous urography or contrast-enhanced computed tomography to assess risk factors for delayed adverse drug reactions to iohexol, a non-ionic iodinated contrast medium. Focusing on delayed adverse reactions, all adverse events were prospectively investigated for 7 days after injection of iohexol. To explore the relevant risk factors, the relationship between occurrence of adverse reactions to iohexol and 17 different variables was evaluated by logistic regression analysis. To assess the influence of seasonal factors, adverse reactions were separately evaluated during two periods: February to April (the pollinosis period in Japan) and July to September (the non-pollinosis period). The prevalence of delayed adverse events and delayed adverse reactions was 3.5 and 2.8%, respectively, whereas the prevalence of adverse events and adverse reactions was 5.7 and 5.0%, respectively. Multivariate analysis showed that six parameters had a significant influence on delayed adverse reactions to iohexol, including (a) a history of allergy, (b) season, (c) radiographic procedure, (d) age, (e) concomitant surgery or other invasive procedures, and (f) concomitant medication. The prevalence of delayed reactions was lower than in previous large-scale studies. Significant risk factors included a history of allergy and performance of radiography during the pollinosis period, suggesting that allergy was involved in delayed adverse reactions. The type of radiographic procedure also had an influence. (orig.)

  2. Determining the reaction in kinematic pairs of certain mechanisms using a digital computer

    Energy Technology Data Exchange (ETDEWEB)

    Chifchieva, V N

    1980-01-01

    In Dorr classifiers, walking excavators, conveyors, sieves and other mechanisms, one finds a triad with a sliding pair. An algorithm is proposed for determining the reactions in the kinematic connections of a triad with one, two or three sliding pairs. The algorithm is suitable for use on digital computers. It is based on the transfer-function method and has several advantages over V. Zinovyev's technique for determining reactions in kinematic pairs. A concrete example is given of calculating the reactions in the connections of the crank-and-lever mechanism of a walking excavator.

  3. Computational and Experimental Study of Thermodynamics of the Reaction of Titania and Water at High Temperatures.

    Science.gov (United States)

    Nguyen, Q N; Bauschlicher, C W; Myers, D L; Jacobson, N S; Opila, E J

    2017-12-14

    Gaseous titanium hydroxide and oxyhydroxide species were studied with quantum chemical methods. The results are used in conjunction with an experimental transpiration study of titanium dioxide (TiO₂) in water-vapor-containing environments at elevated temperatures to provide a thermodynamic description of the Ti(OH)₄(g) and TiO(OH)₂(g) species. The geometry and harmonic vibrational frequencies of these species were computed using the coupled-cluster singles and doubles method with a perturbative correction for connected triple substitutions [CCSD(T)]. For the OH bending and rotation, B3LYP density functional theory was used to compute corrections to the harmonic approximations. These results were combined to determine the enthalpy of formation. Experimentally, the transpiration method was used with water contents from 0 to 76 mol % in oxygen or argon carrier gases for 20-250 h exposure times at 1473-1673 K. Results indicate that oxygen is not a key contributor to volatilization, and the primary reaction for volatilization in this temperature range is TiO₂(s) + H₂O(g) = TiO(OH)₂(g). Data were analyzed with both the second and third law methods using the thermal functions derived from the theoretical calculations. The third law enthalpy of formation at 298.15 K for TiO(OH)₂(g) was -838.9 ± 6.5 kJ/mol, which compares favorably to the theoretical value of -838.7 ± 25 kJ/mol. We recommend the experimentally derived third law enthalpy of formation at 298.15 K for TiO(OH)₂, the computed entropy of 320.67 J/(mol·K), and the computed heat capacity [149.192 - 0.02539T + 8.28697×10⁻⁶T² - 15614.05/T - 5.2182×10⁻¹¹/T²] J/(mol·K), where T is the temperature in K.
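
    The recommended heat-capacity fit is straightforward to evaluate; the sketch below simply encodes the quoted coefficients (T in kelvin, result in J/(mol·K)) as a convenience:

    ```python
    # Heat capacity of TiO(OH)2(g) from the fit quoted in the abstract.
    def cp_TiOOH2(T):
        return (149.192 - 0.02539 * T + 8.28697e-6 * T**2
                - 15614.05 / T - 5.2182e-11 / T**2)

    for T in (1473.0, 1573.0, 1673.0):   # temperatures of the transpiration runs
        print(f"T = {T:.0f} K: Cp = {cp_TiOOH2(T):.1f} J/(mol K)")
    ```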

  4. Controlling Chemical Reactions in Confined Environments: Water Dissociation in MOF-74

    Directory of Open Access Journals (Sweden)

    Erika M. A. Fuentes-Fernandez

    2018-02-01

    Full Text Available The confined porous environment of metal organic frameworks (MOFs) is an attractive system for studying reaction mechanisms. Compared to flat oxide surfaces, MOFs have the key advantage that they exhibit a well-defined structure and present significantly fewer challenges in experimental characterization. As an example of an important reaction, we study here the dissociation of water, which plays a critical role in biology, chemistry, and materials science, in MOFs and show how knowledge of the structure in this confined environment allows for an unprecedented level of understanding and control. In particular, combining in-situ infrared spectroscopy and first-principles calculations, we show that the water dissociation reaction can be selectively controlled inside Zn-MOF-74 by alcohol, through both chemical and physical interactions. Methanol is observed to speed up water dissociation by 25% to 100%, depending on the alcohol partial pressure. On the other hand, co-adsorption of isopropanol reduces the speed of the water reaction, due mostly to steric interactions. In addition, we also investigate the stability of the product state after the water dissociation has occurred and find that the presence of additional water significantly stabilizes the dissociated state. Our results show that precise control of reactions within nano-porous materials is possible, opening the way for advances in fields ranging from catalysis to electrochemistry and sensors.

  5. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer-based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work has aimed to assess a suitable automated-tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  6. Efficient Computation of Transition State Resonances and Reaction Rates from a Quantum Normal Form

    NARCIS (Netherlands)

    Schubert, Roman; Waalkens, Holger; Wiggins, Stephen

    2006-01-01

    A quantum version of a recent formulation of transition state theory in phase space is presented. The theory developed provides an algorithm to compute quantum reaction rates and the associated Gamov-Siegert resonances with very high accuracy. The algorithm is especially efficient for

  7. Equilibrium chemical reaction of supersonic hydrogen-air jets (the ALMA computer program)

    Science.gov (United States)

    Elghobashi, S.

    1977-01-01

    The ALMA (axi-symmetrical lateral momentum analyzer) program is concerned with the computation of two-dimensional coaxial jets with large lateral pressure gradients. The jets may be free or confined, laminar or turbulent, reacting or non-reacting. The reaction chemistry is assumed to be at equilibrium.

  8. Security issues of cloud computing environment in possible military applications

    OpenAIRE

    Samčović, Andreja B.

    2013-01-01

    The evolution of cloud computing over the past few years is potentially one of the major advances in the history of computing and telecommunications. Although there are many benefits to adopting cloud computing, there are also some significant barriers to adoption, security issues being the most important of them. This paper introduces the concept of cloud computing; looks at relevant technologies in cloud computing; and takes into account cloud deployment models and some military applications. Addit...

  9. Implementing interactive computing in an object-oriented environment

    Directory of Open Access Journals (Sweden)

    Frederic Udina

    2000-04-01

    Full Text Available Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented, to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.
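
    A minimal sketch of the idea, assuming nothing about the original implementation: each node of the directed graph caches its value, invalidation propagates to dependents, and evaluation recomputes only the stale subgraph.

    ```python
    # Each node of the computational-flow graph caches its value; changing an
    # input invalidates dependents, and get() recomputes only stale nodes.
    class Node:
        def __init__(self, func, *deps):
            self.func, self.deps = func, deps
            self.value, self.stale = None, True
            self.dependents = []
            for d in deps:
                d.dependents.append(self)

        def invalidate(self):                 # e.g. a GUI widget changed
            self.stale = True
            for d in self.dependents:
                d.invalidate()

        def get(self):
            if self.stale:
                self.value = self.func(*(d.get() for d in self.deps))
                self.stale = False
            return self.value

    data = Node(lambda: [1.0, 2.0, 2.0, 3.0])
    mean = Node(lambda xs: sum(xs) / len(xs), data)
    print(mean.get())      # computed once
    print(mean.get())      # cached; nothing recomputed
    ```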

  10. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool, which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed-language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool complements other ongoing projects such as IBM's High-Performance Compiler for Java (HPCJ) and IceT's metacomputing environment.

  11. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities.

  12. Extension of a Kinetic-Theory Approach for Computing Chemical-Reaction Rates to Reactions with Charged Particles

    Science.gov (United States)

    Liechty, Derek S.; Lewis, Mark J.

    2010-01-01

    Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.
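
    For context, the macroscopic benchmark against which such molecular-level models are usually compared is the modified Arrhenius form k(T) = A T^n exp(-Ea/RT); the coefficients below are hypothetical.

    ```python
    import math

    # Modified Arrhenius form k(T) = A * T**n * exp(-Ea / (R*T)), the usual
    # macroscopic reference. A, n, Ea below are hypothetical values.
    R = 8.314462618                       # J/(mol K)

    def rate(A, n, Ea, T):
        return A * T**n * math.exp(-Ea / (R * T))

    A, n, Ea = 1.0e12, 0.5, 150.0e3       # illustrative coefficients
    for T in (2000.0, 5000.0, 10000.0):
        print(f"T = {T:6.0f} K: k = {rate(A, n, Ea, T):.3e}")
    ```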

  13. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    Science.gov (United States)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform that has been developed over the last few years with the aim of stochastically simulating the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc.) coexist with aqueous domains. These conditions are of special relevance for understanding the origins of cellular, self-reproducing compartments in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim of bringing together theoretical and experimental research on protocell and minimal artificial cell systems.
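
    The stochastic-kinetics core of such a platform is typically a Gillespie-type algorithm. A minimal single-reaction sketch (a generic illustration, not ENVIRONMENT's actual code):

    ```python
    import random

    # Minimal Gillespie SSA for the single reaction A + B -> C in one
    # well-mixed volume; a generic sketch of the stochastic-kinetics core.
    def gillespie(na, nb, nc, k, t_end):
        t = 0.0
        while t < t_end:
            a = k * na * nb               # propensity of the reaction
            if a == 0.0:
                break                     # no reactants left
            t += random.expovariate(a)    # exponentially distributed waiting time
            na, nb, nc = na - 1, nb - 1, nc + 1
        return t, na, nb, nc

    random.seed(1)
    print(gillespie(na=100, nb=80, nc=0, k=0.01, t_end=10.0))
    ```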

  14. Earth observation scientific workflows in a distributed computing environment

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2011-09-01

    Full Text Available capabilities has focused on the web services approach as exemplified by the OGC's Web Processing Service and by GRID computing. The approach to leveraging distributed computing resources described in this paper uses instead remote objects via RPy...

  15. A high performance scientific cloud computing environment for materials simulations

    OpenAIRE

    Jorissen, Kevin; Vila, Fernando D.; Rehr, John J.

    2011-01-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including...

  16. Securing the Data Storage and Processing in Cloud Computing Environment

    Science.gov (United States)

    Owens, Rodney

    2013-01-01

    Organizations increasingly utilize cloud computing architectures to reduce costs and energy consumption both in the data warehouse and on mobile devices by better utilizing the computing resources available. However, the security and privacy issues with publicly available cloud computing infrastructures have not been studied to a sufficient depth…

  17. Development of tight-binding, chemical-reaction-dynamics simulator for combinatorial computational chemistry

    International Nuclear Information System (INIS)

    Kubo, Momoji; Ando, Minako; Sakahara, Satoshi; Jung, Changho; Seki, Kotaro; Kusagaya, Tomonori; Endou, Akira; Takami, Seiichi; Imamura, Akira; Miyamoto, Akira

    2004-01-01

    Recently, we have proposed a new concept called 'combinatorial computational chemistry' to realize theoretical, high-throughput screening of catalysts and materials. We have already applied our combinatorial computational-chemistry approach, mainly based on static first-principles calculations, to various catalyst and materials systems, and its applicability to catalyst and materials design was strongly confirmed. To realize more effective and efficient combinatorial computational-chemistry screening, a high-speed chemical-reaction-dynamics simulator based on a quantum-chemical molecular-dynamics method is essential. However, to the best of our knowledge, no existing chemical-reaction-dynamics simulator is fast enough to perform such high-throughput screening. In the present study, we have succeeded in developing a chemical-reaction-dynamics simulator based on our original tight-binding quantum-chemical molecular-dynamics method, which is more than 5000 times faster than the regular first-principles molecular-dynamics method. Moreover, its applicability and effectiveness for the atomistic clarification of methanol-synthesis dynamics at reaction temperature were demonstrated.
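
    Whatever the level of theory supplying the forces, the dynamics engine rests on a classical integrator. A velocity-Verlet step with a stand-in harmonic force, purely illustrative:

    ```python
    import numpy as np

    # One velocity-Verlet step; the force here is a harmonic stand-in for
    # forces that a tight-binding calculation would supply.
    def velocity_verlet(x, v, force, m, dt):
        f0 = force(x)
        x_new = x + v * dt + 0.5 * (f0 / m) * dt**2
        v_new = v + 0.5 * (f0 + force(x_new)) / m * dt
        return x_new, v_new

    force = lambda x: -1.0 * x            # harmonic bond, k = 1
    x, v = np.array([1.0]), np.array([0.0])
    for _ in range(5):
        x, v = velocity_verlet(x, v, force, m=1.0, dt=0.1)
    print("x =", x, "v =", v)
    ```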

  18. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL-supported research teams from the Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale immersive facilities, and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, as was the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  19. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    This paper provides an analysis, via usability study methods, of the differences in interaction when compared with traditional human-computer interfaces, with participants able to communicate their subjective opinions. Keywords: Usability Analysis; CAVE(TM) (Cave Automatic Virtual Environments); Human Computer Interface (HCI).

  20. Students' experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    NARCIS (Netherlands)

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students' experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  1. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.

  2. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  3. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  4. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  5. The Integrated Computational Environment for Airbreathing Hypersonic Flight Vehicle Modeling and Design Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated computational environment for multidisciplinary, physics-based simulation and analyses of airbreathing hypersonic flight vehicles will be developed....

  6. Computation Offloading Algorithm for Arbitrarily Divisible Applications in Mobile Edge Computing Environments: An OCR Case

    Directory of Open Access Journals (Sweden)

    Bo Li

    2018-05-01

    Full Text Available Divisible applications are a class of tasks whose loads can be partitioned into some smaller fractions, and each part can be executed independently by a processor. A wide variety of divisible applications have been found in the area of parallel and distributed processing. This paper addresses the problem of how to partition and allocate divisible applications to available resources in mobile edge computing environments with the aim of minimizing the completion time of the applications. A theoretical model was proposed for partitioning an entire divisible application according to the load of the application and the capabilities of available resources, and the solutions were derived in closed form. Both simulations and real experiments were carried out to justify this model.
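
    In the simplest version of such a model, with communication delays ignored, requiring all workers to finish simultaneously gives closed-form fractions proportional to worker speed (a simplification of the paper's setting):

    ```python
    # Equal-finish-time partition of a divisible load over workers with known
    # speeds, ignoring communication delays (a simplification for illustration).
    def partition(total_load, speeds):
        s = sum(speeds)
        fractions = [w / s for w in speeds]   # alpha_i proportional to speed
        makespan = total_load / s             # all workers finish together
        return fractions, makespan

    fracs, T = partition(total_load=1.0e6, speeds=[4.0, 2.0, 1.0])  # ops, ops/s
    print("fractions:", fracs, "completion time:", T, "s")
    ```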

  7. SOCON: a computer model for analyzing the behavior of sodium-concrete reactions

    International Nuclear Information System (INIS)

    Nguyen, D.G.; Muhlestein, L.D.

    1985-03-01

    Guided by experimental evidence available to date, ranging from basic laboratory studies to large scale tests, a mechanistic computer model (the SOCON model) has been developed to analyze the behavior of SOdium-CONcrete reactions. The model accounts for the thermal, chemical and mechanical phenomena which interact to determine the consequences of the reactions. Reaction limiting mechanisms could be any process which reduces water release and sodium transport to fresh concrete; the buildup of the inert reaction product layer would increase the resistance to sodium transport; water dry-out would decrease the bubble agitation transport mechanism. However, stress-induced failure of concrete, such as spalling, crushing and cracking, and a massive release of gaseous products (hydrogen, water vapor and CO 2 ) would increase the transport of sodium to the reaction zone. The results of SOCON calculations are in excellent agreement with measurements obtained from large-scale sodium-limestone concrete reaction tests of duration up to 100 hours conducted at the Hanford Engineering Development Laboratory. 8 refs., 7 figs

  8. Cloud Computing as Network Environment in Students Work

    OpenAIRE

    Piotrowski, Dominik Mirosław

    2013-01-01

    The purpose of the article is to show the need for literacy education covering the variety of services available in cloud computing, as a specialist field of information activity. Teaching cloud computing at university, in the context of information management, could provide tangible benefits in the form of useful learning outcomes. This would allow students and future information professionals to begin enjoying the benefits of the cloud computing SaaS model at work, thereby freeing up...

  9. Computing the Free Energy along a Reaction Coordinate Using Rigid Body Dynamics.

    Science.gov (United States)

    Tao, Peng; Sodt, Alexander J; Shao, Yihan; König, Gerhard; Brooks, Bernard R

    2014-10-14

    The calculation of the potential of mean force along complex chemical reactions or rare-event pathways is of great interest because of its importance in many areas of chemistry, molecular biology, and materials science. The major difficulty in free energy calculations comes from the great computational cost of adequate sampling of the system in high-energy regions, especially close to the reaction transition state. Here, we present a method, called FEG-RBD, in which the free energy gradients are obtained from rigid-body dynamics simulations. The free energy gradients are then integrated along a reference reaction pathway to calculate free energy profiles. In a given system, the reaction coordinates defining a subset of atoms (e.g., a solute, or the quantum mechanics (QM) region of a quantum mechanics/molecular mechanics simulation) are selected to form a rigid body during the simulation. The first-order derivatives (gradients) of the free energy with respect to the reaction coordinates are obtained through the integration of constraint forces within the rigid body. Each structure along the reference reaction path is separately subjected to such a rigid-body simulation. The individual free energy gradients are integrated along the reference pathway to obtain the free energy profile. The test cases provided demonstrate both the strengths and weaknesses of the FEG-RBD method. The most significant benefit of this method comes from the fast convergence rate of the free energy gradient when using rigid-body constraints instead of restraints. A correction to the free energy due to approximate relaxation of the rigid-body constraint is estimated and discussed. A comparison with umbrella sampling using a simple test case revealed an improved sampling efficiency of FEG-RBD by a factor of 4 on average. The enhanced efficiency makes this method effective for calculating the free energy of complex chemical reactions when the reaction coordinate can be unambiguously defined by a
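
    The integration step can be sketched as trapezoidal quadrature of gradients sampled along the reference path; the gradient values below are mock inputs, not FEG-RBD output.

    ```python
    import numpy as np

    # Trapezoidal integration of free-energy gradients dA/dxi sampled along a
    # reference pathway; the mock gradients below shape a symmetric barrier.
    xi = np.linspace(0.0, 1.0, 11)             # reaction coordinate samples
    grad = -40.0 * (xi - 0.5)                  # mock dA/dxi (kcal/mol per unit xi)

    dA = 0.5 * (grad[1:] + grad[:-1]) * np.diff(xi)
    profile = np.concatenate(([0.0], np.cumsum(dA)))
    print("profile (kcal/mol):", np.round(profile, 2))   # peak at xi = 0.5
    ```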

  10. Synthesis of antimicrobial silver nanoparticles through a photomediated reaction in an aqueous environment.

    Science.gov (United States)

    Banasiuk, Rafał; Frackowiak, Joanna E; Krychowiak, Marta; Matuszewska, Marta; Kawiak, Anna; Ziabka, Magdalena; Lendzion-Bielun, Zofia; Narajczyk, Magdalena; Krolicka, Aleksandra

    2016-01-01

    A fast, economical, and reproducible method for nanoparticle synthesis has been developed in our laboratory. The reaction is performed in an aqueous environment and utilizes light emitted by commercially available 1 W light-emitting diodes (λ = 420 nm) as the catalyst. This method does not require nanoparticle seeds or toxic chemicals. The irradiation process is carried out for a period of up to 10 minutes, significantly reducing the time required for synthesis as well as the environmental impact. By modulating various reaction parameters, silver nanoparticles were obtained that were predominantly either spherical or cubic. The produced nanoparticles demonstrated strong antimicrobial activity toward the examined bacterial strains. Additionally, testing the effect of silver nanoparticles on a human keratinocyte cell line and human peripheral blood mononuclear cells revealed that their cytotoxicity may be limited by modulating the employed concentrations of nanoparticles.

  11. Material interactions with the Low Earth Orbital (LEO) environment: Accurate reaction rate measurements

    Science.gov (United States)

    Visentine, James T.; Leger, Lubert J.

    1987-01-01

    To resolve uncertainties in estimated LEO atomic oxygen fluence and provide reaction product composition data for comparison to data obtained in ground-based simulation laboratories, a flight experiment has been proposed for the space shuttle which utilizes an ion-neutral mass spectrometer to obtain in-situ ambient density measurements and identify reaction products from modeled polymers exposed to the atomic oxygen environment. An overview of this experiment is presented and the methodology of calibrating the flight mass spectrometer in a neutral beam facility prior to its use on the space shuttle is established. The experiment, designated EOIM-3 (Evaluation of Oxygen Interactions with Materials, third series), will provide a reliable materials interaction data base for future spacecraft design and will furnish insight into the basic chemical mechanisms leading to atomic oxygen interactions with surfaces.

  12. The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system

    Science.gov (United States)

    Zerkin, V. V.; Pritychenko, B.

    2018-04-01

    The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information from ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web application development are described. New capabilities for data set uploads, renormalization, covariance matrices, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and the Russian Federation.

  13. Computing multi-species chemical equilibrium with an algorithm based on the reaction extents

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2013-01-01

    A mathematical model for the solution of a set of chemical equilibrium equations in a multi-species and multiphase chemical system is described. The computer-aided solution of the model is achieved by means of a Newton-Raphson method enhanced with a line-search scheme, which deals with the non-negative constraints. The residual function, representing the distance to equilibrium, is defined from the chemical potential (or Gibbs energy) of the chemical system. Local minima are potentially avoided by the prioritization of the aqueous reactions with respect to the heterogeneous reactions. The formation and release of gas bubbles is taken into account in the model, limiting the concentration of volatile aqueous species to a maximum value given by the gas solubility constant. The reaction extents are used as state variables for the numerical method. As a result, the accepted solution satisfies the charge balance.
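
    For a single reaction A <-> B the approach collapses to Newton iteration on one reaction extent, with a crude step-halving guard standing in for the line search. A toy sketch, not the paper's multi-species, multiphase solver:

    ```python
    # Newton-Raphson on a single reaction extent xi for A <-> B with
    # equilibrium constant K (ideal behavior assumed): the residual measures
    # the distance to equilibrium; step-halving keeps mole numbers feasible.
    def solve_extent(nA0, nB0, K, xi=0.0, tol=1e-12):
        for _ in range(50):
            nA, nB = nA0 - xi, nB0 + xi
            f = nB - K * nA                 # zero at equilibrium
            step = -f / (1.0 + K)           # Newton step, d f/d xi = 1 + K
            while nA - step < 0 or nB + step < 0:
                step *= 0.5                 # crude line-search guard
            xi += step
            if abs(step) < tol:
                break
        return xi

    xi = solve_extent(nA0=1.0, nB0=0.0, K=4.0)
    print("extent:", xi, "-> nA =", 1.0 - xi, ", nB =", xi)
    ```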

  14. A computational approach to extinction events in chemical reaction networks with discrete state spaces.

    Science.gov (United States)

    Johnston, Matthew D

    2017-12-01

    Recent work by Johnston et al. has produced sufficient conditions on the structure of a chemical reaction network which guarantee that the corresponding discrete-state-space system exhibits an extinction event. The conditions consist of a series of systems of equalities and inequalities on the edges of a modified reaction network called a domination-expanded reaction network. In this paper, we present a computational implementation of these conditions written in Python and apply the program to examples drawn from the biochemical literature. We also run the program on 458 models from the European Bioinformatics Institute's BioModels Database and report our results. Copyright © 2017 Elsevier Inc. All rights reserved.
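
    For intuition only, a brute-force counterpart to the structural conditions: enumerate the reachable discrete states of a toy network and test whether some reachable state exhausts a species (a weaker notion than the guaranteed-extinction conditions of the paper).

    ```python
    from collections import deque

    # Toy reachability check: can species B ever hit zero copies? This is a
    # brute-force illustration of discrete-state reasoning, weaker than the
    # structural guaranteed-extinction conditions of the paper.
    reactions = [
        {"need": (1, 1), "delta": (-1, 1)},   # A + B -> 2B
        {"need": (0, 1), "delta": (0, -1)},   # B -> (nothing)
    ]

    def reachable(state0):
        seen, queue = {state0}, deque([state0])
        while queue:
            s = queue.popleft()
            for r in reactions:
                if all(si >= ni for si, ni in zip(s, r["need"])):
                    t = tuple(si + di for si, di in zip(s, r["delta"]))
                    if t not in seen:
                        seen.add(t)
                        queue.append(t)
        return seen

    states = reachable((3, 1))
    print("some reachable state has B extinct:", any(s[1] == 0 for s in states))
    ```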

  15. General method and thermodynamic tables for computation of equilibrium composition and temperature of chemical reactions

    Science.gov (United States)

    Huff, Vearl N; Gordon, Sanford; Morrell, Virginia E

    1951-01-01

    A rapidly convergent successive approximation process is described that simultaneously determines both composition and temperature resulting from a chemical reaction. This method is suitable for use with any set of reactants over the complete range of mixture ratios as long as the products of reaction are ideal gases. An approximate treatment of limited amounts of liquids and solids is also included. This method is particularly suited to problems having a large number of products of reaction and to problems that require determination of such properties as specific heat or velocity of sound of a dissociating mixture. The method presented is applicable to a wide variety of problems that include (1) combustion at constant pressure or volume; and (2) isentropic expansion to an assigned pressure, temperature, or Mach number. Tables of thermodynamic functions needed with this method are included for 42 substances for convenience in numerical computations.

  16. Genotype by environment interaction for 450-day weight of Nelore cattle analyzed by reaction norm models

    Directory of Open Access Journals (Sweden)

    Newton T. Pégolo

    2009-01-01

    Full Text Available Genotype by environment interactions (GEI) have attracted increasing attention in tropical breeding programs because of the variety of production systems involved. In this work, we assessed GEI in 450-day adjusted weight (W450) of Nelore cattle from 366 Brazilian herds by comparing traditional univariate single-environment model analysis (UM) and random regression first-order reaction norm models for six environmental variables: standard deviations of herd-year (RRMw) and herd-year-season-management (RRMw-m) groups for mean W450, standard deviations of herd-year (RRMg) and herd-year-season-management (RRMg-m) groups adjusted for 365-450-day weight gain (G450) averages, and two iterative algorithms using herd-year-season-management group solution estimates from a first RRMw-m and RRMg-m analysis (RRMITw-m and RRMITg-m, respectively). The RRM results showed similar tendencies in the variance components and heritability estimates along the environmental gradient. Some of the variation among RRM estimates may have been related to the precision of the predictor and to correlations between environmental variables and the likely components of the weight trait. GEI, which was assessed by estimating the genetic correlation surfaces, had values < 0.5 between extreme environments in all models. Regression analyses showed that the correlation between the expected progeny differences for UM and the corresponding differences estimated by RRM was higher in intermediate and favorable environments than in unfavorable environments (p < 0.0001).

  17. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  18. Measuring vigilance decrement using computer vision assisted eye tracking in dynamic naturalistic environments.

    Science.gov (United States)

    Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

    Eye tracking offers a practical solution for monitoring cognitive performance in real-world tasks. However, eye tracking in dynamic environments is difficult due to the high spatial and temporal variation of stimuli, and needs further and thorough investigation. In this paper, we study the possibility of developing a novel computer-vision-assisted eye tracking analysis using fixations. Eye movement data were obtained from a long-duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called 'fixation score' was defined to capture the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximal when the subject focuses on the target AOI and diminishes when they gaze at the non-target AOIs. A statistically significant negative correlation was found between fixation score and reaction time data (r = -0.2253). During vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in an increase in reaction time.
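
    One plausible reading of the fixation-score idea (the paper's exact definition may differ): score 1.0 when the fixation lands inside the target AOI, decaying with distance outside it.

    ```python
    import math

    # Illustrative fixation score: 1.0 inside the target AOI bounding box,
    # decaying exponentially with pixel distance outside it.
    def fixation_score(gaze, box, scale=100.0):
        (x, y), (x0, y0, x1, y1) = gaze, box
        dx = max(x0 - x, 0.0, x - x1)       # horizontal distance outside the box
        dy = max(y0 - y, 0.0, y - y1)       # vertical distance outside the box
        return math.exp(-math.hypot(dx, dy) / scale)

    target = (400, 200, 560, 320)           # target AOI in pixels (assumed)
    print(fixation_score((480, 260), target))   # inside  -> 1.0
    print(fixation_score((100, 260), target))   # distant -> close to 0
    ```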

  19. Construction of a Digital Learning Environment Based on Cloud Computing

    Science.gov (United States)

    Ding, Jihong; Xiong, Caiping; Liu, Huazhong

    2015-01-01

    Constructing a digital learning environment for ubiquitous learning and asynchronous distributed learning has prompted a great deal of concrete research. However, current digital learning environments do not fully fulfill the expectations of supporting interactive group learning, shared understanding and social construction of knowledge.…

  20. Monte Carlo in radiotherapy: experience in a distributed computational environment

    Science.gov (United States)

    Caccia, B.; Mattia, M.; Amati, G.; Andenna, C.; Benassi, M.; D'Angelo, A.; Frustagli, G.; Iaccarino, G.; Occhigrossi, A.; Valentini, S.

    2007-06-01

    New technologies in cancer radiotherapy need a more accurate computation of the dose delivered in the radiotherapeutical treatment plan, and it is important to integrate sophisticated mathematical models and advanced computing knowledge into the treatment planning (TP) process. We present some results on using Monte Carlo (MC) codes in dose calculation for treatment planning. A distributed computing resource located in the Technologies and Health Department of the Italian National Institute of Health (ISS), along with other computer facilities (CASPUR - Inter-University Consortium for the Application of Super-Computing for Universities and Research), has been used to perform a complete MC simulation to compute the dose distribution on phantoms irradiated with a radiotherapy accelerator. Using the BEAMnrc and GEANT4 MC-based codes we calculated dose distributions on a plain water phantom and an air/water phantom. Experimental and calculated dose values agreed to within ±2% (for depths between 5 mm and 130 mm), both in PDD (percentage depth dose) and in transversal sections of the phantom. We consider these results a first step towards a system suitable for medical physics departments to simulate a complete treatment plan using remote computing facilities for MC simulations.
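
    The sampling idea behind such dose calculations can be caricatured in one dimension: draw exponential interaction depths and histogram the local energy deposition. The attenuation coefficient and geometry below are illustrative, not from the study.

    ```python
    import random

    # Toy 1-D Monte Carlo depth "dose": photons attenuate exponentially in
    # water and deposit energy locally at the sampled interaction depth.
    mu = 0.07                 # cm^-1, illustrative attenuation coefficient
    depth_cm, bins = 20.0, 40
    dose = [0.0] * bins

    random.seed(0)
    for _ in range(100_000):
        z = random.expovariate(mu)              # sampled interaction depth
        if z < depth_cm:
            dose[int(z / depth_cm * bins)] += 1.0

    peak = max(dose)
    print([round(d / peak, 2) for d in dose[:10]])  # normalized shallow-depth dose
    ```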

  1. Analysis of insulation material deterioration under the LOCA simulated environment on the basis of reaction kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Okada, Sohei; Kusama, Yasuo; Ito, Masayuki; Yagi, Toshiaki; Yoshikawa, Masato (Japan Atomic Energy Research Inst., Takasaki, Gunma. Takasaki Radiation Chemistry Research Establishment)

    1982-12-01

    In type tests of the electric cables installed in reactor containment vessels, it is considerably difficult to perform testing lasting over a year each time in order to simulate the accident environment of radiation and high-temperature steam. Two requirements, which seem more realistic than the above-mentioned testing method, are inconsistent with each other. To solve this problem, a general rule of deterioration, expressed as an equation, is necessary; it enables the extrapolation showing that short-term testing stands on the safe side. The authors have tried to numerically analyze the change of mechanical characteristics of ethylene-propylene rubber (EPR) and Hypalon, which are important as materials for PH cables (fire-retardant, EP-rubber-insulated, chlorosulfonated-polyethylene-sheathed cable), in a complex environment of radiation, steam and chemical spray simulating PWR LOCA conditions. In this report, a method is proposed to analyze and estimate the properties by regression analysis on the basis of reaction kinetics, and the analyzed results are described in the order of experiment, analysis method, results and discussion. The deterioration of the normalized elongation P = e/e0 of EPR and Hypalon in the complex environment described above can be represented by the equation -dP/dt = K·P^n. The exponent n varied depending on whether or not air was present in that environment, suggesting that different reactions are dominant in the two conditions. For EPR, n was close to 2 if air was not present and close to 1 if air was present in the system.
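
    The kinetic law above integrates in closed form, which is what makes short-term extrapolation workable; a minimal Python sketch of P(t) with illustrative values of K and n (not the paper's fitted constants):

        import numpy as np

        # -dP/dt = K * P**n for the normalized elongation P = e/e0, P(0) = 1.
        # Using d(P^(1-n))/dt = -(1-n)*K gives the closed forms below.
        def P(t, K, n):
            t = np.asarray(t, dtype=float)
            if np.isclose(n, 1.0):
                return np.exp(-K * t)                  # first-order limit
            base = 1.0 - (1.0 - n) * K * t
            return np.clip(base, 0.0, None) ** (1.0 / (1.0 - n))

        t = np.linspace(0.0, 100.0, 5)    # exposure time, arbitrary units
        print(P(t, K=0.02, n=2.0))        # air-free-like case (n close to 2)
        print(P(t, K=0.02, n=1.0))        # air-containing-like case (n close to 1)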

  2. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  3. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
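
    The patent abstract does not say how the owning burst buffer is determined; the sketch below assumes a simple hash-based placement over per-buffer shards of the key-value store, purely to make the two lookup steps (determine the owner, then locate the key-value) concrete.

        import hashlib

        class BurstBufferKV:
            """Hypothetical metadata store sharded across burst buffers."""
            def __init__(self, n_buffers):
                self.shards = [dict() for _ in range(n_buffers)]

            def _owner(self, key: str) -> int:
                h = int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")
                return h % len(self.shards)            # assumed placement rule

            def put_metadata(self, block_id, metadata):
                self.shards[self._owner(block_id)][block_id] = metadata

            def get_metadata(self, block_id):
                owner = self._owner(block_id)          # which buffer stores it
                return self.shards[owner][block_id]    # locate the key-value

        kv = BurstBufferKV(n_buffers=4)
        kv.put_metadata("block-0042", {"size": 1 << 20, "offset": 0})
        print(kv.get_metadata("block-0042"))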

  4. Ubiquitous fuzzy computing in open ambient intelligence environments

    NARCIS (Netherlands)

    Acampora, G.; Loia, V.

    2006-01-01

    Ambient intelligence (AmI) is considered as the composition of three emergent technologies: ubiquitous computing, ubiquitous communication and intelligent user interfaces. The aim of integrating the aforesaid technologies is to widen the interaction between human beings and information…

  5. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

    Performing face recognition for a single face does not take long, but an attendance or security system in a company with many faces to recognize will take a long time. Cloud computing is a computing service performed not on a local device but on an internet-connected data center infrastructure. Cloud computing also provides a scalability solution, increasing the resources available when larger data processing is required. This research applies the eigenface method, with training data collected through the REST concept to provide resources, so that the server can process the data according to the existing stages. After the research and development of this application, it can be concluded that face recognition can be implemented by applying eigenfaces, with the REST concept as the endpoint for giving or receiving the information used as a resource in building the model for face recognition.
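
    The eigenface step itself reduces to PCA on flattened images followed by nearest-neighbour matching in the reduced space; a self-contained NumPy sketch (random arrays stand in for a real training set, and the paper's REST/cloud layer is omitted):

        import numpy as np

        rng = np.random.default_rng(0)
        train = rng.random((20, 64 * 64))        # 20 flattened 64x64 "faces"
        labels = np.arange(20)

        mean_face = train.mean(axis=0)
        centered = train - mean_face
        _, _, Vt = np.linalg.svd(centered, full_matrices=False)
        eigenfaces = Vt[:10]                     # 10 leading components
        train_proj = centered @ eigenfaces.T     # training-set weights

        def recognize(face):
            w = (face - mean_face) @ eigenfaces.T
            return labels[np.argmin(np.linalg.norm(train_proj - w, axis=1))]

        probe = train[7] + 0.05 * rng.random(64 * 64)   # noisy copy of face 7
        print(recognize(probe))                          # expected: 7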

  6. Computer-based Role Playing Game Environment for Analogue Electronics

    Directory of Open Access Journals (Sweden)

    Lachlan M MacKinnon

    2009-02-01

    Full Text Available An implementation of a design for a game-based virtual learning environment is described. The game was developed for a course in analogue electronics, and the topic is the design of a power supply. This task can be solved in a number of different ways, within certain constraints, giving the students a certain amount of freedom, although the game is designed not to facilitate a trial-and-error approach. The use of storytelling and a virtual gaming environment provides the student with the learning material in an MMORPG environment.

  7. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Full Text Available Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to linear models in form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.

  8. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance to construct more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, the final formulas of these algorithms could all be converted to linear models in form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
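
    The abstract does not reproduce the formula of the general weighted profile method; the sketch below implements a Jaccard-weighted profile scorer in that spirit (a similarity-weighted vote over other drugs' known associations), on a toy binary drug-ADR matrix.

        import numpy as np

        def jaccard(u, v):
            inter, union = np.sum(u & v), np.sum(u | v)
            return inter / union if union else 0.0

        def score(A, d, a):
            """Predicted association strength for (drug d, ADR a)."""
            others = [j for j in range(A.shape[0]) if j != d]
            w = np.array([jaccard(A[d], A[j]) for j in others])
            votes = np.array([A[j, a] for j in others])
            return float(w @ votes / w.sum()) if w.sum() else 0.0

        A = np.array([[1, 0, 1, 0],      # toy drug x ADR associations
                      [1, 0, 1, 1],
                      [0, 1, 0, 1]], dtype=int)
        print(score(A, d=0, a=3))        # score for an unobserved pair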

  9. Context-aware Cloud Computing for Personal Learning Environment

    OpenAIRE

    Chen, Feng; Al-Bayatti, Ali Hilal; Siewe, Francois

    2016-01-01

    Virtual learning means to learn from social interactions in a virtual platform that enables people to study anywhere and at any time. Current Virtual Learning Environments (VLEs) are a range of integrated web based applications to support and enhance the education. Normally, VLEs are institution centric; are owned by the institutions and are designed to support formal learning, which do not support lifelong learning. These limitations led to the research of Personal Learning Environments (PLE...

  10. Tacit knowledge in action: basic notions of knowledge sharing in computer supported work environments

    OpenAIRE

    Mackenzie Owen, John

    2001-01-01

    An important characteristic of most computer supported work environments is the distribution of work over individuals or teams in different locations. This leads to what we nowadays call 'virtual' environments. In these environments communication between actors is to a large degree mediated, i.e. established through communications media (telephone, fax, computer networks) rather than in a face-to-face way. Unfortunately, mediated communication limits the effectiveness of knowledge exchange in virt...

  11. Bridging context management systems for different types of pervasive computing environments

    NARCIS (Netherlands)

    Hesselman, C.E.W.; Benz, Hartmut; Benz, H.P.; Pawar, P.; Liu, F.; Wegdam, M.; Wibbels, Martin; Broens, T.H.F.; Brok, Jacco

    2008-01-01

    A context management system is a distributed system that enables applications to obtain context information about (mobile) users and forms a key component of any pervasive computing environment. Context management systems are however very environment-specific (e.g., specific for home environments)

  12. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  13. Performance Measurements in a High Throughput Computing Environment

    CERN Document Server

    AUTHOR|(CDS)2145966; Gribaudo, Marco

    The IT infrastructures of companies and research centres are implementing new technologies to satisfy the increasing need of computing resources for big data analysis. In this context, resource profiling plays a crucial role in identifying areas where the improvement of the utilisation efficiency is needed. In order to deal with the profiling and optimisation of computing resources, two complementary approaches can be adopted: the measurement-based approach and the model-based approach. The measurement-based approach gathers and analyses performance metrics executing benchmark applications on computing resources. Instead, the model-based approach implies the design and implementation of a model as an abstraction of the real system, selecting only those aspects relevant to the study. This Thesis originates from a project carried out by the author within the CERN IT department. CERN is an international scientific laboratory that conducts fundamental research in the domain of elementary particle physics. The p...

  14. Providing a computing environment for a high energy physics workshop

    International Nuclear Information System (INIS)

    Nicholls, J.

    1991-03-01

    Although computing facilities have been provided at conferences and workshops remote from the host institution for some years, the equipment provided has rarely been capable of providing for much more than simple editing and electronic mail over leased lines. This presentation describes the pioneering effort involved by the Computing Department/Division at Fermilab in providing a local computing facility with world-wide networking capability for the Physics at Fermilab in the 1990's workshop held in Breckenridge, Colorado, in August 1989, as well as the enhanced facilities provided for the 1990 Summer Study on High Energy Physics at Snowmass, Colorado, in June/July 1990. Issues discussed include type and sizing of the facilities, advance preparations, shipping, on-site support, as well as an evaluation of the value of the facility to the workshop participants

  15. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with the MATLAB software environment, the COCGT (Cluster for Optimizing Computing in Gamma-ray Transmission methods), is implemented. The implementation corresponds to the creation of a local network of computers, the installation and configuration of software, and the execution of cluster tests to determine and optimize data-processing performance. The COCGT implementation was required for the computation of data from gamma transmission measurements applied to fluid dynamics and tomographic reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulated data. As an initial test, the determination of the SVD (Singular Value Decomposition) of a random matrix of dimension (n, n), n = 1000, using a modified form of Girko's law, revealed that COCGT was faster than the cluster in the literature [1], which is similar and operates under the same conditions. Solution of a system of linear equations provided a further test of COCGT performance: processing a square matrix with n = 10000 took 27 s, and a square matrix with n = 12000 took 45 s. To determine the cluster's behaviour with respect to 'parfor' (parallel for-loop) and 'spmd' (single program, multiple data), two codes containing those two commands were applied to the same problem: determination of the SVD of a square matrix with n = 1000. Execution of the codes on COCGT showed that: 1) for the code with 'parfor', performance improved as the number of labs increased from 1 to 8; 2) for the code with 'spmd', just 1 lab (core) was enough to process and return results in less than 1 s. A similar situation was then examined, with the difference that the SVD was determined for a square matrix with n = 1500 for the 'parfor' code and n = 7000 for the 'spmd' code. These results lead to the conclusions that: 1) for the code with 'parfor', the behaviour was the same as described above; 2) for the code with 'spmd', besides yielding higher performance, it supports a
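
    MATLAB's 'parfor'/'spmd' constructs are not reproduced here, but the shape of the benchmark translates directly; a rough Python analogue of the repeated-SVD timing test (matrix sizes and worker count are illustrative, not the COCGT configuration):

        import time
        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def svd_of_random(n):
            a = np.random.default_rng().random((n, n))
            return np.linalg.svd(a, compute_uv=False)[0]   # largest singular value

        if __name__ == "__main__":
            tasks = [500] * 8
            t0 = time.perf_counter()
            serial = [svd_of_random(n) for n in tasks]     # serial baseline
            t1 = time.perf_counter()
            with ProcessPoolExecutor(max_workers=4) as pool:
                parallel = list(pool.map(svd_of_random, tasks))
            t2 = time.perf_counter()
            print(f"serial {t1 - t0:.2f}s, 4 workers {t2 - t1:.2f}s")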

  16. A computer program incorporating Pitzer's equations for calculation of geochemical reactions in brines

    Science.gov (United States)

    Plummer, Niel; Parkhurst, D.L.; Fleming, G.W.; Dunkle, S.A.

    1988-01-01

    The program named PHRQPITZ is a computer code capable of making geochemical calculations in brines and other electrolyte solutions at high concentrations, using the Pitzer virial-coefficient approach for activity-coefficient corrections. Reaction-modeling capabilities include calculation of (1) aqueous speciation and mineral-saturation index, (2) mineral solubility, (3) mixing and titration of aqueous solutions, (4) irreversible reactions and mineral-water mass transfer, and (5) reaction path. The computed results for each aqueous solution include the osmotic coefficient, water activity, mineral saturation indices, mean activity coefficients, total activity coefficients, and scale-dependent values of pH, individual-ion activities, and individual-ion activity coefficients. A database of Pitzer interaction parameters is provided at 25 °C for the system Na-K-Mg-Ca-H-Cl-SO4-OH-HCO3-CO3-CO2-H2O, and extended to include largely untested literature data for Fe(II), Mn(II), Sr, Ba, Li, and Br, with provision for calculations at temperatures other than 25 °C. An extensive literature review of published Pitzer interaction parameters for many inorganic salts is given. Also described is an interactive input code for PHRQPITZ called PITZINPT. (USGS)
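
    One of the quantities the program reports, the mineral saturation index, has a compact definition, SI = log10(IAP/Ksp); a minimal sketch with invented ion activities and an approximate literature log Ksp for gypsum (neither taken from PHRQPITZ):

        import math

        a_Ca, a_SO4, a_H2O = 3.2e-3, 2.9e-3, 0.98   # activities (hypothetical)
        log_Ksp_gypsum = -4.58                       # CaSO4.2H2O, ~25 °C

        IAP = a_Ca * a_SO4 * a_H2O**2                # ion activity product
        SI = math.log10(IAP) - log_Ksp_gypsum
        print(f"SI(gypsum) = {SI:.2f}")   # < 0 undersaturated, > 0 supersaturated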

  17. Kinetics of the high-temperature combustion reactions of dibutylether using composite computational methods

    KAUST Repository

    Rachidi, Mariam El

    2015-01-01

    This paper investigates the high-temperature combustion kinetics of n-dibutyl ether (n-DBE), including unimolecular decomposition, H-abstraction by H, H-migration, and C–C/C–O β-scission reactions of the DBE radicals. The energetics of H-abstraction by OH radicals is also studied. All rates are determined computationally using the CBS-QB3 and G4 composite methods in conjunction with conventional transition state theory. The B3LYP/6-311++G(2df,2pd) method is used to optimize the geometries and calculate the frequencies of all reactive species and transition states for use in ChemRate. Some of the rates calculated in this study differ markedly from those obtained for similar reactions of alcohols or alkanes, particularly those pertaining to unimolecular decomposition and β-scission at the α-β C–C bond. These variations show that analogies to alkanes and alcohols are, in some cases, inappropriate means of estimating the reaction rates of ethers. This emphasizes the need to establish valid rates through computation or experimentation. Such studies are especially important given that ethers exhibit promising biofuel and fuel-additive characteristics. © 2014.
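
    Conventional transition state theory, as used in the paper, gives k(T) = (kB·T/h)·exp(-ΔG‡/RT); a minimal sketch with a placeholder barrier height rather than one of the paper's computed values:

        import math

        kB = 1.380649e-23     # J/K
        h  = 6.62607015e-34   # J*s
        R  = 8.314462618      # J/(mol*K)

        def k_tst(T, dG_act_kcal_per_mol):
            dG = dG_act_kcal_per_mol * 4184.0        # kcal/mol -> J/mol
            return (kB * T / h) * math.exp(-dG / (R * T))

        for T in (1000.0, 1500.0):                   # combustion-range temperatures
            print(f"T = {T:.0f} K  k = {k_tst(T, 30.0):.3e} s^-1")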

  18. On-line computing in a classified environment

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.

    1982-01-01

    Westinghouse Hanford Company (WHC) recently developed a Department of Energy (DOE) approved real-time, on-line computer system to control nuclear material. The system simultaneously processes both classified and unclassified information. Implementation of this system required the application of many security techniques. The system has a secure but user-friendly interface. Many software applications protect the integrity of the database from malevolent or accidental errors. Programming practices ensure the integrity of the computer system software. The audit trail and the report generation capability record user actions and the status of the nuclear material inventory

  19. Reactions of cisplatin with cysteine and methionine at constant pH; a computational study.

    Science.gov (United States)

    Zimmermann, Tomás; Burda, Jaroslav V

    2010-02-07

    Interactions of the hydrated cisplatin complexes cis-[Pt(NH3)2Cl(H2O)]+ and cis-[Pt(NH3)2(OH)(H2O)]+ with cysteine and methionine in aqueous solution at constant pH were explored using computational methods. Thermodynamic parameters of the considered reactions were studied over a broad pH range, taking up to four protonation states of each molecule into account. Reaction free energies at constant pH were obtained from standard Gibbs free energies using the Legendre transformation. Solvation free energies and pKa values were calculated using the PCM model with UAHF cavities, recently adapted by us for transition metal complexes. The root mean square error of pKa values on a set of model platinum complexes and amino acids was equal to 0.74. At pH 7, the transformed Gibbs free energies differ by up to 15 kcal mol⁻¹ from the Gibbs free energies of model reactions with a constant number of protons. As for cysteine, the calculations confirmed a strong preference for κS monodentate bonding over a broad pH range. The most stable product of the second reaction step, which proceeds from the monodentate to the chelate complex, is the κ²S,N-coordinated chelate. The reaction with methionine is more complex: in the first step, all three considered methionine donor atoms (N, S and O) give thermodynamically preferred products, depending on the platinum complex and the pH. This is in accordance with the experimental observation of a pH-dependent migration between N and S donor atoms in a chemically related system. The most stable chelates of platinum with methionine are the κ²S,N- and κ²N,O-bonded complexes. The comparison of the reaction free energies of both amino acids suggests that the bidentate methionine ligand can be displaced even by the monodentate cysteine ligand under certain conditions.
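
    The constant-pH transformation has a compact form: with the usual convention μ°(H+) = 0, each proton released contributes -RT·ln(10)·pH to the reaction free energy. A sketch with illustrative numbers (not values from the paper):

        import math

        R = 8.314462618e-3    # kJ/(mol*K)

        def dG_at_pH(dG_standard_kJ, n_H_released, pH, T=298.15):
            """Legendre-transformed reaction free energy at fixed pH."""
            return dG_standard_kJ - n_H_released * R * T * math.log(10) * pH

        # hypothetical substitution step releasing one proton:
        print(f"{dG_at_pH(-20.0, n_H_released=1, pH=7.0):.1f} kJ/mol")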

  20. Occurrence, dynamics and reactions of organic pollutants in the indoor environment

    Energy Technology Data Exchange (ETDEWEB)

    Salthammer, Tunga [Material Analysis and Indoor Chemistry, Fraunhofer Wilhelm-Klauditz Institut (WKI), Braunschweig (Germany); Bahadir, Muefit [Institut fuer Oekologische Chemie und Abfallanalytik, Technische Universitaet Braunschweig, Braunschweig (Germany)

    2009-06-15

    The indoor environment is a multidisciplinary scientific field involving chemistry, physics, biology, health sciences, architecture, building sciences and civil engineering. The need for reliable assessment of human exposure to indoor pollutants is attracting increasing attention. This, however, requires a detailed understanding of the relevant compounds, their sources, physical and chemical properties, dynamics, reactions, their distribution among the gas phase, airborne particles and settled dust as well as the availability of modern measurement techniques. Building products, furnishings and other indoor materials often emit volatile and semi-volatile organic compounds. With respect to a healthy indoor environment, only low emitting products, which do not influence indoor air quality in a negative way, should be used in a building. Therefore, materials and products for indoor use need to be evaluated for their chemical emissions. This is routinely done in test chambers and cells. Many studies have shown that the types of sources in occupational and residential indoor environments, the spectrum of emitting compounds and the duration of emission cover a wide range. The demand for standardized test methods under laboratory conditions has resulted in several guidelines for determination of emission rates. Furthermore, it has now been recognized that both primary and secondary emissions may affect indoor air quality. The problem may become more dominant when components of different materials can react with each other or when catalytic materials are applied. Such products derived from indoor related reactions may have a negative impact on indoor air quality due to their low odor threshold, health related properties or the formation of ultrafine particles. Several factors can influence the emission characteristics and numerous investigations have shown that indoor chemistry is of particular importance for the indoor related characterization of building product emissions

  1. Cloud Computing E-Communication Services in the University Environment

    Science.gov (United States)

    Babin, Ron; Halilovic, Branka

    2017-01-01

    The use of cloud computing services has grown dramatically in post-secondary institutions in the last decade. In particular, universities have been attracted to the low-cost and flexibility of acquiring cloud software services from Google, Microsoft and others, to implement e-mail, calendar and document management and other basic office software.…

  2. Music Teachers' Experiences in One-to-One Computing Environments

    Science.gov (United States)

    Dorfman, Jay

    2016-01-01

    Ubiquitous computing scenarios such as the one-to-one model, in which every student is issued a device that is to be used across all subjects, have increased in popularity and have shown both positive and negative influences on education. Music teachers in schools that adopt one-to-one models may be inadequately equipped to integrate this kind of…

  3. Formal and Informal Learning in a Computer Clubhouse Environment

    Science.gov (United States)

    McDougall, Anne; Lowe, Jenny; Hopkins, Josie

    2004-01-01

    This paper outlines the establishment and running of an after-school Computer Clubhouse, describing aspects of the leadership, mentoring and learning activities undertaken there. Research data has been collected from examination of documents associated with the Clubhouse, interviews with its founders, Director, session leaders and mentors, and…

  4. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

    We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90Zr(p,p') at 80 MeV, 209Bi(p,p') at 62 MeV, and 93Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data.

  5. Reaction

    African Journals Online (AJOL)

    abp

    19 October 2017. Reaction to Mohamed Said Nakhli et al. concerning the article "When the axillary block remains the only alternative in a 5-year-old child".

  6. Computer simulation of the steam-graphite reaction under isothermal and steady-state conditions

    International Nuclear Information System (INIS)

    Joy, D.S.; Stem, S.C.

    1975-05-01

    A mathematical model was formulated to describe the isothermal, steady-state diffusion and reaction of steam in a graphite matrix. A generalized Langmuir-Hinshelwood equation is used to represent the steam-graphite reaction rate. The model also includes diffusion in the gas phase adjacent to the graphite matrix. A computer program, written to numerically integrate the resulting differential equations, is described. The coupled nonlinear differential equations in the graphite phase are solved using the IBM Continuous System Modeling Program. Classical finite difference techniques are used for the gas-phase calculations. An iterative procedure is required to couple the two sets of calculations. Several sample problems are presented to demonstrate the utility of the model. (U.S.)
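
    The model's core, steady-state diffusion balanced against a Langmuir-Hinshelwood-type consumption rate, can be posed as a two-point boundary value problem; in the sketch below, SciPy's solve_bvp stands in for the original CSMP/finite-difference machinery, and all parameter values are invented.

        import numpy as np
        from scipy.integrate import solve_bvp

        # D*C'' = k*C/(1 + K*C) on a 1-D slab; C(0) = C_s, C'(L) = 0.
        D, k, K, C_s, L = 1e-6, 5e-4, 2.0, 1.0, 1e-2

        def rhs(x, y):                      # y[0] = C, y[1] = dC/dx
            return np.vstack([y[1], (k / D) * y[0] / (1.0 + K * y[0])])

        def bc(ya, yb):
            return np.array([ya[0] - C_s, yb[1]])

        x = np.linspace(0.0, L, 50)
        y0 = np.vstack([np.full_like(x, C_s), np.zeros_like(x)])
        sol = solve_bvp(rhs, bc, x, y0)
        print(sol.y[0][-1])                 # steam concentration at the sealed face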

  7. DCHAIN: A user-friendly computer program for radioactive decay and reaction chain calculations

    International Nuclear Information System (INIS)

    East, L.V.

    1994-05-01

    A computer program for calculating the time-dependent daughter populations in radioactive decay and nuclear reaction chains is described. Chain members can have non-zero initial populations and be produced from the preceding chain member as the result of radioactive decay, a nuclear reaction, or both. As presently implemented, chains can contain up to 15 members. Program input can be supplied interactively or read from ASCII data files. Time units for half-lives, etc. can be specified during data entry. Input values are verified and can be modified if necessary, before used in calculations. Output results can be saved in ASCII files in a format suitable for including in reports or other documents. The calculational method, described in some detail, utilizes a generalized form of the Bateman equations. The program is written in the C language in conformance with current ANSI standards and can be used on multiple hardware platforms
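
    The generalized Bateman solution the abstract refers to can be written down directly for a chain with distinct decay constants and arbitrary initial populations; a compact sketch assuming no branching:

        import numpy as np

        def bateman(N0, lambdas, t):
            """Populations of each chain member at time t (distinct lambdas)."""
            n = len(lambdas)
            N = np.zeros(n)
            for m in range(n):                        # member being evaluated
                for i in range(m + 1):                # feed-in from member i
                    growth = np.prod(lambdas[i:m]) if m > i else 1.0
                    s = sum(np.exp(-lambdas[j] * t)
                            / np.prod([lambdas[k] - lambdas[j]
                                       for k in range(i, m + 1) if k != j])
                            for j in range(i, m + 1))
                    N[m] += N0[i] * growth * s
            return N

        # Mo-99 -> Tc-99m (half-lives 65.94 h and 6.01 h), only Mo-99 at t = 0
        lam = np.array([np.log(2) / 65.94, np.log(2) / 6.01])
        print(bateman(np.array([1.0, 0.0]), lam, t=24.0))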

  8. Orion Exploration Flight Test Reaction Control System Jet Interaction Heating Environment from Flight Data

    Science.gov (United States)

    White, Molly E.; Hyatt, Andrew J.

    2016-01-01

    The Orion Multi-Purpose Crew Vehicle (MPCV) Reaction Control System (RCS) is critical to guide the vehicle along the desired trajectory during re-entry. However, this system has a significant impact on the convective heating environment to the spacecraft. Heating augmentation from the jet interaction (JI) drives thermal protection system (TPS) material selection and thickness requirements for the spacecraft. This paper describes the heating environment from the RCS on the afterbody of the Orion MPCV during Orion's first flight test, Exploration Flight Test 1 (EFT-1). These jet plumes interact with the wake of the crew capsule and cause an increase in the convective heating environment. Not only is there widespread influence from the jet banks, there may also be very localized effects. The firing history during EFT-1 will be summarized to assess which jet bank interaction was measured during flight. Heating augmentation factors derived from the reconstructed flight data will be presented. Furthermore, flight instrumentation across the afterbody provides the highest spatial resolution of the region of influence of the individual jet banks of any spacecraft yet flown. This distribution of heating augmentation across the afterbody will be derived from the flight data. Additionally, trends with possible correlating parameters will be investigated to assist future designs and ground testing programs. Finally, the challenges of measuring JI, applying this data to future flights and lessons learned will be discussed.

  9. Computational Photophysics in the Presence of an Environment

    Science.gov (United States)

    Nogueira, Juan J.; González, Leticia

    2018-04-01

    Most processes triggered by ultraviolet (UV) or visible (vis) light in nature take place in complex biological environments. The first step in these photophysical events is the excitation of the absorbing system or chromophore to an electronically excited state. Such an excitation can be monitored by the UV-vis absorption spectrum. A precise calculation of the UV-vis spectrum of a chromophore embedded in an environment is a challenging task that requires the consideration of several ingredients, besides an accurate electronic-structure method for the excited states. Two of the most important are an appropriate description of the interactions between the chromophore and the environment and accounting for the vibrational motion of the whole system. In this contribution, we review the most common theoretical methodologies to describe the environment (including quantum mechanics/continuum and quantum mechanics/molecular mechanics models) and to account for vibrational sampling (including Wigner sampling and molecular dynamics). Further, we illustrate in a series of examples how the lack of these ingredients can lead to a wrong interpretation of the electronic features behind the UV-vis absorption spectrum.
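
    As one concrete ingredient of the vibrational-sampling step, ground-state Wigner sampling of a single harmonic mode draws positions and momenta from independent Gaussians; a minimal sketch in atomic units (hbar = 1, frequency value illustrative):

        import numpy as np

        rng = np.random.default_rng(42)

        def wigner_sample(omega, n_samples):
            q = rng.normal(0.0, np.sqrt(1.0 / (2.0 * omega)), n_samples)
            p = rng.normal(0.0, np.sqrt(omega / 2.0), n_samples)
            return q, p

        omega = 0.01                        # mode frequency (hartree)
        q, p = wigner_sample(omega, 1000)
        energy = 0.5 * p**2 + 0.5 * omega**2 * q**2
        print(energy.mean())                # ~ omega/2, the zero-point energy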

  10. Multiscale Computing with the Multiscale Modeling Library and Runtime Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Groen, D.; Ben Belgacem, M.; Kurowski, K.; Hoekstra, A.G.

    2013-01-01

    We introduce a software tool to simulate multiscale models: the Multiscale Coupling Library and Environment 2 (MUSCLE 2). MUSCLE 2 is a component-based modeling tool inspired by the multiscale modeling and simulation framework, with an easy-to-use API which supports Java, C++, C, and Fortran. We

  11. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. All
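
    A machine-readable enzyme definition of the kind described might look like the hypothetical sketch below: a class holding donor, linkage and acceptor specificity, with single-reaction inference as a pattern match. Real glycan handling in the framework is far richer than this string-based toy.

        from dataclasses import dataclass

        @dataclass
        class GlycoEnzyme:
            name: str
            donor: str           # monosaccharide added
            linkage: str         # e.g. "b1-4"
            acceptor_end: str    # required non-reducing-end pattern

            def act(self, glycan: str):
                if glycan.startswith(self.acceptor_end):
                    return f"{self.donor}({self.linkage}){glycan}"
                return None      # substrate specificity not met

        b4galt = GlycoEnzyme("B4GALT1", donor="Gal", linkage="b1-4",
                             acceptor_end="GlcNAc")
        print(b4galt.act("GlcNAc(b1-2)Man(a1-3)Man"))   # inferred product
        print(b4galt.act("Man(a1-6)Man"))               # None: cannot act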

  12. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  13. Computational study of chain transfer to monomer reactions in high-temperature polymerization of alkyl acrylates.

    Science.gov (United States)

    Moghadam, Nazanin; Liu, Shi; Srinivasan, Sriraj; Grady, Michael C; Soroush, Masoud; Rappe, Andrew M

    2013-03-28

    This article presents a computational study of chain transfer to monomer (CTM) reactions in self-initiated high-temperature homopolymerization of alkyl acrylates (methyl, ethyl, and n-butyl acrylate). Several mechanisms of CTM are studied. The effects of the length of live polymer chains and the type of monoradical that initiated the live polymer chains on the energy barriers and rate constants of the involved reaction steps are investigated theoretically. All calculations are carried out using density functional theory. Three types of hybrid functionals (B3LYP, X3LYP, and M06-2X) and four basis sets (6-31G(d), 6-31G(d,p), 6-311G(d), and 6-311G(d,p)) are applied to predict the molecular geometries of the reactants, products and transition states, and the energy barriers. Transition state theory is used to estimate rate constants. The results indicate that abstraction of a hydrogen atom (by live polymer chains) from the methyl group in methyl acrylate, the methylene group in ethyl acrylate, and methylene groups in n-butyl acrylate are the most likely mechanisms of CTM. Also, the rate constants of CTM reactions calculated using M06-2X are in good agreement with those estimated from polymer sample measurements using macroscopic mechanistic models. The rate constant values do not change significantly with the length of live polymer chains. Abstraction of a hydrogen atom by a tertiary radical has a higher energy barrier than abstraction by a secondary radical, which agrees with experimental findings. The calculated and experimental NMR spectra of dead polymer chains produced by CTM reactions are comparable. This theoretical/computational study reveals that CTM occurs most likely via hydrogen abstraction by live polymer chains from the methyl group of methyl acrylate and methylene group(s) of ethyl (n-butyl) acrylate.

  14. Catalytic Upgrading of Biomass-Derived Compounds via C-C Coupling Reactions. Computational and Experimental Studies of Acetaldehyde and Furan Reactions in HZSM-5

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong [Argonne National Lab. (ANL), Argonne, IL (United States); Evans, Tabitha J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cheng, Lei [Argonne National Lab. (ANL), Argonne, IL (United States); Nimlos, Mark R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukarakate, Calvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Robichaud, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Assary, Rajeev S. [Argonne National Lab. (ANL), Argonne, IL (United States); Curtiss, Larry A. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-02

    Catalytic C–C coupling and deoxygenation reactions are essential for the upgrading of biomass-derived oxygenates to fuel-range hydrocarbons. Detailed understanding of the mechanistic and energetic aspects of these reactions is crucial to enabling and improving the catalytic upgrading of small oxygenates to useful chemicals and fuels. Using periodic density functional theory (DFT) calculations, we have investigated the reactions of furan and acetaldehyde in an HZSM-5 zeolite catalyst, a representative system associated with the catalytic upgrading of pyrolysis vapors. Comprehensive energy profiles were computed for self-reactions (i.e., acetaldehyde coupling and furan coupling) and cross-reactions (i.e., acetaldehyde + furan) of this representative mixture. Major products proposed from the computations are further confirmed using temperature-controlled mass spectrometry measurements. Moreover, the computational results show that furan interacts with acetaldehyde in HZSM-5 via an alkylation mechanism, which is more favorable than the self-reactions, indicating that mixing furans with aldehydes could be a promising approach to maximize effective C–C coupling and dehydration while reducing catalyst deactivation (e.g., coke formation) from aldehyde condensation.

  15. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, and a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  16. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    Science.gov (United States)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which occurs frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To do this, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphic editor expresses contexts as execution conditions of a new service through a context model based on ontology. The service platform handles the service scenario according to contexts. With the suggested service model, a user in urban computing environments can quickly and easily compose u-services or new services using smart devices.

  17. When three traits make a line: evolution of phenotypic plasticity and genetic assimilation through linear reaction norms in stochastic environments.

    Science.gov (United States)

    Ergon, T; Ergon, R

    2017-03-01

    Genetic assimilation emerges from selection on phenotypic plasticity. Yet, commonly used quantitative genetics models of linear reaction norms considering intercept and slope as traits do not mimic the full process of genetic assimilation. We argue that intercept-slope reaction norm models are insufficient representations of genetic effects on linear reaction norms and that considering reaction norm intercept as a trait is unfortunate because the definition of this trait relates to a specific environmental value (zero) and confounds genetic effects on reaction norm elevation with genetic effects on environmental perception. Instead, we suggest a model with three traits representing genetic effects that, respectively, (i) are independent of the environment, (ii) alter the sensitivity of the phenotype to the environment and (iii) determine how the organism perceives the environment. The model predicts that, given sufficient additive genetic variation in environmental perception, the environmental value at which reaction norms tend to cross will respond rapidly to selection after an abrupt environmental change, and eventually becomes equal to the new mean environment. This readjustment of the zone of canalization becomes completed without changes in genetic correlations, genetic drift or imposing any fitness costs of maintaining plasticity. The asymptotic evolutionary outcome of this three-trait linear reaction norm generally entails a lower degree of phenotypic plasticity than the two-trait model, and maximum expected fitness does not occur at the mean trait values in the population. © 2016 The Authors. Journal of Evolutionary Biology published by John Wiley & Sons Ltd on behalf of European Society for Evolutionary Biology.
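
    The proposed norm is z = a + b·(ε - c), with a environment-independent, b the sensitivity and c the perceived reference environment; the toy simulation below (all parameter values invented) illustrates the predicted readjustment of the zone of canalization after an abrupt environmental shift.

        import numpy as np

        rng = np.random.default_rng(1)
        n, new_mean_env = 2000, 3.0
        a = rng.normal(0.0, 0.3, n)          # environment-independent effect
        b = rng.normal(1.0, 0.1, n)          # sensitivity (slope)
        c = rng.normal(0.0, 0.5, n)          # perceived reference environment

        for gen in range(60):
            env = rng.normal(new_mean_env, 1.0, n)     # shifted environment
            z = a + b * (env - c)                      # phenotypes
            optimum = env - new_mean_env               # optimum tracks the shift
            fit = np.exp(-0.5 * (z - optimum) ** 2)
            parents = rng.choice(n, n, p=fit / fit.sum())
            a, b, c = (x[parents] + rng.normal(0, 0.02, n) for x in (a, b, c))

        print(round(c.mean(), 2))   # mean c has moved toward the new mean environment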

  18. Multi-Language Programming Environments for High Performance Java Computing

    OpenAIRE

    Vladimir Getov; Paul Gray; Sava Mintchev; Vaidy Sunderam

    1999-01-01

    Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community's reliance on established scientific packages. As a consequence, programmers of high-performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java-to-C Interface (JCI) tool which provides ...

  19. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kolos

    2014-06-01

    Full Text Available The study substantiates the need for a systematic study of the functioning of a computer-oriented learning environment for post-degree pedagogical education; it defines the term "functional model of a computer-oriented learning environment of post-degree pedagogical education"; and it builds such a functional model in accordance with the functions of business, information and communication technology, academic and administrative staff, and the peculiarities of training courses for teachers.

  20. SWAAM-LT: The long-term, sodium/water reaction analysis method computer code

    International Nuclear Information System (INIS)

    Shin, Y.W.; Chung, H.H.; Wiedermann, A.H.; Tanabe, H.

    1993-01-01

    The SWAAM-LT Code, developed for analysis of long-term effects of sodium/water reactions, is discussed. The theoretical formulation of the code is described, including the introduction of system matrices for ease of computer programming as a general system code. Also, some typical results of the code predictions for available large scale tests are presented. Test data for the steam generator design with the cover-gas feature and without the cover-gas feature are available and analyzed. The capabilities and limitations of the code are then discussed in light of the comparison between the code prediction and the test data

  1. Computational Investigation of the Competition between the Concerted Diels-Alder Reaction and Formation of Diradicals in Reactions of Acrylonitrile with Non-Polar Dienes

    Science.gov (United States)

    James, Natalie C.; Um, Joann M.; Padias, Anne B.; Hall, H. K.; Houk, K. N.

    2013-01-01

    The energetics of the Diels-Alder cycloaddition reactions of several 1,3-dienes with acrylonitrile, and the energetics of formation of diradicals, were investigated with density functional theory (B3LYP and M06-2X) and compared to experimental data. For the reaction of 2,3-dimethyl-1,3-butadiene with acrylonitrile, the concerted reaction is favored over the diradical pathway by 2.5 kcal/mol using B3LYP/6-31G(d); experimentally this reaction gives both cycloadduct and copolymer. The concerted cycloaddition of cyclopentadiene with acrylonitrile is preferred computationally over the stepwise pathway by 5.9 kcal/mol; experimentally, only the Diels-Alder adduct is formed. For the reactions of (E)-1,3-pentadiene and acrylonitrile, both cycloaddition and copolymerization were observed experimentally; these trends were mimicked by the computational results, which showed only a 1.2 kcal/mol preference for the concerted pathway. For the reactions of (Z)-1,3-pentadiene and acrylonitrile, the stepwise pathway is preferred by 3.9 kcal/mol, in agreement with previous experimental findings that only polymerization occurs. M06-2X is known to give more accurate activation and reaction energetics but the energies of diradicals are too high. PMID:23758325

  2. A scalable computational framework for establishing long-term behavior of stochastic reaction networks.

    Directory of Open Access Journals (Sweden)

    Ankit Gupta

    2014-06-01

    Full Text Available Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed.
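
    For intuition about the property being certified, the simplest ergodic network (a birth-death process, 0 -> X at rate k and X -> 0 at rate g·X, with stationary distribution Poisson(k/g)) can be simulated directly with Gillespie's algorithm; note that the paper itself establishes ergodicity via linear programs rather than by simulation.

        import numpy as np

        rng = np.random.default_rng(0)
        k, g = 10.0, 1.0                   # birth and per-capita death rates
        x, t, t_end, area = 0, 0.0, 5000.0, 0.0

        while t < t_end:
            rates = np.array([k, g * x])
            total = rates.sum()
            dt = rng.exponential(1.0 / total)
            area += x * dt                 # time-weighted population
            t += dt
            x += 1 if rng.random() < rates[0] / total else -1

        print(area / t_end)                # ~ k/g = 10, the stationary mean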

  3. A Real-Time Reaction Obstacle Avoidance Algorithm for Autonomous Underwater Vehicles in Unknown Environments.

    Science.gov (United States)

    Yan, Zheping; Li, Jiyun; Zhang, Gengshi; Wu, Yi

    2018-02-02

    A novel real-time reaction obstacle avoidance algorithm (RRA) is proposed for autonomous underwater vehicles (AUVs) that must adapt to unknown complex terrains, based on forward-looking sonar (FLS). To accomplish this algorithm, obstacle avoidance rules are planned, and the RRA process is split into five steps so that AUVs can rapidly respond to various environmental obstacles. The largest polar angle algorithm (LPAA) is designed to change a detected obstacle's irregular outline into a convex polygon, which simplifies the obstacle avoidance process. A solution based on an outline memory algorithm is designed to solve the trapping problem that arises when avoiding U-shaped obstacles. Finally, simulations in three unknown obstacle scenes are carried out to demonstrate the performance of this algorithm; the obtained obstacle avoidance trajectories are safe, smooth and near-optimal.
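
    The LPAA's polar-angle bookkeeping is not given in the abstract; the standard monotone-chain convex hull below performs the same outline-to-convex-polygon simplification on 2-D sonar points and is offered only as a stand-in.

        def convex_outline(points):
            """Convex polygon (counter-clockwise) enclosing an obstacle outline."""
            pts = sorted(set(points))
            if len(pts) <= 2:
                return pts

            def cross(o, a, b):
                return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

            lower, upper = [], []
            for p in pts:                          # lower hull
                while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                    lower.pop()
                lower.append(p)
            for p in reversed(pts):                # upper hull
                while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                    upper.pop()
                upper.append(p)
            return lower[:-1] + upper[:-1]

        sonar_hits = [(0, 0), (2, 1), (1, 1), (3, 0), (1, 3), (2, 2)]
        print(convex_outline(sonar_hits))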

  4. Synthesis of antimicrobial silver nanoparticles through a photomediated reaction in an aqueous environment

    Directory of Open Access Journals (Sweden)

    Banasiuk R

    2016-01-01

    Full Text Available Rafał Banasiuk,1,* Joanna E Frackowiak,2,* Marta Krychowiak,1 Marta Matuszewska,1 Anna Kawiak,1 Magdalena Ziabka,3 Zofia Lendzion-Bielun,4 Magdalena Narajczyk,5 Aleksandra Krolicka1 1Department of Biotechnology, Intercollegiate Faculty of Biotechnology, University of Gdansk and Medical University of Gdansk, 2Department of Pathophysiology, Medical University of Gdansk, Gdansk, 3Faculty of Materials Science and Ceramics, Department of Ceramics and Refractories, AGH-University of Science and Technology, Kraków, 4Institute of Chemical and Environment Engineering, West Pomeranian University of Technology, Szczecin, 5Faculty of Biology, Laboratory of Electron Microscopy, University of Gdansk, Gdansk, Poland *These authors contributed equally to this work Abstract: A fast, economical, and reproducible method for nanoparticle synthesis has been developed in our laboratory. The reaction is performed in an aqueous environment and utilizes light emitted by commercially available 1 W light-emitting diodes (λ = 420 nm) as the catalyst. This method does not require nanoparticle seeds or toxic chemicals. The irradiation process is carried out for a period of up to 10 minutes, significantly reducing the time required for synthesis as well as environmental impact. By modulating various reaction parameters, silver nanoparticles were obtained which were predominantly either spherical or cubic. The produced nanoparticles demonstrated strong antimicrobial activity toward the examined bacterial strains. Additionally, testing the effect of silver nanoparticles on the human keratinocyte cell line and human peripheral blood mononuclear cells revealed that their cytotoxicity may be limited by modulating the employed concentrations of nanoparticles. Keywords: antimicrobial activity, green synthesis, nanocubes, nanospheres

  5. Gas-to-particle conversion in the atmospheric environment by radiation-induced and photochemical reactions

    International Nuclear Information System (INIS)

    Vohra, K.G.

    1975-01-01

    During the last few years a fascinating new area of research involving ionizing radiation and photochemistry in gas-to-particle conversion in the atmosphere has been developing at a rapid pace. Two problems of major interest and concern in which this is of paramount importance are: (1) radiation-induced and photochemical aerosol formation in the stratosphere and (2) the role of radiation and photochemistry in smog formation. The peak in cosmic ray intensity and the significant solar UV flux in the stratosphere lead to a complex variety of reactions involving major and trace constituents in this region of the atmosphere, and some of these reactions are of vital importance in aerosol formation. The problem is of great current interest because pollutant gases from industrial sources and future SST operations entering the stratosphere could increase the aerosol burden in the stratosphere and affect the solar energy input of the troposphere, with consequent ecological and climatic changes. On the other hand, in the nuclear era, atmospheric releases from reactors and processing plants could lead to changes in the cloud nucleation behaviour of the environment and a possible increase in smog formation in areas with significant levels of radiation and conventional pollutants. A review of the earlier work, the current status of the problem, and some recent results of the experiments conducted in the author's laboratory are presented. The possible mechanisms of gas-to-particle conversion in the atmosphere are explained

  6. Analysis of the potential geochemical reactions in the Enceladus' hydrothermal environment

    Science.gov (United States)

    Ramirez-Cabañas, A. K.; Flandes, A.

    2017-12-01

    Enceladus is the sixth largest moon of Saturn and differs from the other moons because of the cryovolcanic geysers that emanate from its south pole. The instruments of the Cassini spacecraft reveal different compounds in the gases and the dust of the geysers, such as salts (sodium chloride, sodium bicarbonate and/or sodium carbonate), as well as silica traces (Postberg et al., 2008, 2009) that could be the result of a hydrothermal environment (Hsu et al., 2014, Sekine et al., 2014). By means of a thermodynamic analysis, we propose and evaluate potential geochemical reactions that could occur through the interaction between the nucleus surface and the inner ocean of Enceladus. These reactions may well lead to the origin of the compounds found in the geysers. From this analysis, we propose that at least two mineral groups must be present in the chondritic nucleus of Enceladus: olivines (fayalite and forsterite) and feldspars (orthoclase and albite). Subsequently, taking as reference the hydrothermal processes that take place on Earth, we propose the different stages of a potential hydrothermal scenario for Enceladus.

  7. New computer and communications environments for light armored vehicles

    Science.gov (United States)

    Rapanotti, John L.; Palmarini, Marc; Dumont, Marc

    2002-08-01

    Light Armoured Vehicles (LAVs) are being developed to meet the modern requirements of rapid deployment and operations other than war. To achieve these requirements, passive armour is minimized and survivability depends more on sensors, computers and countermeasures to detect and avoid threats. The performance, reliability, and ultimately the cost of these components will be determined by the trends in computing and communications. These trends and their potential impact on DAS (Defensive Aids Suite) development were investigated and are reported in this paper. Vehicle performance is affected by communication with other vehicles and other ISTAR (Intelligence, Surveillance, Target Acquisition and Reconnaissance) battlefield assets. This investigation includes the networking technology Jini, developed by Sun Microsystems, which can be used to interface the vehicle to the ISTAR network. VxWorks, by Wind River Systems, is a real-time operating system designed for military systems and compatible with Jini. Other technologies affecting computer hardware development include dynamic reconfiguration, hot swap, alternate pathing, CompactPCI, and Fibre Channel serial communication. To achieve the necessary performance at reasonable cost, and over the long service life of the vehicle, a DAS should have two essential features. A 'fitted for, but not fitted with' approach will provide the necessary rapid deployment without a need to equip the entire fleet. With an expected vehicle service life of 50 years, 5-year technology upgrades can be used to maintain vehicle performance over the entire service life. A federation of modules, instead of integrated fused sensors, will provide the capability for incremental upgrades and mission configurability. A plug-and-play capability can be used for both hardware and expendables.

  8. Understanding the Offender/Environment Dynamic for Computer Crimes

    DEFF Research Database (Denmark)

    Willison, Robert Andrew

    2005-01-01

    practices by possibly highlighting new areas for safeguard implementation. To help facilitate a greater understanding of the offender/environment dynamic, this paper assesses the feasibility of applying criminological theory to the IS security context. More specifically, three theories are advanced, which focus... on the offender's behaviour in a criminal setting. Drawing on an account of the Barings Bank collapse, events highlighted in the case study are used to assess whether concepts central to the theories are supported by the data. It is noted that while one of the theories is to be found wanting in terms of conceptual...

  9. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling-sensitive structures. With this open-ended, hierarchical, interactive computer code, the user can successively access programs of increasing complexity from his workstation. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from the workstation with one of the current generation of two-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  10. Computational and experimental studies on stabilities, reactions and reaction rates of cations and ion-dipole complexes

    NARCIS (Netherlands)

    Ervasti, H.K.

    2008-01-01

    In this thesis, ion stability, ion-molecule reactions and reaction rates are studied using mass spectrometry and molecular modelling. In Chapter 2, the effect of functional group substitution on neutral and ionised ketene is studied. Electron-donating substituents show a stabilising positive

  11. On some limitations of reaction-diffusion chemical computers in relation to Voronoi diagram and its inversion

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Lacy Costello, Benjamin de

    2003-01-01

    A reaction-diffusion chemical computer in this context is a planar uniform chemical reactor, where data and the results of a computation are represented by concentration profiles of reactants and the computation itself is implemented via the spreading and interaction of diffusive and phase waves. This class of chemical computers is efficient at solving problems with a 'natural' parallelism, where data sets are decomposable onto a large number of geographically neighboring domains which are then processed in parallel. Typical problems of this type include image processing, geometrical transformations and optimisation. When chemical-based devices are used to solve such problems, questions arise regarding their reproducibility, efficiency and the accuracy of their computations. In addition to these questions, what are the limitations of reaction-diffusion chemical processors - what types of problems cannot currently, and are unlikely ever to, be solved? To answer these questions we study how a Voronoi diagram is constructed and how it is inverted in a planar chemical processor. We demonstrate that a Voronoi diagram is computed only partially in the chemical processor. We also prove that, given a specific Voronoi diagram, it is impossible to reconstruct the planar set (from which the diagram was computed) in the reaction-diffusion chemical processor. In the Letter we open the first-ever line of enquiry into the computational inability of reaction-diffusion chemical computers
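
    The wave-based construction described above can be mimicked digitally. The following Python sketch (our illustration, not code from the Letter) grows wavefronts from a set of seed points by multi-source breadth-first search, a discrete analogue of uniform wave propagation on a grid; cells where waves of different origin collide approximate the Voronoi bisectors:

```python
# Illustrative sketch only: approximating a planar Voronoi diagram by
# simultaneous wavefront growth from seed points, the principle a
# reaction-diffusion processor exploits chemically. Cells reached by waves
# of two different origins mark the (approximate) bisectors.
from collections import deque

def voronoi_by_waves(width, height, seeds):
    owner = {s: i for i, s in enumerate(seeds)}       # cell -> claiming seed index
    frontier = deque((s, i) for i, s in enumerate(seeds))
    bisector = set()
    while frontier:
        (x, y), i = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height:
                if (nx, ny) not in owner:
                    owner[(nx, ny)] = i               # wave i claims a fresh cell
                    frontier.append(((nx, ny), i))
                elif owner[(nx, ny)] != i:
                    bisector.add((nx, ny))            # two waves collide here
    return owner, bisector

cells, edges = voronoi_by_waves(40, 30, [(5, 5), (30, 10), (15, 25)])
print(len(edges), "approximate bisector cells")
```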

  12. Tablet computers and eBooks. Unlocking the potential for personal learning environments?

    NARCIS (Netherlands)

    Kalz, Marco

    2012-01-01

    Kalz, M. (2012, 9 May). Tablet computers and eBooks. Unlocking the potential for personal learning environments? Invited presentation during the annual conference of the European Association for Distance Learning (EADL), Noordwijkerhout, The Netherlands.

  13. Deception Detection in a Computer-Mediated Environment: Gender, Trust, and Training Issues

    National Research Council Canada - National Science Library

    Dziubinski, Monica

    2003-01-01

    This research draws on communication and deception literature to develop a conceptual model proposing relationships between deception detection abilities in a computer-mediated environment, gender, trust, and training...

  14. Computer code PRECIP-II for the calculation of Zr-steam reaction

    International Nuclear Information System (INIS)

    Suzuki, Motoye; Kawasaki, Satoru; Furuta, Teruo

    1978-06-01

    The computer code PRECIP-II, a modification of S. Malang's SIMTRAN-I, was developed to calculate the Zr-steam reaction under LOCA conditions. The following were improved: 1. the treatment of boundary conditions at the alpha/beta phase interface during temperature decrease; 2. the method of time-mesh control; 3. the number of input-controllable parameters and the output format. These improvements made physically reasonable calculations possible for an increased number of temperature history patterns, including the cladding temperature excursion assumed during a LOCA. Calculations were made along various transient temperature histories, with the parameters modified so as to enable fitting of the numerical results for weight gain, oxide thickness and alpha-phase thickness in isothermal reactions to the experimental data. The computed results were then compared with the corresponding experimental values, which revealed that most of the differences lie within ±10%. The effect of slow cooling on the ductility change of Zircaloy-4 was investigated with some of the oxidized specimens by a ring compression test; the effect is only slight. (auth.)
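
    The kinetics that codes of this family integrate are parabolic: the squared oxide (or layer) thickness grows linearly in time with an Arrhenius rate constant, d(x^2)/dt = A·exp(-Q/(R·T(t))). Below is a minimal Python sketch of that integration over a LOCA-like transient; A and Q are illustrative placeholders, not the fitted PRECIP-II values:

```python
# Hedged sketch of the parabolic rate law that Zr-steam oxidation codes
# integrate: d(x^2)/dt = A*exp(-Q/(R*T(t))). The constants below are
# placeholders, not the PRECIP-II fitted values.
import math

A = 1.0e-2   # cm^2/s, hypothetical pre-exponential factor
Q = 1.7e5    # J/mol, hypothetical activation energy
R = 8.314    # J/(mol K)

def oxide_thickness(temperature_history, dt):
    """Integrate x^2 over a transient T(t) given as a list of temperatures [K]."""
    x_squared = 0.0
    for T in temperature_history:
        x_squared += A * math.exp(-Q / (R * T)) * dt   # parabolic kinetics step
    return math.sqrt(x_squared)

# Example transient: heat-up to 1473 K, hold, then cool-down (LOCA-like shape).
history = [900 + 573 * min(t / 30.0, 1.0) for t in range(60)] \
        + [1473 - 20 * t for t in range(30)]
print(f"final oxide thickness ~ {oxide_thickness(history, 1.0):.4e} cm")
```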

  15. A Comparative Study of Load Balancing Algorithms in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Cloud computing is a new trend emerging in the IT environment, with huge requirements for infrastructure and resources. Load balancing is an important aspect of the cloud computing environment. An efficient load balancing scheme ensures efficient resource utilization by provisioning resources to cloud users on an on-demand basis in a pay-as-you-go manner. Load balancing may even support prioritizing users by applying appropriate scheduling criteria. This paper presents various load balancing schemes in differ...

  16. Computer investigations on the asymptotic behavior of the rate coefficient for the annihilation reaction A + A → product and the trapping reaction in three dimensions.

    Science.gov (United States)

    Litniewski, Marek; Gorecki, Jerzy

    2011-06-28

    We have performed intensive computer simulations of the irreversible annihilation reaction A + A → C + C and of the trapping reaction A + B → C + B for a variety of three-dimensional fluids composed of identical spherical particles. We have found a significant difference in the asymptotic behavior of the rate coefficients for these reactions. Both rate coefficients converge to the same value as time t goes to infinity, but the convergence rate is different: the O(t^(-1/2)) term for the annihilation reaction is higher than the corresponding term for the trapping reaction. The simulation results suggest that the ratio of the terms is a universal quantity with a value equal to 2 or slightly above. A model for the annihilation reaction based on the superposition approximation predicts the difference in the O(t^(-1/2)) terms, but overestimates the value for the annihilation reaction by about 30%. We have also performed simulations for the dimerization process A + A → E, where E stands for a dimer. The dimerization decreases the reaction rate due to the decrease in the diffusion constant for A. The effect is successfully predicted by a simple model.
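
    For orientation, the O(t^(-1/2)) correction discussed here has the same form as the one in the classical Smoluchowski expression for a diffusion-controlled rate coefficient, k(t) = 4πDRc(1 + Rc/√(πDt)). A minimal Python sketch with illustrative (not paper-specific) values of the relative diffusion constant D and capture radius Rc:

```python
# Minimal sketch of the classical Smoluchowski result: a diffusion-controlled
# rate coefficient approaches its long-time limit with an O(t^(-1/2)) term,
#   k(t) = 4*pi*D*Rc * (1 + Rc/sqrt(pi*D*t)).
# D and Rc below are illustrative values, not those of the simulated fluids.
import math

def smoluchowski_k(t, D=1.0e-5, Rc=0.5):
    """Time-dependent rate coefficient; D = relative diffusion constant, Rc = capture radius."""
    return 4.0 * math.pi * D * Rc * (1.0 + Rc / math.sqrt(math.pi * D * t))

for t in (1e2, 1e4, 1e6):
    print(f"t = {t:8.0f}  k(t) = {smoluchowski_k(t):.6e}")
```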

  17. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    Science.gov (United States)

    Ge, Yuanzheng; Chen, Bin; Liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    An individual-based computational environment provides an effective solution for studying complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce an epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes 4 fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experiment results indicate that the reconstructed virtual environment provides a solution to reproduce complex emergencies and evaluate policies to be executed in the real world.
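
    The core loop of such an individual-based simulator is easy to sketch. The toy Python model below (ours, far simpler than the framework described above) runs a stochastic SIR process over a random contact network standing in for class and dormitory networks; all parameters are invented for illustration:

```python
# Toy illustration only: a stochastic SIR process on a synthetic contact
# network, the basic per-timestep loop an individual-based campus simulator
# repeats. All parameters are invented for illustration.
import random

def simulate(contacts, beta=0.05, gamma=0.1, seed_case=0, days=60):
    state = {p: "S" for p in contacts}
    state[seed_case] = "I"
    peak = 0
    for _ in range(days):
        nxt = dict(state)
        for person, others in contacts.items():
            if state[person] == "S":
                n_inf = sum(1 for o in others if state[o] == "I")
                if n_inf and random.random() < 1 - (1 - beta) ** n_inf:
                    nxt[person] = "I"          # infection via contact network
            elif state[person] == "I" and random.random() < gamma:
                nxt[person] = "R"              # recovery/removal
        state = nxt
        peak = max(peak, sum(1 for s in state.values() if s == "I"))
    return peak

people = list(range(200))                      # 200 students, ~8 contacts each
net = {p: random.sample([q for q in people if q != p], 8) for p in people}
print("peak infectious:", simulate(net))
```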

  18. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    Full Text Available Abstract. The design of computer-based learning environments has undergone a paradigm shift; moving students away from instruction that was considered to promote technical rationality grounded in objectivism, to the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  19. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  20. CH(+) Destruction by Reaction with H: Computing Quantum Rates To Model Different Molecular Regions in the Interstellar Medium.

    Science.gov (United States)

    Bovino, S; Grassi, T; Gianturco, F A

    2015-12-17

    A detailed analysis of an ionic reaction that plays a crucial role in the carbon chemistry of the interstellar medium (ISM) is carried out by computing ab initio reactive cross sections with a quantum method and by further obtaining the corresponding CH(+) destruction rates over a range of temperatures; these rates show good overall agreement with existing experiments. The differences found between all existing calculations and the very-low-T experiments are discussed and explored via a simple numerical model that links these cross-section reductions to collinear approaches where nonadiabatic crossing is expected to dominate. The new rates are further linked to a complex chemical network that models the evolution of the CH(+) abundance in the photodissociation region (PDR) and molecular cloud (MC) environments of the ISM. The abundances of CH(+) are given by numerical solutions of a large set of coupled, first-order kinetics equations that employs our new chemical package krome. The analysis that we carry out reveals that the important region for CH(+) destruction is that above 100 K, hence showing that, at least for this reaction, the differences with the existing laboratory low-T experiments are of essentially no importance within the astrochemical environments discussed here because, at those temperatures, other chemical processes involving the title molecule are taking over. A detailed analysis of the chemical network involving CH(+) also shows that a slight decrease in the initial oxygen abundance might lead to higher CH(+) abundances, because the main chemical carbon ion destruction channel is reduced in efficiency. This might provide an alternative chemical route to understanding why general astrochemical models fail when the observed CH(+) abundances are matched with the outcomes of their calculations.
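
    Within such a kinetics network, the destruction rate enters as a simple loss term, d[CH+]/dt = -k(T)·n_H·[CH+]. The Python sketch below isolates that single channel, using a generic modified-Arrhenius placeholder for k(T) rather than the paper's computed rate:

```python
# Hedged sketch of how a destruction rate enters a kinetics network. The rate
# expression is a generic modified-Arrhenius placeholder, not the paper's rate.
import math

def k_destruction(T, alpha=7.5e-10, beta=0.0, gamma=0.0):
    """Modified-Arrhenius form k(T) = alpha*(T/300)^beta*exp(-gamma/T) [cm^3 s^-1]."""
    return alpha * (T / 300.0) ** beta * math.exp(-gamma / T)

def chplus_fraction(n_h, T, t):
    """Surviving CH+ fraction under the single loss channel CH+ + H (analytic decay)."""
    return math.exp(-k_destruction(T) * n_h * t)

# e.g. a PDR-like parcel: n_H = 1e2 cm^-3, T = 300 K, t = 1e7 s
print(f"CH+ survival fraction: {chplus_fraction(1e2, 300.0, 1e7):.3f}")
```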

  1. Weighted Local Active Pixel Pattern (WLAPP) for Face Recognition in Parallel Computation Environment

    Directory of Open Access Journals (Sweden)

    Gundavarapu Mallikarjuna Rao

    2013-10-01

    Full Text Available Abstract - The availability of multi-core technology has resulted in a totally new computational era. Researchers are keen to explore the potential available in state-of-the-art machines for breaking the barrier imposed by serial computation. Face recognition is a challenging application in any computational environment. The main difficulty of traditional face recognition algorithms is their lack of scalability. In this paper Weighted Local Active Pixel Pattern (WLAPP), a new scalable face recognition algorithm suitable for parallel environments, is proposed. Local Active Pixel Pattern (LAPP) is found to be simple and computationally inexpensive compared to Local Binary Patterns (LBP). WLAPP is developed based on the concept of LAPP. The experimentation is performed on the FG-Net Aging Database with deliberately introduced 20% distortion, and the results are encouraging. Keywords: Active pixels, Face Recognition, Local Binary Pattern (LBP), Local Active Pixel Pattern (LAPP), Pattern computing, parallel workers, template, weight computation.

  2. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    Science.gov (United States)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of MSL's flight data from entry showed that the capsule flew much as predicted. This paper describes how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided, followed by a brief description of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin of validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  3. Service ORiented Computing EnviRonment (SORCER) for Deterministic Global and Stochastic Optimization

    OpenAIRE

    Raghunath, Chaitra

    2015-01-01

    With rapid growth in the complexity of large scale engineering systems, the application of multidisciplinary analysis and design optimization (MDO) in the engineering design process has garnered much attention. MDO addresses the challenge of integrating several different disciplines into the design process. Primary challenges of MDO include computational expense and poor scalability. The introduction of a distributed, collaborative computational environment results in better...

  4. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  5. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  6. Maintaining Traceability in an Evolving Distributed Computing Environment

    Science.gov (United States)

    Collier, I.; Wartel, R.

    2015-12-01

    The management of risk is fundamental to the operation of any distributed computing infrastructure. Identifying the cause of incidents is essential to prevent them from re-occurring. In addition, it is a goal to contain the impact of an incident while keeping services operational. For the response to incidents to be acceptable, it needs to be commensurate with the scale of the problem. The minimum level of traceability for distributed computing infrastructure usage is to be able to identify the source of all actions (executables, file transfers, pilot jobs, portal jobs, etc.) and the individual who initiated them. In addition, sufficiently fine-grained controls, such as blocking the originating user and monitoring to detect abnormal behaviour, are necessary for keeping services operational. It is essential to be able to understand the cause and to fix any problems before re-enabling access for the user. The aim is to be able to answer the basic questions who, what, where, and when concerning any incident. This requires retaining all relevant information, including timestamps and the digital identity of the user, sufficient to identify, for each service instance and for every security event, at least the following: connect, authenticate, authorize (including identity changes) and disconnect. In traditional grid infrastructures (WLCG, EGI, OSG, etc.) best practices and procedures for gathering and maintaining the information required to maintain traceability are well established. In particular, sites collect and store the information required to ensure traceability of events at their sites. With the increased use of virtualisation and of private and public clouds for HEP workloads, established procedures, which are unable to see 'inside' running virtual machines, no longer capture all the information required. Maintaining traceability will at least involve a shift of responsibility from sites to Virtual Organisations (VOs), bringing with it new requirements for their

  7. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time-consuming and require large computational resources. We developed procedures for organizing massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousand cores).

  8. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    The continued modern-day demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. A cloud computing environment involves high-cost infrastructure on the one hand and the need for large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to the end users in the most efficient manner, so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  9. High performance computing environment for multidimensional image analysis.

    Science.gov (United States)

    Rao, A Ravishankar; Cecchi, Guillermo A; Magnasco, Marcelo

    2007-07-10

    The processing of images acquired through microscopy is a challenging task due to the large size of the datasets (several gigabytes) and the fast turnaround time required. If the throughput of the image processing stage is significantly increased, it can have a major impact on microscopy applications. We present a high performance computing (HPC) solution to this problem. This involves decomposing the spatial 3D image into segments that are assigned to unique processors and matched to the 3D torus architecture of the IBM Blue Gene/L machine. Communication between segments is restricted to the nearest neighbors. When running on a 2 GHz Intel CPU, the task of 3D median filtering on a typical 256-megabyte dataset takes two and a half hours, whereas by using 1024 nodes of Blue Gene, this task can be performed in 18.8 seconds, a 478x speedup. Our parallel solution dramatically improves the performance of image processing, feature extraction and 3D reconstruction tasks. This increased throughput permits biologists to conduct unprecedented large-scale experiments with massive datasets.
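
    The decomposition idea can be demonstrated without a Blue Gene: split the volume into slabs with a halo (ghost region) one filter radius wide, filter each slab independently, and stitch the interiors back together. Below is a minimal single-machine Python sketch, using scipy rather than the authors' code; because a median filter only looks one radius away, the overlap makes the slab results exact:

```python
# Sketch of domain decomposition for 3D median filtering: slabs along z with
# ghost planes one filter radius wide, filtered independently and stitched.
# This is our single-machine stand-in for the torus-mapped parallel version.
import numpy as np
from scipy.ndimage import median_filter

def median_filter_3d_decomposed(volume, size=3, n_slabs=4):
    halo = size // 2
    nz = volume.shape[0]
    edges = np.linspace(0, nz, n_slabs + 1, dtype=int)
    out = np.empty_like(volume)
    for lo, hi in zip(edges[:-1], edges[1:]):
        a, b = max(0, lo - halo), min(nz, hi + halo)      # slab plus ghost planes
        filtered = median_filter(volume[a:b], size=size)
        out[lo:hi] = filtered[lo - a : filtered.shape[0] - (b - hi)]
    return out

vol = np.random.rand(64, 64, 64).astype(np.float32)
# Interior results match the undecomposed filter exactly.
assert np.allclose(median_filter_3d_decomposed(vol), median_filter(vol, size=3))
```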

  10. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    International Nuclear Information System (INIS)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-01-01

    This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier-based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA-enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier-based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memorandum of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot-scale, entrained-flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three

  11. Mathematical Language Development and Talk Types in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Symons, Duncan; Pierce, Robyn

    2015-01-01

    In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…

  12. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  13. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  14. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  15. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research enabling accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop/s supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  16. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers’ Touch-Interface User Experiences

    OpenAIRE

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users’ shopping behavior. In this research, I examine the underlying mechanisms between input device environments and shoppers’ decision-making processes. In particular, I investigate the impact of input d...

  17. Anaerobic Microbial Degradation of Hydrocarbons: From Enzymatic Reactions to the Environment.

    Science.gov (United States)

    Rabus, Ralf; Boll, Matthias; Heider, Johann; Meckenstock, Rainer U; Buckel, Wolfgang; Einsle, Oliver; Ermler, Ulrich; Golding, Bernard T; Gunsalus, Robert P; Kroneck, Peter M H; Krüger, Martin; Lueders, Tillmann; Martins, Berta M; Musat, Florin; Richnow, Hans H; Schink, Bernhard; Seifert, Jana; Szaleniec, Maciej; Treude, Tina; Ullmann, G Matthias; Vogt, Carsten; von Bergen, Martin; Wilkes, Heinz

    2016-01-01

    Hydrocarbons are abundant in anoxic environments and pose biochemical challenges to their anaerobic degradation by microorganisms. Within the framework of the Priority Program 1319, investigations funded by the Deutsche Forschungsgemeinschaft on the anaerobic microbial degradation of hydrocarbons ranged from isolation and enrichment of hitherto unknown hydrocarbon-degrading anaerobic microorganisms, discovery of novel reactions, detailed studies of enzyme mechanisms and structures to process-oriented in situ studies. Selected highlights from this program are collected in this synopsis, with more detailed information provided by theme-focused reviews of the special topic issue on 'Anaerobic biodegradation of hydrocarbons' [this issue, pp. 1-244]. The interdisciplinary character of the program, involving microbiologists, biochemists, organic chemists and environmental scientists, is best exemplified by the studies on alkyl-/arylalkylsuccinate synthases. Here, research topics ranged from in-depth mechanistic studies of archetypical toluene-activating benzylsuccinate synthase, substrate-specific phylogenetic clustering of alkyl-/arylalkylsuccinate synthases (toluene plus xylenes, p-cymene, p-cresol, 2-methylnaphthalene, n-alkanes), stereochemical and co-metabolic insights into n-alkane-activating (methylalkyl)succinate synthases to the discovery of bacterial groups previously unknown to possess alkyl-/arylalkylsuccinate synthases by means of functional gene markers and in situ field studies enabled by state-of-the-art stable isotope probing and fractionation approaches. Other topics are Mo-cofactor-dependent dehydrogenases performing O2-independent hydroxylation of hydrocarbons and alkyl side chains (ethylbenzene, p-cymene, cholesterol, n-hexadecane), degradation of p-alkylated benzoates and toluenes, glycyl radical-bearing 4-hydroxyphenylacetate decarboxylase, novel types of carboxylation reactions (for acetophenone, acetone, and potentially also benzene and

  18. Linear reaction norm models for genetic merit prediction of Angus cattle under genotype by environment interaction.

    Science.gov (United States)

    Cardoso, F F; Tempelman, R J

    2012-07-01

    The objectives of this work were to assess alternative linear reaction norm (RN) models for the genetic evaluation of Angus cattle in Brazil. That is, we investigated the interaction between genotypes and continuous descriptors of the environmental variation to examine evidence of genotype by environment interaction (G×E) in post-weaning BW gain (PWG) and to compare the environmental sensitivity of national and imported Angus sires. Data were collected by the Brazilian Angus Improvement Program from 1974 to 2005 and consisted of 63,098 records and a pedigree file with 95,896 animals. Six models were implemented using Bayesian inference and compared using the Deviance Information Criterion (DIC). The simplest model was M(1), a traditional animal model, which showed the largest DIC and hence the poorest fit when compared with the 4 alternative RN specifications accounting for G×E. In M(2), a 2-step procedure was implemented using the contemporary group posterior means of M(1) as the environmental gradient, ranging from -92.6 to +265.5 kg. Moreover, the benefits of jointly estimating all parameters in a 1-step approach were demonstrated by M(3). Additionally, we extended M(3) to allow for residual heteroskedasticity using an exponential function (M(4)) and the best-fitting (smallest DIC) environmental classification model (M(5)) specification. Finally, M(6) added just heteroskedastic residual variance to M(1). Heritabilities were lower in harsh environments and increased with the improvement of production conditions for all RN models. Rank correlations among genetic merit predictions obtained by M(1) and by the best-fitting RN models M(3) (homoskedastic) and M(5) (heteroskedastic) at different environmental levels ranged from 0.79 to 0.81, suggesting the biological importance of G×E in Brazilian Angus PWG. These results suggest that selection progress could be optimized by adopting environment-specific genetic merit predictions. The PWG environmental sensitivity of
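
    A linear reaction norm is just an intercept and a slope per sire, u(E) = a + b·E, which is why rankings can flip along the environmental gradient. A tiny numeric Python illustration with invented sires (not the Brazilian Angus estimates):

```python
# Minimal numeric illustration (invented sires, not the paper's estimates) of
# a linear reaction norm u(E) = a + b*E: rankings can flip between harsh and
# favourable environments, the signature of G x E.
sires = {"sire_A": (10.0, 0.00),    # robust: flat slope
         "sire_B": (6.0, 0.04),     # environmentally sensitive: steep slope
         "sire_C": (8.0, 0.02)}

def merit(sire, env):
    a, b = sires[sire]
    return a + b * env              # linear reaction norm

for env in (-90.0, 0.0, 260.0):     # roughly the gradient span reported (kg)
    ranked = sorted(sires, key=lambda s: merit(s, env), reverse=True)
    print(f"E = {env:6.1f} kg: " + " > ".join(ranked))
```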

  19. Image selection as a service for cloud computing environments

    KAUST Repository

    Filepp, Robert

    2010-12-01

    Customers of Cloud Services are expected to choose specific machine images to instantiate in order to host their workloads. Unfortunately very little information is provided to the users to enable them to make intelligent choices. We believe that as the number of images proliferates it will become increasingly difficult for users to decide effectively. Cloud service providers often allow their customers to instantiate standard system images, to modify their instances, and to store images of these customized instances for public or private future use. Storing modified instances as images enables customers to avoid re-provisioning and re-configuration of required resources, thereby reducing their future costs. However, Cloud service providers generally do not expose details regarding the configurations of the images in a rigorous canonical fashion, nor offer services that assist clients in the best target image selection to support client transformation objectives. Rather, they allow customers to enter a free-form description of an image based on the client's best effort. This means that in order to find a "best fit" image to instantiate, a human user must review potentially thousands of image descriptions, reading each description to evaluate its suitability as a platform to host their source application. Furthermore, the actual content of the selected image may differ greatly from its description. Finally, even images that have been customized and retained for future use may need additional provisioning and customization to accommodate specific needs. In this paper we propose a service that accumulates image configuration details in a canonical fashion and a further service that employs an algorithm to rank images by best fit/least cost in conformance with user-specified policies. These services collectively facilitate workload transformation into enterprise cloud environments.

  20. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PC's and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optics communications network and engineering PC's connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  1. Urbancontext: A Management Model For Pervasive Environments In User-Oriented Urban Computing

    Directory of Open Access Journals (Sweden)

    Claudia L. Zuniga-Canon

    2014-01-01

    Full Text Available Nowadays, urban computing has gained a lot of interest for guiding the evolution of cities into intelligent environments. These environments are appropriate for individuals' interactions and the changes in their behaviors. These changes require new approaches that allow an understanding of how urban computing systems should be modeled. In this work we present UrbanContext, a new model for the design of urban computing platforms that applies the theory of roles to manage the individual's context in urban environments. The theory of roles helps to understand the individual's behavior within a social environment, allowing the modeling of urban computing systems able to adapt to individuals' states and their needs. UrbanContext collects data in urban atmospheres and classifies individuals' behaviors according to their changes of role, to optimize social interaction and offer secure services. Likewise, UrbanContext serves as a generic model to provide interoperability, and to facilitate the design, implementation and expansion of urban computing systems.

  2. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to the poor standardization of patients' medical data and a lack of computable medical drug knowledge, the specificity of computerized decision support systems for early ADR detection is too low, and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summaries of Product Characteristics (SmPCs) and to link it with structured patient data to generate safety signals automatically, with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC) and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients), who underwent intensive ADR surveillance. The specificity increased from 7% without the ADR-KB up to 73% in internal patients and from 19.6% up to 91% in paediatric inpatients, respectively. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
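
    The core linkage is straightforward to sketch: a signal fires when a patient receives a drug whose known ADR matches an out-of-range laboratory result. In the toy Python illustration below, the drug-lab links, ADR concepts and reference ranges are our invention, not content of the actual ADR-KB:

```python
# Toy sketch of the linkage an ADR knowledge base performs. The entries below
# are invented illustrations, not the real WHO-ART/ATC/LOINC content.
ADR_KB = {
    # (ATC drug code, LOINC lab code) -> (ADR concept, direction of abnormality)
    ("J01XE01", "1742-6"): ("hepatocellular damage", "high"),   # hypothetical link
    ("N02BE01", "718-7"):  ("anaemia", "low"),                  # hypothetical link
}

def detect_signals(medications, lab_results, reference_ranges):
    signals = []
    for atc in medications:
        for loinc, value in lab_results.items():
            lo, hi = reference_ranges[loinc]
            direction = "high" if value > hi else "low" if value < lo else None
            entry = ADR_KB.get((atc, loinc))
            if entry and direction == entry[1]:
                signals.append((atc, loinc, entry[0]))          # candidate ADR signal
    return signals

print(detect_signals(["J01XE01"], {"1742-6": 310.0}, {"1742-6": (10.0, 50.0)}))
```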

  3. Nuclides.net: An integrated environment for computations on radionuclides and their radiation

    International Nuclear Information System (INIS)

    Galy, J.; Magill, J.

    2002-01-01

    Full text: The Nuclides.net computational package is of direct interest in the fields of environmental monitoring and nuclear forensics. The 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclide chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The main emphasis in Nuclides.net is on nuclear science applications, such as health physics, radioprotection and radiochemistry, rather than on nuclear data, for which excellent sources already exist. In contrast to the CD-based Nuclides 2000 predecessor, Nuclides.net applications run over the internet on a web server. The user interface to these applications is via a web browser. Information submitted by the user is sent to the appropriate applications resident on the web server. The results of the calculations are returned to the user, again via the browser. The product is aimed at both students and professionals, providing reference data on radionuclides and computations based on these data using the latest internet technology. It is particularly suitable for educational purposes in the nuclear industry, health physics and radiation protection, nuclear and radiochemistry, nuclear physics, astrophysics, etc. The Nuclides.net software suite contains the following modules/features: a) A new user interface to view the nuclide charts (with zoom features); additional nuclide charts are based on spin, parity, binding energy, etc. b) There are five main applications: (1) a 'Decay Engine' for decay calculations of numbers, masses, activities, dose rates, etc. of parents and daughters. (2) A 'Dosimetry and Shielding' module that allows the calculation of dose rates from both unshielded and shielded point sources; a choice of 10 shield materials is available. (3) 'Virtual Nuclides', which allows the user to do decay and dosimetry and shielding calculations on mixtures of
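
    The core of any such 'Decay Engine' is the Bateman solution for decay chains. A minimal Python sketch for a two-member chain follows; the half-lives are chosen only for illustration, not taken from Nuclides.net:

```python
# Sketch of what a decay engine computes at its core: parent and daughter
# activities along a two-member chain via the Bateman solution. Half-lives
# below are illustrative, not data from Nuclides.net.
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Numbers of parent and daughter nuclei at time t (daughter initially absent)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

lam1 = math.log(2) / 8.0        # parent half-life 8 days (illustrative)
lam2 = math.log(2) / 0.5        # daughter half-life 0.5 days (illustrative)
for t in (1.0, 8.0, 30.0):      # days
    n1, n2 = bateman_two_member(1.0e20, lam1, lam2, t)
    print(f"t = {t:4.1f} d  parent activity = {lam1*n1:.3e}  daughter activity = {lam2*n2:.3e}")
```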

  4. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

    Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the varieties of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  5. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, an MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified with lightweight computing and low data transmission. The scheme addresses the limited communication and computing power of mobile devices, supports dynamic data operations in mobile multicloud environments, and allows data integrity to be verified without using the source file blocks directly. Experimental results also demonstrate that this scheme achieves a lower cost of computing and communications.
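
    The Merkle hash tree underlying the sMHT construction is compact to sketch. The plain binary Merkle root below (our minimal Python version, without the paper's sequence-enforcing extensions) is what lets a verifier check a block against a short authentication path instead of the whole file:

```python
# Minimal plain binary Merkle tree in the spirit of the sMHT building block
# (our sketch; it omits the paper's sequence-enforcing extensions).
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    level = [h(b) for b in blocks]                 # leaf hashes of the file blocks
    while len(level) > 1:
        if len(level) % 2:                         # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [f"file block {i}".encode() for i in range(8)]
print(merkle_root(blocks).hex())
```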

  6. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds as well as their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and in intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  7. Applications of computer simulation, nuclear reactions and elastic scattering to surface analysis of materials

    Directory of Open Access Journals (Sweden)

    Pacheco de Carvalho, J. A.

    2008-08-01

    Full Text Available This article involves computer simulation and surface analysis by nuclear techniques, which are non-destructive. Both the “energy method of analysis” for nuclear reactions and elastic scattering are used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. The method is successfully applied to thick flat targets of graphite, quartz and sapphire and to targets containing thin films of aluminium oxide. Depth profiles of 12C and 16O nuclei are determined using (d,p) and (d,α) deuteron-induced reactions. Rutherford and resonance elastic scattering of 4He+ ions are also used.


  8. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, was developed to calculate fusion reaction rates in terms of the specific reactivity ⟨σv⟩, which is the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes, to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions, which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant in plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide interpolated values of cross sections
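
    For a Maxwellian plasma, the quantity being evaluated reduces to a one-dimensional integral over the pointwise cross sections: ⟨σv⟩ = (8/(πμ))^(1/2) (kT)^(-3/2) ∫ σ(E) E exp(-E/kT) dE, with μ the reduced mass and E the center-of-mass energy. A Python sketch using simple trapezoidal quadrature follows; the tabulated σ(E) here is synthetic, not evaluated nuclear data:

```python
# Sketch of the Maxwellian-averaged reactivity from pointwise cross sections:
#   <sigma v> = sqrt(8/(pi*mu)) * (kT)^(-3/2) * Int sigma(E)*E*exp(-E/kT) dE.
# The tabulated sigma(E) below is a synthetic bump, not evaluated nuclear data.
import numpy as np

def reactivity(E, sigma, kT, mu):
    """<sigma v> in cm^3/s; E and kT in erg, mu in g, sigma in cm^2."""
    integrand = sigma * E * np.exp(-E / kT)
    return np.sqrt(8.0 / (np.pi * mu)) * kT**-1.5 * np.trapz(integrand, E)

keV = 1.602e-9                          # erg per keV
E = np.linspace(1.0, 500.0, 2000) * keV
sigma = 1e-24 * np.exp(-((E / keV - 120.0) / 60.0) ** 2)   # synthetic resonance bump
mu = 2.0 * 1.66e-24                     # reduced mass ~2 amu (illustrative)
print(f"<sigma v> at kT = 10 keV: {reactivity(E, sigma, 10.0 * keV, mu):.3e} cm^3/s")
```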

  9. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    International Nuclear Information System (INIS)

    Rampino, S.; Monari, A.; Rossi, E.; Evangelisti, S.; Laganà, A.

    2012-01-01

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS, assembled on the European Grid, allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: - The grid-based GEMS simulator accurately models small chemical systems. - Q5Cost and D5Cost file formats provide interoperability in the workflow. - Benchmark runs on H + H2 highlight the Grid empowering. - Calculated k(T)'s for O + O2 and N + N2 fall within the error bars of the experiment. - Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid-based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H2, N + N2 and O + O2 gas-phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.

  10. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Benny Manuel [Los Alamos National Laboratory]; Ballance, Robert [SNL]; Haskell, Karen [SNL]

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  11. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allows, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  12. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

    In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations which we have made regarding the implementation of such visualization environments.

  13. Acorn: A grid computing system for constraint based modeling and visualization of the genome scale metabolic reaction networks via a web interface

    Directory of Open Access Journals (Sweden)

    Bushell Michael E

    2011-05-01

    Background Constraint-based approaches facilitate the prediction of cellular metabolic capabilities, based, in turn, on predictions of the repertoire of enzymes encoded in the genome. Recently, genome annotations have been used to reconstruct genome scale metabolic reaction networks for numerous species, including Homo sapiens, which allow simulations that provide valuable insights into topics, including predictions of gene essentiality of pathogens, interpretation of genetic polymorphism in metabolic disease syndromes and suggestions for novel approaches to microbial metabolic engineering. These constraint-based simulations are being integrated with the functional genomics portals, an activity that requires efficient implementation of the constraint-based simulations in the web-based environment. Results Here, we present Acorn, an open source (GNU GPL) grid computing system for constraint-based simulations of genome scale metabolic reaction networks within an interactive web environment. The grid-based architecture allows efficient execution of computationally intensive, iterative protocols such as Flux Variability Analysis, which can be readily scaled up as the numbers of models (and users) increase. The web interface uses AJAX, which facilitates efficient model browsing and other search functions, and intuitive implementation of appropriate simulation conditions. Research groups can install Acorn locally and create user accounts. Users can also import models in the familiar SBML format and link reaction formulas to major functional genomics portals of choice. Selected models and simulation results can be shared between different users and made publicly available. Users can construct pathway map layouts and import them into the server using a desktop editor integrated within the system. Pathway maps are then used to visualise numerical results within the web environment. To illustrate these features we have deployed Acorn and created a
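
    Acorn itself is a grid-based web system, but the protocol it scales up, Flux Variability Analysis, is easy to state: one pair of linear programs per reaction. The Python sketch below (assuming SciPy; the stoichiometric matrix and bounds stand in for a real SBML model) shows why the workload parallelizes naturally across grid workers.

```python
import numpy as np
from scipy.optimize import linprog

def flux_variability(S, lb, ub):
    """For each reaction j: minimize and maximize flux v_j subject to
    steady state (S @ v = 0) and bounds lb <= v <= ub. Each reaction is
    an independent pair of linear programs, which is what lets a grid
    system farm the loop out to many workers."""
    m, n = S.shape
    bounds = list(zip(lb, ub))
    b_eq = np.zeros(m)
    ranges = []
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        lo = linprog(c, A_eq=S, b_eq=b_eq, bounds=bounds)    # min v_j
        hi = linprog(-c, A_eq=S, b_eq=b_eq, bounds=bounds)   # max v_j
        ranges.append((lo.fun, -hi.fun))                     # assumes feasibility
    return ranges
```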

  14. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Cloud computing is emerging as a high performance computing environment with a large scale, heterogeneous collection of autonomous systems and flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system. The key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that the scheduling strategies of the schedulers are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which is gaining a lot of attention recently because workflows have emerged as a paradigm to represent complex computing problems. We propose a novel algorithm extending the nature-based Intelligent Water Drops (IWD) algorithm that optimizes the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within the workflows simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We made a comparison between the proposed IWD-based algorithm and other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO and C-PSO, where the proposed algorithm presented noticeable enhancements in performance and cost in most situations.

  15. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL]; Endeve, Eirik [ORNL]; Ovchinnikov, Oleg S [ORNL]; Borreguero Calvo, Jose M [ORNL]; Park, Byung H [ORNL]; Archibald, Richard K [ORNL]; Symons, Christopher T [ORNL]; Kalinin, Sergei V [ORNL]; Messer, Bronson [ORNL]; Shankar, Mallikarjun [ORNL]; Jesse, Stephen [ORNL]

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small sub-set of user cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  16. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    Science.gov (United States)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, by using open chain glucose (O-Glu)/closed chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu has been found to be more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  17. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505
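
    Among the six heuristics, Min-min illustrates the general pattern of mapping tasks to VMs from an expected-time-to-compute matrix. The sketch below is a minimal Python rendering under an assumed data layout (etc[t][m] = expected execution time of task t on VM m), not the authors' implementation.

```python
def min_min(etc, n_vms):
    """Min-min heuristic: repeatedly pick the unscheduled task whose best
    (earliest) completion time is smallest, and bind it to that VM."""
    ready = [0.0] * n_vms            # time at which each VM becomes free
    unscheduled = set(range(len(etc)))
    schedule = {}
    while unscheduled:
        # best VM (earliest completion) for every remaining task
        best = {t: min(range(n_vms), key=lambda m: ready[m] + etc[t][m])
                for t in unscheduled}
        # Min-min rule: schedule the task with the smallest best completion time
        t = min(unscheduled, key=lambda u: ready[best[u]] + etc[u][best[u]])
        m = best[t]
        ready[m] += etc[t][m]
        schedule[t] = m
        unscheduled.remove(t)
    return schedule, max(ready)      # assignments and resulting makespan
```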

  18. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.

  19. Dynamical structure analysis of crystalline-state reaction and elucidation of chemical reactivity in crystalline environment

    International Nuclear Information System (INIS)

    Ohashi, Yuji

    2010-01-01

    It was found that a chiral alkyl group bonded to the cobalt atom in a cobalt complex crystal was racemized with retention of the single crystal form on exposure to visible light. Such reactions, which are called crystalline-state reactions, have been found in a variety of cobalt complex crystals. The concept of a reaction cavity was introduced to explain the reaction rate quantitatively and the chirality of the photo-product. New diffractometers and detectors were built for rapid data collection. The reaction mechanism was also elucidated using neutron diffraction analysis. The unstable reaction intermediates were analyzed using a cryo-trapping method. The excited-state structures were obtained at the equilibrium state between the ground and excited states. (author)

  20. Aqueous complexation, precipitation, and adsorption reactions of cadmium in the geologic environment

    International Nuclear Information System (INIS)

    Zachara, J.M.; Rai, D.; Felmy, A.R.; Cowan, C.E.; Smith, S.C.; Moore, D.A.; Resch, C.T.

    1992-06-01

    This report contains new laboratory data and equilibrium constants for important solubility and adsorption reactions of Cd that occur in soil and groundwater and attenuate Cd migration. In addition, extensive interaction experiments with Cd and soils from electric utility sites are described. These experiments show the importance of precipitation and adsorption reactions in soil and demonstrate how such reactions can be modeled to predict Cd attenuation near utility sites

  1. The Use of Computer Simulation to Compare Student performance in Traditional versus Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Retta Guy

    2015-06-01

    Simulations have been shown to be an effective tool in traditional learning environments; however, as distance learning grows in popularity, the need to examine simulation effectiveness in this environment has become paramount. A causal-comparative design was chosen for this study to determine whether students using a computer-based instructional simulation in hybrid and fully online environments learned better than traditional classroom learners. The study spans a period of 6 years beginning fall 2008 through spring 2014. The population studied was 281 undergraduate business students self-enrolled in a 200-level microcomputer application course. The overall results support previous studies in that computer simulations are most effective when used as a supplement to face-to-face lectures and in hybrid environments.

  2. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Recently, the work environments of organizations have been transitioning into smart work environments by applying cloud computing technology to the existing work environment. The smart work environment is characterized by the ability to access information assets inside the company from outside it through cloud computing technology, to share information without restriction on location by using mobile terminals, and to provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, security risks are changing: the risk of leakage of an organization's information assets increases through mobile terminals, which are prone to loss and theft, as does the risk of hacking of wireless networks in mobile environments. Given these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of a security incident, appears to have limits, raising the necessity of proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. Firstly, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness and analyze a total of 50 components of digital forensic readiness. In addition, through the analysis of the corresponding preceding research, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function. Then, we design a draft of the digital forensic readiness model in the cloud

  3. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

    Optimizing energy systems is a problem that has been studied extensively by scientists for many years. It can be approached from different viewpoints and with different computer programs. This work characterizes one of the calculation methods used in Europe for modelling and optimizing power systems, a method based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  4. ReputationPro: The Efficient Approaches to Contextual Transaction Trust Computation in E-Commerce Environments

    OpenAIRE

    Zhang, Haibin; Wang, Yan; Zhang, Xiuzhen; Lim, Ee-Peng

    2013-01-01

    In e-commerce environments, the trustworthiness of a seller is utterly important to potential buyers, especially when the seller is unknown to them. Most existing trust evaluation models compute a single value to reflect the general trust level of a seller without taking any transaction context information into account. In this paper, we first present a trust vector consisting of three values for Contextual Transaction Trust (CTT). In the computation of three CTT values, the identified three ...

  5. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  6. Nuclides.net: A computational environment for nuclear data and applications in radioprotection and radioecology

    International Nuclear Information System (INIS)

    Berthou, V.; Galy, J.; Leutzenkirchen, K.

    2004-01-01

    An interactive multimedia tool, Nuclides.net, has been developed at the Institute for Transuranium Elements. The Nuclides.net 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclides chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The product is particularly suitable for environmental radioprotection and radioecology. (authors)
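
    The decay-calculation modules of such a tool typically evaluate Bateman-type solutions for decay chains. A minimal sketch of that calculation, assuming a linear chain with distinct decay constants, is:

```python
import numpy as np

def chain_activities(N0, lambdas, t):
    """Bateman solution for a linear decay chain A1 -> A2 -> ... with N0
    initial parent atoms and distinct decay constants `lambdas` (1/s).
    Returns the activity lambda_i * N_i(t) of each chain member."""
    n = len(lambdas)
    N = np.zeros(n)
    for i in range(n):
        terms = 0.0
        for j in range(i + 1):
            denom = np.prod([lambdas[k] - lambdas[j]
                             for k in range(i + 1) if k != j])
            terms += np.exp(-lambdas[j] * t) / denom
        N[i] = N0 * np.prod(lambdas[:i]) * terms
    return np.asarray(lambdas) * N

# e.g. Sr-90 -> Y-90 after one year (decay constants in 1/s):
# chain_activities(1e20, [7.63e-10, 3.00e-6], 3.15e7)
```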

  7. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence tasks have to be assigned to the most appropriate VMs at the initial placement itself. Practically, arriving jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM’s mul...
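
    For context, the weighted round-robin baseline that such work builds on can be sketched in a few lines; the integer weights (e.g. proportional to VM capacity) and the data layout are illustrative assumptions, not the paper's improved algorithm.

```python
from itertools import cycle

def weighted_round_robin(tasks, vm_weights):
    """Baseline weighted round robin: each VM appears in the rotation in
    proportion to its (integer) weight, so heavier VMs receive
    proportionally more tasks."""
    rotation = [vm for vm, w in vm_weights.items() for _ in range(w)]
    return {task: vm for task, vm in zip(tasks, cycle(rotation))}

# e.g. weighted_round_robin(range(10), {"vm1": 3, "vm2": 1})
```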

  8. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource, indeed it guides all business decisions in most of the computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially true in cloud computing environments, where data owners cannot control fundamental data aspects, like the physical storage of data and the control of its accesses. Blockchain has recently emerged as a fascinati...
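
    The integrity mechanism itself is simple to illustrate: each stored record commits to the hash of its predecessor, so tampering with any earlier record invalidates every later hash. The toy sketch below shows the idea and is not the authors' system.

```python
import hashlib
import json

def chain_records(records):
    """Toy hash chain: every block commits to the previous block's hash,
    so altering any stored record invalidates all later hashes and the
    tampering becomes detectable on verification."""
    prev = "0" * 64
    chain = []
    for rec in records:
        block = {"prev": prev, "data": rec}
        prev = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        chain.append({**block, "hash": prev})
    return chain
```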

  9. Towards the Automatic Detection of Efficient Computing Assets in a Heterogeneous Cloud Environment

    OpenAIRE

    Iglesias, Jesus Omana; Stokes, Nicola; Ventresque, Anthony; Murphy, Liam, B.E.; Thorburn, James

    2013-01-01

    In a heterogeneous cloud environment, the manual grading of computing assets is the first step in the process of configuring IT infrastructures to ensure optimal utilization of resources. Grading the efficiency of computing assets is, however, a difficult, subjective and time-consuming manual task. Thus, an automatic efficiency grading algorithm is highly desirable. In this paper, we compare the effectiveness of the different criteria used in the manual gr...

  10. Evaluation of the acute adverse reaction of contrast medium with high and moderate iodine concentration in patients undergoing computed tomography

    International Nuclear Information System (INIS)

    Nagamoto, Masashi; Gomi, Tatsuya; Terada, Hitoshi; Terada, Shigehiko; Kohda, Eiichi

    2006-01-01

    The aim of this prospective study was to evaluate and compare acute adverse reactions between contrast medium containing moderate and high concentrations of iodine in patients undergoing computed tomography (CT). A total of 945 patients undergoing enhanced CT were randomly assigned to receive one of two doses of contrast medium. We then prospectively investigated the incidence of adverse reactions. Iopamidol was used as the contrast medium, with a high concentration of 370 mgI/ml and a moderate concentration of 300 mgI/ml. The frequency of adverse reactions, such as pain at the injection site and heat sensation, was determined. Acute adverse reactions were observed in 2.4% (11/458) of the moderate-concentration group compared to 3.11% (15/482) of the high-concentration group; there was no significant difference in incidence between the two groups. Most adverse reactions were mild, and there was no significant difference in severity. One patient in the high-concentration group was seen to have a moderate adverse reaction. No correlation existed between the incidence of adverse reactions and patient characteristics such as sex, age, weight, flow amount, and flow rate. The incidence of pain was not significantly different between the two groups. In contrast, the incidence of heat sensation was significantly higher in the high-concentration group. The incidence and severity of acute adverse reactions were not significantly different between the two groups, and there were no severe adverse reactions in either group. (author)
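
    The headline comparison (11/458 vs. 15/482 reactions) can be re-checked with a standard contingency-table test. The snippet below assumes SciPy; the resulting p-value comes out well above 0.05, consistent with the reported lack of a significant difference.

```python
from scipy.stats import fisher_exact

# 2x2 table: [reaction, no reaction] for moderate vs. high iodine concentration
table = [[11, 458 - 11],
         [15, 482 - 15]]
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)   # p well above 0.05: no significant difference
```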

  11. Development and implementation of a critical pathway for prevention of adverse reactions to contrast media for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Keun Jo [Presbyterian Medical Center, Seoul (Korea, Republic of); Kweon, Dae Cheol; Kim, Myeong Goo [Seoul National University Hospital, Seoul (Korea, Republic of); Yoo, Beong Gyu [Wonkwang Health Science College, Iksan (Korea, Republic of)

    2007-03-15

    The purpose of this study is to develop a critical pathway (CP) for the prevention of adverse reactions to contrast media for computed tomography. The CP was developed and implemented by a multidisciplinary group at Seoul National University Hospital. The CP was applied to CT patients. Patients who underwent CT scanning were included in the CP group from March 2004. The satisfaction of the patients with the CP was compared with that of non-CP groups. We also investigated the degree of satisfaction among the radiological technologists and nurses. The degree of patient satisfaction with the care process increased in terms of patient information (24%), prevention of adverse reactions to contrast media (19%), pre-cognitive effect of adverse reactions to contrast media (39%) and information degree of adverse reactions to contrast media (19%). This CP program can be used as one of the patient care tools for reducing the adverse reactions to contrast media and increasing the efficiency of the care process in CT examination settings.

  12. Development and implementation of a critical pathway for prevention of adverse reactions to contrast media for computed tomography

    International Nuclear Information System (INIS)

    Jang, Keun Jo; Kweon, Dae Cheol; Kim, Myeong Goo; Yoo, Beong Gyu

    2007-01-01

    The purpose of this study is to develop a critical pathway (CP) for the prevention of adverse reactions to contrast media for computed tomography. The CP was developed and implemented by a multidisciplinary group at Seoul National University Hospital. The CP was applied to CT patients. Patients who underwent CT scanning were included in the CP group from March 2004. The satisfaction of the patients with the CP was compared with that of non-CP groups. We also investigated the degree of satisfaction among the radiological technologists and nurses. The degree of patient satisfaction with the care process increased in terms of patient information (24%), prevention of adverse reactions to contrast media (19%), pre-cognitive effect of adverse reactions to contrast media (39%) and information degree of adverse reactions to contrast media (19%). This CP program can be used as one of the patient care tools for reducing the adverse reactions to contrast media and increasing the efficiency of the care process in CT examination settings.

  13. Computational investigation of kinetics of cross-linking reactions in proteins: importance in structure prediction.

    Science.gov (United States)

    Bandyopadhyay, Pradipta; Kuntz, Irwin D

    2009-01-01

    The determination of protein structure using distance constraints is a new and promising field of study. One implementation involves attaching residues of a protein using a cross-linking agent, followed by protease digestion, analysis of the resulting peptides by mass spectroscopy, and finally sequence threading to detect the protein folds. In the present work, we carry out computational modeling of the kinetics of cross-linking reactions in proteins using the master equation approach. The rate constants of the cross-linking reactions are estimated using the pKas and the solvent-accessible surface areas of the residues involved. This model is tested with fibroblast growth factor (FGF) and cytochrome C. It is consistent with the initial experimental rate data for individual lysine residues for cytochrome C. Our model captures all observed cross-links for FGF and almost 90% of the observed cross-links for cytochrome C, although it also predicts cross-links that were not observed experimentally (false positives). However, the analysis of the false positive results is complicated by the fact that experimental detection of cross-links can be difficult and may depend on specific experimental conditions such as pH and ionic strength. Receiver operating characteristic plots showed that our model does a good job in predicting the observed cross-links. Molecular dynamics simulations showed that for cytochrome C, in general, the two lysines come closer for the observed cross-links as compared to the false positive ones. For FGF, no such clear pattern exists. The kinetic model and MD simulation can be used to study proposed cross-linking protocols.
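
    The rate model can be paraphrased compactly: a lysine's cross-linking rate scales with its deprotonated fraction (from its pKa via Henderson-Hasselbalch) and its solvent accessibility. In the sketch below, the base rate k0 and the linear SASA scaling are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

def crosslink_extent(pKa, sasa_frac, t, pH=8.0, k0=0.1):
    """Fraction of a residue cross-linked after time t under first-order
    kinetics. The rate is scaled by the deprotonated fraction from
    Henderson-Hasselbalch and by fractional solvent accessibility."""
    frac_deprotonated = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    k = k0 * frac_deprotonated * sasa_frac     # rate constant (assumed units)
    return 1.0 - np.exp(-k * np.asarray(t, dtype=float))
```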

  14. Self-propagating exothermic reaction analysis in Ti/Al reactive films using experiments and computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Seema, E-mail: seema.sen@tu-ilmenau.de [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany); Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Lake, Markus; Kroppen, Norman; Farber, Peter; Wilden, Johannes [Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Schaaf, Peter [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany)

    2017-02-28

    Highlights: • Development of nanoscale Ti/Al multilayer films with 1:1, 1:2 and 1:3 molar ratios. • Characterization of exothermic reaction propagation by experiments and simulation. • The reaction velocity depends on the ignition potentials and molar ratios of the films. • Only 1Ti/3Al films exhibit the unsteady reaction propagation with ripple formation. • CFD simulation shows the time dependent atom mixing and temperature flow during exothermic reaction. - Abstract: This study describes the self-propagating exothermic reaction in Ti/Al reactive multilayer foils by using experiments and computational fluid dynamics simulation. The Ti/Al foils with different molar ratios of 1Ti/1Al, 1Ti/2Al and 1Ti/3Al were fabricated by magnetron sputtering method. Microstructural characteristics of the unreacted and reacted foils were analyzed by using electronic and atomic force microscopes. After an electrical ignition, the influence of ignition potentials on reaction propagation has been experimentally investigated. The reaction front propagates with a velocity of minimum 0.68 ± 0.4 m/s and maximum 2.57 ± 0.6 m/s depending on the input ignition potentials and the chemical compositions. Here, the 1Ti/3Al reactive foil exhibits both steady state and unsteady wavelike reaction propagation. Moreover, the numerical computational fluid dynamics (CFD) simulation shows the time dependent temperature flow and atomic mixing in a nanoscale reaction zone. The CFD simulation also indicates the potentiality for simulating exothermic reaction in the nanoscale Ti/Al foil.

  15. Spectroscopic and computational studies of ionic clusters as models of solvation and atmospheric reactions

    Science.gov (United States)

    Kuwata, Keith T.

    Ionic clusters are useful as model systems for the study of fundamental processes in solution and in the atmosphere. Their structure and reactivity can be studied in detail using vibrational predissociation spectroscopy, in conjunction with high level ab initio calculations. This thesis presents the applications of infrared spectroscopy and computation to a variety of gas-phase cluster systems. A crucial component of the process of stratospheric ozone depletion is the action of polar stratospheric clouds (PSCs) to convert the reservoir species HCl and chlorine nitrate (ClONO2) to photochemically labile compounds. Quantum chemistry was used to explore one possible mechanism by which this activation is effected: Cl- + ClONO2 → Cl2 + NO3- (1). Correlated ab initio calculations predicted that the direct reaction of chloride ion with ClONO2 is facile, which was confirmed in an experimental kinetics study. In the reaction a weakly bound intermediate Cl2-NO3- is formed, with ~70% of the charge localized on the nitrate moiety. This enables the Cl2-NO3- cluster to be well solvated even in bulk solution, allowing (1) to be facile on PSCs. Quantum chemistry was also applied to the hydration of nitrosonium ion (NO+), an important process in the ionosphere. The calculations, in conjunction with an infrared spectroscopy experiment, revealed the structure of the gas-phase clusters NO+(H2O)n. The large degree of covalent interaction between NO+ and the lone pairs of the H2O ligands is contrasted with the weak electrostatic bonding between iodide ion and H2O. Finally, the competition between ion solvation and solvent self-association is explored for the gas-phase clusters Cl-(H2O)n and Cl-(NH3)n. For the case of water, vibrational predissociation spectroscopy reveals less hydrogen bonding among H2O ligands than predicted by ab initio calculations. Nevertheless, for n ≥ 5, cluster structure is dominated by water-water interactions, with Cl- only partially solvated by the

  16. Computer-assisted design for scaling up systems based on DNA reaction networks.

    Science.gov (United States)

    Aubert, Nathanaël; Mosca, Clément; Fujii, Teruo; Hagiya, Masami; Rondelez, Yannick

    2014-04-06

    In the past few years, there have been many exciting advances in the field of molecular programming, reaching a point where implementation of non-trivial systems, such as neural networks or switchable bistable networks, is a reality. Such systems require nonlinearity, be it through signal amplification, digitalization or the generation of autonomous dynamics such as oscillations. The biochemistry of DNA systems provides such mechanisms, but assembling them in a constructive manner is still a difficult and sometimes counterintuitive process. Moreover, realistic prediction of the actual evolution of concentrations over time requires a number of side reactions, such as leaks, cross-talks or competitive interactions, to be taken into account. In this case, the design of a system targeting a given function takes much trial and error before the correct architecture can be found. To speed up this process, we have created DNA Artificial Circuits Computer-Assisted Design (DACCAD), a computer-assisted design software that supports the construction of systems for the DNA toolbox. DACCAD is ultimately aimed to design actual in vitro implementations, which is made possible by building on the experimental knowledge available on the DNA toolbox. We illustrate its effectiveness by designing various systems, from Montagne et al.'s Oligator or Padirac et al.'s bistable system to new and complex networks, including a two-bit counter or a frequency divider as well as an example of very large system encoding the game Mastermind. In the process, we highlight a variety of behaviours, such as enzymatic saturation and load effect, which would be hard to handle or even predict with a simpler model. We also show that those mechanisms, while generally seen as detrimental, can be used in a positive way, as functional part of a design. Additionally, the number of parameters included in these simulations can be large, especially in the case of complex systems. For this reason, we included the
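
    The class of models DACCAD integrates can be caricatured with mass-action kinetics plus a saturable, Michaelis-Menten-like enzymatic step, which is where effects such as enzymatic saturation and load enter the dynamics. The rate constants below are illustrative assumptions, not DACCAD parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

def toolbox_node(t, y, vmax=2.0, Km=0.5, k_deg=0.3):
    """One autocatalytic DNA species: production by a saturable
    polymerase-like step, removal by a degradation enzyme. The
    Michaelis-Menten term is what produces load and saturation effects."""
    a = y[0]
    production = vmax * a / (Km + a)   # slows as the enzyme saturates
    return [production - k_deg * a]

sol = solve_ivp(toolbox_node, (0.0, 20.0), [0.01], max_step=0.1)
print(sol.y[0][-1])   # approaches the saturation-set plateau vmax/k_deg - Km
```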

  17. Computational Laboratory Astrophysics to Enable Transport Modeling of Protons and Hydrogen in Stellar Winds, the ISM, and other Astrophysical Environments

    Science.gov (United States)

    Schultz, David

    As recognized prominently by the APRA program, interpretation of NASA astrophysical mission observations requires significant products of laboratory astrophysics, for example, spectral lines and transition probabilities, electron-, proton-, or heavy-particle collision data. Availability of these data underpin robust and validated models of astrophysical emissions and absorptions, energy, momentum, and particle transport, dynamics, and reactions. Therefore, measured or computationally derived, analyzed, and readily available laboratory astrophysics data significantly enhances the scientific return on NASA missions such as HST, Spitzer, and JWST. In the present work a comprehensive set of data will be developed for the ubiquitous proton-hydrogen and hydrogen-hydrogen collisions in astrophysical environments including ISM shocks, supernova remnants and bubbles, HI clouds, young stellar objects, and winds within stellar spheres, covering the necessary wide range of energy- and charge-changing channels, collision energies, and most relevant scattering parameters. In addition, building on preliminary work, a transport and reaction simulation will be developed incorporating the elastic and inelastic collision data collected and produced. The work will build upon significant previous efforts of the principal investigators and collaborators, will result in a comprehensive data set required for modeling these environments and interpreting NASA astrophysical mission observations, and will benefit from feedback from collaborators who are active users of the work proposed.

  18. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    OpenAIRE

    MR.S.SUBBIAH; DR.S.SELVA MUTHUKUMARAN; DR.T.RAMKUMAR

    2013-01-01

    Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising design that provides a very flexible architecture, accessible through the internet. In the cloud computing environment the data may reside at any of the data centers. Because of this, a data center may leak the data stored there, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and...

  19. The Needs of Virtual Machines Implementation in Private Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Edy Kristianto

    2015-12-01

    The Internet of Things (IOT) has become the focus of the development of information and communication technology. Cloud computing has a very important role in supporting the IOT, because cloud computing allows the provision of services in the form of infrastructure (IaaS), platform (PaaS), and software (SaaS) for its users. One of the fundamental services is infrastructure as a service (IaaS). This study analyzed the requirements, based on the NIST framework, for realizing infrastructure as a service in the form of virtual machines built in a cloud computing environment.

  20. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  1. Computer modeling of the dynamics of surface tension on rotating fluids in low and microgravity environments

    Science.gov (United States)

    Hung, R. J.; Tsao, Y. D.; Hong, B. B.; Leslie, Fred W.

    1989-01-01

    Time-dependent evolutions of the profile of the free surface (bubble shapes) for a cylindrical container partially filled with a Newtonian fluid of constant density, rotating about its axis of symmetry, have been studied. Numerical computations have been carried out with the following situations: (1) linear functions of spin-up and spin-down in low- and microgravity environments, (2) linear functions of increasing and decreasing gravity environments at high- and low-rotating cylinder speeds, and (3) step functions of spin-up and spin-down in a low-gravity environment.

  2. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as the quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practice, virtual computing resources dynamically expanded or shrank as computing requirements changed. Additionally, the CPU utilization ratio of computing resources significantly increased when compared with traditional resource management. The system also performs well when there are multiple condor schedulers and multiple job queues.
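
    The dual-threshold control loop can be sketched simply. All numeric thresholds, the quota, and the batch size below are illustrative assumptions rather than IHEPCloud's production settings.

```python
def adjust_pool(queued_jobs, idle_vms, pool_size,
                expand_threshold=10, shrink_threshold=2,
                quota=100, batch=5):
    """Dual-threshold elastic scaling: expand the virtual-node pool when
    the job queue backs up (respecting the experiment's quota), shrink
    it when too many nodes sit idle."""
    if queued_jobs > expand_threshold and pool_size < quota:
        return min(pool_size + batch, quota)   # spawn more virtual nodes
    if idle_vms > shrink_threshold:
        return max(pool_size - batch, 0)       # retire idle virtual nodes
    return pool_size
```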

  3. [On the influence of local molecular environment on the redox potential of electron transfer cofactors in bacterial photosynthetic reaction centers].

    Science.gov (United States)

    Krasil'nikov, P M; Noks, P P; Rubin, A B

    2011-01-01

    The addition of cryosolvents (glycerol, dimethylsulfoxide) to a water solution containing bacterial photosynthetic reaction centers changes the redox potential of the bacteriochlorophyll dimer, but does not affect the redox potential of the quinone primary acceptor. It has been shown that the change in redox potential can be produced by changes in the electrostatic interactions between the cofactors and the local molecular environment as modified by the additives introduced into the solution. The degree of influence of a solvent on the redox potential of the various cofactors is determined by the degree of accessibility of these cofactors to solvent molecules, which depends on the arrangement of the cofactors in the structure of the reaction centers.

  4. Validating the Accuracy of Reaction Time Assessment on Computer-Based Tablet Devices.

    Science.gov (United States)

    Schatz, Philip; Ybarra, Vincent; Leitner, Donald

    2015-08-01

    Computer-based assessment has evolved to tablet-based devices. Despite the availability of tablets and "apps," there is limited research validating their use. We documented timing delays between stimulus presentation and (simulated) touch response on iOS devices (3rd- and 4th-generation Apple iPads) and Android devices (Kindle Fire, Google Nexus, Samsung Galaxy) at response intervals of 100, 250, 500, and 1,000 milliseconds (ms). Results showed significantly greater timing error on Google Nexus and Samsung tablets (81-97 ms) than on Kindle Fire and Apple iPads (27-33 ms). Within Apple devices, iOS 7 obtained significantly lower timing error than iOS 6. Simple reaction time (RT) trials (250 ms) on tablet devices represent 12% to 40% error (30-100 ms), depending on the device, which decreases considerably for choice RT trials (3-5% error at 1,000 ms). Results raise implications for using the same device for serial clinical assessment of RT using tablets, as well as the need for calibration of software and hardware. © The Author(s) 2015.

  5. Theoretical intercomparison of multi-step direct reaction models and computational intercomparison of multi-step direct reaction models

    International Nuclear Information System (INIS)

    Koning, A.J.

    1992-08-01

    In recent years several statistical theories have been developed concerning multistep direct (MSD) nuclear reactions. In addition, dominant in applications is a whole class of semiclassical models that may be subsumed under the heading of 'generalized exciton models'. These are basically MSD-type extensions on top of compound-like concepts. In this report the relationship between their underlying statistical MSD-postulates is highlighted. A common framework is outlined that enables one to generate the various MSD theories through assigning statistical properties to different parts of the nuclear Hamiltonian. Then it is shown that distinct forms of nuclear randomness are embodied in the mentioned theories. All these theories appear to be very similar at a qualitative level. In order to explain the high-energy tails and forward-peaked angular distributions typical for particles emitted in MSD reactions, it is imagined that the incident continuum particle stepwise loses its energy and direction in a sequence of collisions, thereby creating new particle-hole pairs in the target system. At each step emission may take place. The statistical aspect comes in because many continuum states are involved in the process. These are supposed to display chaotic behavior, the associated randomness assumption giving rise to important simplifications in the expression for MSD emission cross sections. This picture suggests that the mentioned MSD models can be interpreted as variants of essentially one and the same theory. However, this appears not to be the case. To show this, the usual MSD distinction within the composite reacting nucleus is made explicit: the nucleons of the residual core are to be distinguished from those of the fast leading particle. This distinction will turn out to be crucial to the present analysis. 27 refs.; 5 figs.; 1 tab

  6. Secondary School Students' Attitudes towards Mathematics Computer--Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  7. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  8. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    Science.gov (United States)

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  9. Computer Graphics Orientation and Training in a Corporate/Production Environment.

    Science.gov (United States)

    McDevitt, Marsha Jean

    This master's thesis provides an overview of a computer graphics production environment and proposes a realistic approach to orientation and on-going training for employees working within a fast-paced production schedule. Problems involved in meeting the training needs of employees are briefly discussed in the first chapter, while the second…

  10. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    Science.gov (United States)

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  11. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  12. Understanding Student Retention in Computer Science Education: The Role of Environment, Gains, Barriers and Usefulness

    Science.gov (United States)

    Giannakos, Michail N.; Pappas, Ilias O.; Jaccheri, Letizia; Sampson, Demetrios G.

    2017-01-01

    Researchers have been working to understand the high dropout rates in computer science (CS) education. Despite the great demand for CS professionals, little is known about what influences individuals to complete their CS studies. We identify gains of studying CS, the (learning) environment, degree's usefulness, and barriers as important predictors…

  13. Encountering the Expertise Reversal Effect with a Computer-Based Environment on Electrical Circuit Analysis

    Science.gov (United States)

    Reisslein, Jana; Atkinson, Robert K.; Seeling, Patrick; Reisslein, Martin

    2006-01-01

    This study examined the effectiveness of a computer-based environment employing three example-based instructional procedures (example-problem, problem-example, and fading) to teach series and parallel electrical circuit analysis to learners classified by two levels of prior knowledge (low and high). Although no differences between the…

  14. Detecting and Understanding the Impact of Cognitive and Interpersonal Conflict in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Prata, David Nadler; Baker, Ryan S. J. d.; Costa, Evandro d. B.; Rose, Carolyn P.; Cui, Yue; de Carvalho, Adriana M. J. B.

    2009-01-01

    This paper presents a model which can automatically detect a variety of student speech acts as students collaborate within a computer supported collaborative learning environment. In addition, an analysis is presented which gives substantial insight as to how students' learning is associated with students' speech acts, knowledge that will…

  15. MOO: Using a Computer Gaming Environment to Teach about Community Arts

    Science.gov (United States)

    Garber, Elizabeth

    2004-01-01

    In this paper, the author discusses the use of an interactive computer technology, "MOO" (Multi-user domain, Object-Oriented), in her art education classes for preservice teachers. A MOO is a text-based environment wherein interactivity is centered on text exchanges made between users based on problems or other materials created by teachers. The…

  16. Intramolecular Diels-Alder reactions of pyrimidines, a synthetic and computational study

    NARCIS (Netherlands)

    Stolle, W.A.W.

    1992-01-01

    This thesis deals with an investigation of the ring-transformation reactions of 2- and 5-(ω-alkynyl)pyrimidine derivatives, which upon heating undergo an intramolecular Diels-Alder reaction and subsequently a spontaneous retro-Diels-Alder reaction. To get a better insight into the

  17. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    International Nuclear Information System (INIS)

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.

    1995-01-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  18. NOSTOS: a paper-based ubiquitous computing healthcare environment to support data capture and collaboration.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2003-01-01

    In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.

  19. ReaDDy--a software for particle-based reaction-diffusion dynamics in crowded cellular environments.

    Directory of Open Access Journals (Sweden)

    Johannes Schöneberg

    We introduce the software package ReaDDy for simulation of detailed spatiotemporal mechanisms of dynamical processes in the cell, based on reaction-diffusion dynamics with particle resolution. In contrast to other particle-based reaction kinetics programs, ReaDDy supports particle interaction potentials. This permits effects such as space exclusion, molecular crowding and aggregation to be modeled. The biomolecules simulated can be represented as a sphere, or as a more complex geometry such as a domain structure or polymer chain. ReaDDy bridges the gap between small-scale but highly detailed molecular dynamics or Brownian dynamics simulations and large-scale but little-detailed reaction kinetics simulations. ReaDDy has a modular design that enables the exchange of the computing core by efficient platform-specific implementations or dynamical models that are different from Brownian dynamics.
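
    As a rough illustration of the simulation class this record describes - overdamped Brownian dynamics of interacting particles with distance-triggered bimolecular reactions - the following sketch uses a harmonic repulsion to mimic space exclusion. All parameter values and names are illustrative assumptions; this is not the ReaDDy API.

        import numpy as np

        rng = np.random.default_rng(0)
        D, dt, kT = 1.0, 1e-3, 1.0        # diffusion constant, time step, thermal energy
        k_rep, sigma = 10.0, 0.5          # harmonic repulsion strength, particle diameter
        R_react, p_react = 0.3, 0.1       # reaction radius, per-step reaction probability

        pos = rng.uniform(0.0, 5.0, size=(50, 2))   # 50 particles of species A in a 2-D box
        alive = np.ones(len(pos), dtype=bool)

        for step in range(1000):
            # Pairwise harmonic repulsion models space exclusion / crowding.
            diff = pos[:, None, :] - pos[None, :, :]
            dist = np.linalg.norm(diff, axis=-1) + np.eye(len(pos))
            overlap = np.clip(sigma - dist, 0.0, None)
            force = ((k_rep * overlap / dist)[..., None] * diff).sum(axis=1)
            # Overdamped (Brownian) dynamics step.
            pos = pos + (D / kT) * force * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=pos.shape)
            # Bimolecular reaction A + A -> 0 when two particles come within R_react.
            for i in range(len(pos)):
                for j in range(i + 1, len(pos)):
                    if alive[i] and alive[j] and dist[i, j] < R_react and rng.random() < p_react:
                        alive[i] = alive[j] = False
            pos, alive = pos[alive], alive[alive]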

  20. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  1. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies.

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar

    2017-09-12

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE was carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the review and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels, with methods based on direct measurement of the ground reactions showing the highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show the highest practicality of the three classes of methods reviewed. Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the
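
    The kinematics-based class of methods reviewed here reduces, in its simplest form, to Newton's second law applied to the body centre of mass, F = m(a + g). A minimal sketch under that assumption, with a synthetic acceleration trace standing in for IMU-derived data:

        import numpy as np

        m = 75.0                        # body mass, kg (assumed)
        g = np.array([0.0, 0.0, 9.81])  # gravitational acceleration, m/s^2
        t = np.linspace(0.0, 1.0, 100)  # one second of gait sampled at 100 Hz
        # Synthetic vertical centre-of-mass acceleration at a ~2 Hz step rate.
        a_com = np.zeros((len(t), 3))
        a_com[:, 2] = 3.0 * np.sin(2.0 * np.pi * 2.0 * t)

        grf = m * (a_com + g)           # tri-axial ground reaction force, N
        print(grf[:, 2].max())          # peak vertical component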

  2. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies

    Directory of Open Access Journals (Sweden)

    Erfan Shahabpoor

    2017-09-01

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE was carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the review and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels, with methods based on direct measurement of the ground reactions showing the highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show the highest practicality of the three classes of methods reviewed. Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and

  3. Gas-Phase Reactions of Dimethyl Disulfide with Aliphatic Carbanions - A Mass Spectrometry and Computational Study

    Science.gov (United States)

    Franczuk, Barbara; Danikiewicz, Witold

    2018-03-01

    Ion-molecule reactions of Me2S2 with a wide range of aliphatic carbanions differing in structure and proton affinity values have been studied in the gas phase using mass spectrometry techniques and DFT calculations. The analysis of the spectra shows a variety of product ions formed via different reaction mechanisms, depending on the structure and proton affinity of the carbanion. Product ions of the thiophilic reaction (m/z 47), SN2 (m/z 79), and the E2 elimination-addition reaction sequence (m/z 93) can be observed. Primary products of the thiophilic reaction can undergo subsequent SN2 and proton transfer reactions. Gibbs free energy profiles calculated for the experimentally observed reactions using the PBE0/6-311+G(2d,p) method show good agreement with the experimental results.

  4. Reaction Norms in Natural Conditions: How Does Metabolic Performance Respond to Weather Variations in a Small Endotherm Facing Cold Environments?

    Science.gov (United States)

    Petit, Magali; Vézina, François

    2014-01-01

    Reaction norms reflect an organism's capacity to adjust its phenotype to the environment and allow for identifying trait values associated with physiological limits. However, reaction norms of physiological parameters are mostly unknown for endotherms living in natural conditions. Black-capped chickadees (Poecile atricapillus) increase their metabolic performance during winter acclimatization and are thus a good model for measuring reaction norms in the wild. We repeatedly measured basal (BMR) and summit (Msum) metabolism in chickadees to characterize, for the first time in a free-living endotherm, reaction norms of these parameters across the natural range of weather variation. BMR varied between individuals and was weakly and negatively related to minimal temperature. Msum varied with minimal temperature following a Z-shaped curve, increasing linearly between 24°C and −10°C, and changed with absolute humidity following a U-shaped relationship. These results suggest that thermal exchanges with the environment have minimal effects on maintenance costs, which may be individual-dependent, while thermogenic capacity responds to body heat loss. Our results also suggest that BMR and Msum respond to different and likely independent constraints. PMID:25426860
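
    An individual linear reaction norm of the kind estimated here is simply a within-individual regression of the repeated metabolic measurements on the weather variable. A minimal sketch with synthetic data over the linear range reported above (about 24°C to −10°C); all values are illustrative assumptions:

        import numpy as np

        t_min = np.array([24.0, 15.0, 8.0, 0.0, -5.0, -10.0])  # minimal temperature, deg C
        msum = np.array([5.1, 5.6, 6.0, 6.4, 6.9, 7.2])        # repeated Msum measures, W

        slope, intercept = np.polyfit(t_min, msum, 1)
        print(f"reaction norm: Msum = {intercept:.2f} {slope:+.3f} * Tmin")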

  5. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards

  6. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  7. Offspring reaction norms shaped by parental environment: interaction between within- and trans-generational plasticity of inducible defenses.

    Science.gov (United States)

    Luquet, Emilien; Tariel, Juliette

    2016-10-12

    Within-generational plasticity (WGP) and transgenerational plasticity (TGP) are mechanisms allowing rapid adaptive responses to fluctuating environments without genetic change. These forms of plasticity have often been viewed as independent processes. Recent evidence suggests that WGP is altered by the environmental conditions experienced by previous generations (i.e., TGP). In the context of inducible defenses, one of the most studied cases of plasticity, the WGP x TGP interaction has been poorly investigated. We provide evidence that TGP can alter the reaction norms of inducible defenses in a freshwater snail. The WGP x TGP interaction patterns are trait-specific and lead to a decreased slope of reaction norms (behaviour and shell thickness). Offspring from induced parents showed higher predator avoidance behaviour and a thicker shell than snails from non-induced parents in the no-predator-cue environment, while they reached similar defense levels in the predator-cue environment. The WGP x TGP interaction further leads to a switch from a plastic towards a constitutive expression of defenses for shell dimensions (flat reaction norm). WGP alteration by TGP may shape the adaptive responses to environmental change and is therefore of substantial importance for understanding the evolution of plasticity.

  8. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve the tedious task of reconstructing gene networks through testing experimentally the possible interactions between genes, it has become a trend to adopt an automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: most popular is the MapReduce programming model, a fault-tolerant framework to implement parallel algorithms for inferring large gene networks. This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high
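
    The core of the approach is that each candidate parameter set in the population can be scored independently, which is what makes the map phase of MapReduce applicable. A minimal sketch of that idea, using multiprocessing as a stand-in for the Hadoop layer; the fitness function is a placeholder for the error between simulated and measured gene-expression profiles:

        import numpy as np
        from multiprocessing import Pool

        def fitness(params):
            # Placeholder: in gene-network inference this would simulate the
            # network model and score it against observed expression profiles.
            return float(np.sum((params - 0.5) ** 2))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            population = [rng.random(20) for _ in range(64)]  # candidate parameter sets
            with Pool() as pool:                              # "map" phase: parallel scoring
                scores = pool.map(fitness, population)
            best = population[int(np.argmin(scores))]         # "reduce" phase: pick the winner
            print(min(scores))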

  9. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable and rely on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation of world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  10. The reaction of glass during gamma irradiation in a saturated tuff environment

    International Nuclear Information System (INIS)

    Ebert, W.L.; Bates, J.K.; Gerding, T.J.

    1990-05-01

    The reaction between tuffaceous groundwater and actinide-doped SRL 165 and PNL 76-68 type glasses in a gamma radiation field has been studied at 90°C for periods up to 278 days. The primary effect of the radiation field was the acidification of the leachate through the production of nitrogen acids. Acidification of the leachate was limited by bicarbonate in the groundwater, for all exposures tested. Nonirradiated experiments were performed to represent the lowest limit of radiation exposure. Both irradiated and nonirradiated experiments were performed with and without a tuff monolith present in the reaction vessel. Neither irradiation nor the presence of tuff had a major effect on the extent of glass reaction as measured by the leachate concentrations of various glass species or analysis of the reacted glass surfaces. This report discusses the results of leaching experiments performed in a gamma radiation field and in the absence of a radiation field. 28 refs., 47 figs., 11 tabs

  11. Modeling-gas phase reactions in indoor environments using computational fluid dynamics

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Weschler, Charles J.

    2002-01-01

    This CFD modeling study examines the concentrations of two gaseous compounds that react in an indoor setting to produce a hypothetical product. The reactants are ozone and either d-limonene or alpha-terpinene (which reacts with ozone about 40 times faster than d-limonene). In addition to two different terpenes, the scenarios include two air exchange rates (0.5 and 2.0 h(-1)). The terpene is introduced as a floor source with an emission pattern similar to a floor-care product. These four scenarios have been set in a fairly large two-dimensional room (13.6 x 40.6 m) with a supply at the top of the left wall and an exhaust at the bottom of the right wall. The room has been deliberately scaled so that the Reynolds numbers for key flow regimes match those of a room in which the calculated flow field has been validated against measured data. It has been further assumed that ozone interacts with room
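
    For intuition, the chemistry that the CFD model resolves spatially can be written as a well-mixed single-zone balance: ozone enters with the supply air, terpene is emitted from the floor, both leave with the exhaust at the air exchange rate, and both are consumed by the bimolecular reaction. A sketch with illustrative rate constants and concentrations, not the paper's values:

        from scipy.integrate import solve_ivp

        lam = 2.0 / 3600.0                 # air exchange rate: 2.0 per hour, in 1/s
        k = 1.5e-4                         # bimolecular rate constant, 1/(ppb*s) (assumed)
        o3_supply, terp_emit = 30.0, 0.01  # supply-air ozone (ppb), floor emission (ppb/s)

        def rhs(t, y):
            o3, terp, prod = y
            r = k * o3 * terp              # ozone-terpene reaction rate
            return [lam * (o3_supply - o3) - r,   # ozone: supply, exhaust, reaction
                    terp_emit - lam * terp - r,   # terpene: emission, exhaust, reaction
                    r - lam * prod]               # product: formation, exhaust

        sol = solve_ivp(rhs, (0.0, 4.0 * 3600.0), [30.0, 0.0, 0.0], max_step=10.0)
        print(sol.y[2, -1])                # product concentration after 4 h, ppb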

  12. Secure encapsulation and publication of biological services in the cloud computing environment.

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  13. Sulphation reactions of oxidic dust particles in waste heat boiler environment. Literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ranki, T.

    1999-09-01

    Sulphation of metal oxides has an important role in many industrial processes. In different applications sulphation reactions have different aims and characteristics. In the flash smelting process, sulphation of oxidic flue dust is a spontaneous and inevitable phenomenon which takes place in the waste heat boiler (WHB) when cooling down hot dust-laden off-gases from sulphide smelters. Oxidic dust particles (size 0-50 μm) react with O2 and SO2 or SO3 in a certain temperature range (500-800 °C). Sulphation reactions are highly exothermic, releasing a large amount of heat, which affects the gas cooling and thermal performance of the boiler. The thermodynamics and kinetics of the system have to be known to improve the process and WHB operation. The rate of sulphation is affected by the prevailing conditions (temperature, gas composition) and particle size and microstructure (porosity, surface area). Some metal oxides (CuO) can react readily with SO2 and O2 and act as self-catalysts, but others (NiO) require the presence of an external catalyst to enhance the SO3 formation and for sulphation to proceed. Some oxides (NiO) sulphate directly; some (CuO) may first form intermediate phases (basic sulphates) depending on the reaction conditions. Thus, the reaction mechanisms are very complex. The aim of this report was to search for information about the factors affecting the dust sulphation reactions and the suggested reaction mechanisms and kinetics. Many investigators have studied sulphation thermodynamics and the reaction kinetics and mechanisms of macroscopic metal oxide pieces, but only few articles have been published about sulphation of microscopic particles like dust. All the microscale studies found dealt with sulphation reactions of calcium oxide, which is not present in the flash smelting process but is used as an SO2 absorbent in combustion processes. However, also these investigations may give some hints about the sulphation

  14. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

    As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would increase robustness, efficiency, flexibility, and advanced manoeuvrability.

  15. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume aims at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data Analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and to recent techniques and environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting Parallel, Grid, and Cloud computing environments.

  16. Genotype by environment interaction for litter size in pigs as quantified by reaction norms analysis

    DEFF Research Database (Denmark)

    Knap, P W; Su, G

    2008-01-01

    A Bayesian procedure was used to estimate linear reaction norms (i.e. individual G × E plots) on 297 518 litter size records of 121 104 sows, daughters of 2040 sires, recorded on 144 farms in North and Latin America, Europe, Asia and Australia. The method allowed for simultaneous estimation of al...

  17. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

    This paper describes software called the Computer Assisted Reduced Mechanism Problem Solving Environment (CARM-PSE) that gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and the codes that simulate perfectly stirred reactors and plug flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor and for selective non-catalytic reduction of NOx in coal combustion flue gas.
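
    The kind of batch comparison CARM-PSE automates can be pictured as a sweep over reactor conditions, running detailed and reduced chemistry side by side and reporting the error. In this sketch, run_psr_detailed and run_psr_reduced are hypothetical placeholders for the stirred-reactor solvers the environment wraps, not real APIs, and the returned temperatures are synthetic:

        import itertools

        def run_psr_detailed(T, phi):   # hypothetical detailed-kinetics PSR solve
            return 1800.0 + 50.0 * phi - 0.1 * (T - 1000.0)

        def run_psr_reduced(T, phi):    # hypothetical reduced-mechanism PSR solve
            return 1795.0 + 52.0 * phi - 0.1 * (T - 1000.0)

        temperatures = range(900, 1301, 100)   # inlet temperature, K
        equiv_ratios = [0.6, 0.8, 1.0, 1.2]    # equivalence ratio

        for T, phi in itertools.product(temperatures, equiv_ratios):
            detailed = run_psr_detailed(T, phi)
            reduced = run_psr_reduced(T, phi)
            err = abs(reduced - detailed) / detailed * 100.0
            print(f"T={T} K  phi={phi:.1f}  error={err:.2f}%")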

  18. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  19. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
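
    The figure of merit in this abstract is the trace distance D(ρ, σ) = ½‖ρ − σ‖₁ between the real and ideal logical states. It can be evaluated directly for small density matrices; the states below are illustrative:

        import numpy as np

        def trace_distance(rho, sigma):
            # Trace norm of a Hermitian matrix = sum of |eigenvalues|.
            return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

        ideal = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|
        real = np.array([[0.95, 0.0], [0.0, 0.05]])  # slightly decohered state
        print(trace_distance(ideal, real))           # 0.05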

  20. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki

    2009-01-01

    A new computational methodology for the sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with the phase of sodium as a reactant. One is the surface reaction model, in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat leads to vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium; this is described by the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent-gas, multiphase thermal hydraulics simulation method with compressibility (named the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, a maximum gas temperature of approximately 1,300°C is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum mass-weighted average temperature in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing theoretical and mechanistic modeling of secondary failure propagation of heat transfer tubes due to phenomena such as overheating rupture and wastage. (author)

  1. Computer aided design, analysis and experimental investigation of membrane assisted batch reaction-separation systems

    DEFF Research Database (Denmark)

    Mitkowski, Piotr Tomasz; Buchaly, Carsten; Kreis, Peter

    2009-01-01

    Membrane assisted batch reaction operation offers an interesting option for equilibrium limited reaction systems in chemical and biochemical manufacturing by selective removal of one of the products, thereby increasing the product yield. The design of such hybrid systems needs to take into account the performance of each constituent element, and the optimisation of the design must take into consideration their interdependency. In this paper the use of a membrane to assist in the synthesis of propyl-propionate is investigated through the use of a hybrid process design framework, which consists of … and separation functionalities to design/analyse the hybrid scheme. The generated hybrid scheme has been validated through experiments involving an esterification reaction.

  2. The effect of crossmodal congruency between ambient scent and the store environment on consumer reactions

    OpenAIRE

    Adams, Carmen; Doucé, Lieve

    2016-01-01

    Previous research found that ambient scents used by retailers should be pleasant and product congruent. This paper proposes that an ambient scent should also be crossmodally congruent with the store environment. Crossmodal congruency refers to the shared crossmodal correspondences (i.e., tendency of a sensory attribute to be associated with an attribute in another sense) of the ambient scent and the store environment. In this study, a scent crossmodally congruent with the store, a scent cross...

  3. A Drawing and Multi-Representational Computer Environment for Beginners' Learning of Programming Using C: Design and Pilot Formative Evaluation

    Science.gov (United States)

    Kordaki, Maria

    2010-01-01

    This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for C programming using Geometrical Objects) for beginners learning computer programming in C. In its design, constructivist and social learning theories were taken into…

  4. Effects Of Social Networking Sites (SNSs) On Hyper Media Computer Mediated Environments (HCMEs)

    OpenAIRE

    Yoon C. Cho

    2011-01-01

    Social Networking Sites (SNSs) are known as tools to interact and build relationships between users/customers in Hyper Media Computer Mediated Environments (HCMEs). This study explored how social networking sites play a significant role in communication between users. While numerous researchers have examined the effectiveness of social networking websites, few studies have investigated which factors affect customers' attitudes and behavior toward social networking sites. In this paper, the authors inv...

  5. Do Social Computing Make You Happy? A Case Study of Nomadic Children in Mixed Environments

    DEFF Research Database (Denmark)

    Christensen, Bent Guldbjerg

    2005-01-01

    In this paper I describe a perspective on ambient, ubiquitous, and pervasive computing called the happiness perspective. By using the happiness perspective, the application domain and how the technology is used and experienced become a central and integral part of perceiving ambient technology. I will use the perspective in a case study on field test experiments with nomadic children in mixed environments using the eBag system.

  6. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  7. Applications of the pipeline environment for visual informatics and genomics computations

    Directory of Open Access Journals (Sweden)

    Genco Alex

    2011-07-01

    Background: Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results: This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions: The LONI Pipeline environment (http://pipeline.loni.ucla.edu) provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The

  8. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Xi Huang

    2018-01-01

    Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC: real-time communication that takes place between human beings via the instrumentality of computers in forms of text, audio and video communication, such as live chat and chatrooms) as socially-oriented meaning construction. This review begins by considering the adoption of social interactionist views to identify key paradigms and supportive principles of computer-supported collaborative learning. A special focus on two components of communicative competence is then presented to explore interactional variables in synchronous computer-mediated communication, along with a review of research. There follows a discussion of a synthesis of interactional variables in negotiated interaction and co-construction of knowledge from psycholinguistic and social cohesion perspectives. This review reveals both possibilities and disparities of language socialization in promoting intersubjective learning and diversifying the salient use of interactively creative language in computer-supported collaborative learning environments in service of communicative competence.

  9. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. The continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The developments and applications of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation for a single dust storm event may take several days or hours to run. This seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, the computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with other geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor impacting the feasibility of parallelization. The allocation algorithm needs to carefully leverage the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) In order to get optimized solutions, a
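
    A minimal version of the allocation problem described here is to assign subdomains to nodes so that the maximum per-node load stays small. The greedy sketch below assigns each subdomain to the currently least-loaded node; it ignores the communication cost between adjacent subdomains that the presented algorithms also leverage, and all cost values are synthetic:

        import heapq

        subdomain_cost = [9.0, 7.5, 6.0, 5.5, 4.0, 3.5, 2.0, 1.5]  # est. compute cost
        n_nodes = 3

        heap = [(0.0, node, []) for node in range(n_nodes)]  # (load, node id, subdomains)
        heapq.heapify(heap)
        for sd, cost in sorted(enumerate(subdomain_cost), key=lambda x: -x[1]):
            load, node, assigned = heapq.heappop(heap)       # least-loaded node first
            heapq.heappush(heap, (load + cost, node, assigned + [sd]))

        for load, node, assigned in sorted(heap, key=lambda x: x[1]):
            print(f"node {node}: subdomains {assigned}, load {load:.1f}")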

  10. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    Science.gov (United States)

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    Cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on-demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristics optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, scientific applications scheduling techniques using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement rate on the makespan that ranges from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is suitable for scientific applications task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239

  11. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    Cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on-demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristics optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, scientific applications scheduling techniques using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement rate on the makespan that ranges from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is suitable for scientific applications task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.
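
    The two metrics reported above can be computed for any candidate task-to-VM assignment, which is how scheduling heuristics such as GBLCA score solutions. A toy evaluation with synthetic task lengths and VM speeds; this illustrates the scoring only, not the GBLCA search itself:

        task_len = [400, 250, 900, 120, 600, 300]  # task sizes, MI (synthetic)
        vm_speed = [100, 150]                      # VM speeds, MIPS (synthetic)
        assignment = [0, 1, 1, 0, 1, 0]            # task i runs on VM assignment[i]

        finish = [0.0] * len(vm_speed)
        response = []
        for length, vm in zip(task_len, assignment):
            finish[vm] += length / vm_speed[vm]    # tasks queue back-to-back per VM
            response.append(finish[vm])            # completion time of this task

        print("makespan:", max(finish))
        print("mean response time:", sum(response) / len(response))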

  12. Study on User Authority Management for Safe Data Protection in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Su-Hyun Kim

    2015-03-01

    In cloud computing environments, user data are encrypted using numerous distributed servers before storing such data. Global Internet service companies, such as Google and Yahoo, recognized the importance of Internet service platforms and conducted self-research and development to create and utilize large cluster-based cloud computing platform technology based on low-priced commercial nodes. As diverse data services become possible in distributed computing environments, high-capacity distributed management is emerging as a major issue. Meanwhile, because of the diverse forms of using high-capacity data, security vulnerability and privacy invasion by malicious attackers or internal users can occur. As such, when various sensitive data are stored in cloud servers and used from there, the problem of data leakage might occur because of external attackers or the poor management of internal users. Data can be managed through encryption to prevent such problems. However, existing simple encryption methods involve problems associated with the management of access to data stored in cloud environments. Therefore, in the present paper, a technique for data access management by user authority, based on Attribute-Based Encryption (ABE) and secret distribution techniques, is proposed.
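
    One of the two ingredients named here, secret distribution, is classically realized by Shamir's k-of-n secret sharing. A minimal sketch over a prime field, illustrative only; the paper combines such sharing with attribute-based encryption, which is not shown:

        import random

        P = 2**127 - 1   # prime field modulus (illustrative choice)

        def make_shares(secret, k, n):
            # Random degree-(k-1) polynomial with the secret as constant term.
            coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
            return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                    for x in range(1, n + 1)]

        def reconstruct(shares):
            # Lagrange interpolation of the polynomial at x = 0.
            total = 0
            for i, (xi, yi) in enumerate(shares):
                num, den = 1, 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                total = (total + yi * num * pow(den, P - 2, P)) % P
            return total

        shares = make_shares(123456789, k=3, n=5)
        print(reconstruct(shares[:3]))   # any 3 of the 5 shares recover the secret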

  13. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  14. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  15. The effects of gamma radiation on groundwater chemistry and glass reaction in a saturated tuff environment

    International Nuclear Information System (INIS)

    Ebert, W.L.; Bates, J.K.; Gerding, T.J.; Van Konynenburg, R.A.

    1986-12-01

    The Nevada Nuclear Waste Storage Investigations project has completed a series of experiments that provide insight into groundwater chemistry and glass waste form performance in the presence of a gamma radiation field at 90°C. Results from experiments done at 1 × 10³ and 0 R/hr are presented and compared to similar experiments done at 2 × 10⁵ and 1 × 10⁴ R/hr. The major effect of radiation is to lower the groundwater pH to a value near 6.4. The addition of glass to the system results in a slightly more basic final pH, both in the presence and absence of radiation. However, there is essentially no difference in the extent of glass reaction, as measured by elemental release, as a function of dose rate or total dose, for reaction periods up to 278 days

  16. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, they might help to initiate their contact with modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done at 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in a VR environment. A characterization questionnaire, the short version of the Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  17. A Novel Computational Method to Reduce Leaky Reaction in DNA Strand Displacement

    Directory of Open Access Journals (Sweden)

    Xin Li

    2015-01-01

    The DNA strand displacement technique is widely used in DNA programming, DNA biosensors, and gene analysis. In DNA strand displacement, leaky reactions can cause DNA signals to decay and DNA signal detection to fail. The most commonly used method to avoid leakage is cleaning up after upstream leaky reactions, and it remains a challenge to develop reliable DNA strand displacement techniques with low leakage. In this work, we address the challenge by experimentally evaluating the basic factors influencing leakage in DNA strand displacement, including reaction time, ratio of reactants, and ion concentration. Specifically, fluorescent probes and a hairpin-structure reporting DNA strand are designed to detect the output of DNA strand displacement, and thus can evaluate the leakage of DNA strand displacement reactions with different reaction times, ratios of reactants, and ion concentrations. From the obtained data, mathematical models for evaluating leakage are achieved by curve derivation. As a result, it is found that long incubation times, high concentrations of fuel strand, and inappropriate ion concentrations can weaken leaky reactions. This contributes to a method for setting proper reaction conditions to reduce leakage in DNA strand displacement.
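
    The curve-derivation step described here amounts to fitting a parametric leakage model to measured signal versus incubation time. A sketch with a saturating-exponential model and synthetic data; both the model form and the numbers are assumptions, not the paper's:

        import numpy as np
        from scipy.optimize import curve_fit

        def leak_model(t, L_max, k):
            return L_max * (1.0 - np.exp(-k * t))   # leakage saturating at L_max

        t_hours = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 24.0])
        leak_frac = np.array([0.01, 0.02, 0.04, 0.07, 0.11, 0.15, 0.16])  # synthetic

        popt, _ = curve_fit(leak_model, t_hours, leak_frac, p0=[0.2, 0.1])
        print(f"L_max = {popt[0]:.3f}, k = {popt[1]:.3f} per hour")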

  18. Photochemical reaction between triclosan and nitrous acid in the atmospheric aqueous environment

    Science.gov (United States)

    Ma, Jianzhong; Zhu, Chengzhu; Lu, Jun; Lei, Yu; Wang, Jizhong; Chen, Tianhu

    2017-05-01

    Nitrous acid (HONO) is an important tropospheric pollutant and a major source of hydroxyl radical in the atmospheric gas phase. However, studies on the role of HONO in atmospheric aqueous-phase chemistry are relatively few. The present work investigated the photochemical reaction of HONO with triclosan (TCS), an emerging contaminant, using a combination of laser flash photolysis spectrometry and gas chromatography mass spectrometry. With these techniques, the reaction pathway of HONO with TCS was proposed by directly monitoring the transient species and detecting the stable products. ·OH was generated from the photodissociation of the HONO aqueous solution and attacked TCS molecules at different sites to produce TCS-OH adducts with a second-order rate constant of 1.11 × 10⁹ L mol⁻¹ s⁻¹. The ·OH added to a C atom adjacent to the ether bond in the aromatic ring of TCS, and the adduct self-decayed when the ether bond broke. The intermediates generated from the addition of ·OH to the benzene ring of the TCS molecular structure were immediately nitrated by HONO, which played a key role in the formation of nitro-compounds. An atmospheric model suggests that the aqueous oxidation of TCS by ·OH is the major reaction at high liquid water concentrations, while the photolysis of TCS dominates under low-humidity conditions.
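
    Given the reported second-order rate constant, assuming a steady-state aqueous ·OH concentration turns the TCS loss into pseudo-first-order kinetics, from which a half-life follows. The [·OH] value below is an illustrative assumption, not a value from the paper:

        k2 = 1.11e9      # second-order rate constant, L/(mol*s), from the abstract
        oh_ss = 1.0e-14  # assumed steady-state [.OH], mol/L (illustrative)

        k1 = k2 * oh_ss              # pseudo-first-order rate constant, 1/s
        half_life_s = 0.693 / k1     # t_1/2 = ln(2) / k1
        print(f"TCS half-life ~ {half_life_s / 3600.0:.1f} h")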

  19. Hydrogen production from water gas shift reaction in a high gravity (Higee) environment using a rotating packed bed

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Wei-Hsin; Syu, Yu-Jhih [Department of Greenergy, National University of Tainan, Tainan 700 (China)

    2010-10-15

    Hydrogen production via the water gas shift reaction (WGSR) was investigated in a high-gravity environment. A rotating packed bed (RPB) reactor containing a Cu-Zn catalyst and spinning in the range of 0-1800 rpm was used to create high centrifugal force. The reaction temperature and the steam/CO ratio ranged from 250 to 350 °C and 2 to 8, respectively. A dimensionless parameter, the G number, was derived to account for the effect of centrifugal force on the enhancement of the WGSR. At a rotor speed of 1800 rpm, the induced centrifugal force acting on the reactants was as high as 234 g on average in the RPB. As a result, the CO conversion from the WGSR was increased by up to 70% compared to that without rotation. This clearly revealed that the centrifugal force was conducive to hydrogen production, resulting from intensified mass transfer and an elongated path of the reactants in the catalyst bed. From Le Chatelier's principle, a higher reaction temperature or a lower steam/CO ratio disfavors CO conversion; however, under such conditions the enhancement of hydrogen production by the centrifugal force tended to become more significant. Accordingly, a correlation between the enhancement of CO conversion and the G number was established. As a whole, the higher the reaction temperature and the lower the steam/CO ratio, the higher the exponent of the G number function and the stronger the effect of the centrifugal force on the WGSR. (author)
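
    The G number compares centrifugal to gravitational acceleration, G = ω²r/g. Back-computing the mean packing radius implied by the reported average of 234 g at 1800 rpm, assuming simple rigid rotation:

        import math

        g = 9.81                                 # m/s^2
        omega = 1800.0 / 60.0 * 2.0 * math.pi    # 1800 rpm in rad/s
        G = 234.0                                # reported average G number
        r_mean = G * g / omega**2                # implied mean packing radius, m
        print(f"omega = {omega:.1f} rad/s, mean radius ~ {r_mean * 100.0:.1f} cm")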

  20. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resources to be synchronized and burst between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance, based on communications between ECITE and the participant projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents, or data without worrying about the heterogeneity in structure and operations among the different cloud platforms.

  1. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present. It is also the development trend for the design of complex electromechanical systems. Unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services that run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application involving a pure electric vehicle was tested on WebMWorks. The results of the simulation and a parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of a complex electromechanical system.

  2. Rate constant computation on some elementary reactions of Hg during combustion

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qing; Yang, Bo-wen; Bai, Jing-ru [Northeast Dianli Univ., Jilin (China). Inst. of Energy and Power Engineering

    2013-07-01

    The geometry optimizations of reactants, products, and transition states were performed with the quantum chemistry MP2 method, using the SDD basis set for Hg and 6-311++G(3df,3pd) for the other atoms. The stable minima were validated by vibrational frequency analysis. Furthermore, the microscopic mechanisms of the reactions were investigated by ab initio quantum chemistry calculations. On the basis of the optimized geometries, reaction rate constants within 298-2,000 K were calculated neither from experimental data nor by estimation, but directly with the quantum chemistry software Khimera.
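
    Once a barrier and prefactor are in hand, tabulating rate constants over 298-2,000 K reduces to evaluating an Arrhenius-type expression on a temperature grid, which is essentially what a package such as Khimera automates. The sketch below shows that evaluation; the prefactor and activation energy are hypothetical placeholders, not values from the study.

        import math

        R  = 8.314462    # J mol^-1 K^-1
        A  = 1.0e13      # s^-1, hypothetical pre-exponential factor
        Ea = 120.0e3     # J mol^-1, hypothetical activation energy

        # Arrhenius rate constant k(T) = A * exp(-Ea / (R * T)).
        for T in (298.0, 600.0, 1000.0, 1500.0, 2000.0):
            k = A * math.exp(-Ea / (R * T))
            print(f"T = {T:6.0f} K   k = {k:.3e} s^-1")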

  3. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads, ranging from simulation to data transformation and analysis to complex workflows, to name just a few. These systems may process data at various security levels, but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system, even when processing data at a lower security level, and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state of the art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  4. Computational screening of doped αMnO2 catalysts for the oxygen evolution reaction

    DEFF Research Database (Denmark)

    Tripkovic, Vladimir; Hansen, Heine Anton; Vegge, Tejs

    2018-01-01

    Minimizing energy and materials costs for driving the oxygen evolution reaction (OER) is paramount for the commercialization of water electrolysis cells and rechargeable metal-air batteries. Using density functional theory calculations, we analyze the structural stability, catalytic activity...

  5. Hydrolysis reaction of 2,4-dichlorophenoxyacetic acid. A kinetic and computational study

    Science.gov (United States)

    Romero, Jorge Marcelo; Jorge, Nelly Lidia; Grand, André; Hernández-Laguna, Alfonso

    2015-10-01

    The degradation of 2,4-dichlorophenoxyacetic acid in aqueous solution is a hydrolysis reaction. Two products are identified: 2,4-dichlorophenol and glycolic acid. The reaction is investigated as a function of pH and temperature; it follows first-order kinetics and is pH-dependent. The reaction is modeled in the gas phase, where a proton catalyses the reaction. Critical points of the PES are calculated at the B3LYP/6-311++G(3df,2p) and aug-cc-pvqz//6-311++G(3df,2p) levels, plus ZPE at the 6-311++G(3df,2p) level. The activation barrier is 21.2 kcal/mol. The theoretical results agree with the experimental ones. A second mechanism, involving a Cl2Ph–O–CH2–COOH···H2O complex, is found, but with a rate-limiting barrier of 38.4 kcal/mol.
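
    A quick consistency check on a 21.2 kcal/mol barrier is the Eyring equation, k = (k_B*T/h) * exp(-dG/(R*T)). The sketch below evaluates it at 298 K, under the simplifying assumption (made here for illustration only) that the computed barrier can be treated as the free energy of activation.

        import math

        kB = 1.380649e-23      # J/K
        h  = 6.62607015e-34    # J s
        R  = 1.987204e-3       # kcal mol^-1 K^-1
        dG, T = 21.2, 298.15   # kcal/mol, K

        k = (kB * T / h) * math.exp(-dG / (R * T))   # s^-1
        print(f"k(298 K) ~ {k:.2e} s^-1, half-life ~ {math.log(2) / k / 60:.0f} min")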

  6. Molecular simulations in electrochemistry : Electron and proton transfer reactions mediated by flavins in different molecular environments

    NARCIS (Netherlands)

    Kılıç, M.

    2014-01-01

    The aim of this thesis is to address specific questions about the role of solvent reorganization in electron transfer in different environments, as well as about the calculation of acidity constants. In particular, we focus on molecular simulation of flavin in water and different protein (BLUF and

  7. Invited Reaction: Influences of Formal Learning, Personal Learning Orientation, and Supportive Learning Environment on Informal Learning

    Science.gov (United States)

    Cseh, Maria; Manikoth, Nisha N.

    2011-01-01

    As the authors of the preceding article (Choi and Jacobs, 2011) have noted, the workplace learning literature shows evidence of the complementary and integrated nature of formal and informal learning in the development of employee competencies. The importance of supportive learning environments in the workplace and of employees' personal learning…

  8. Students' Reaction to WebCT: Implications for Designing On-Line Learning Environments

    Science.gov (United States)

    Osman, Mohamed Eltahir

    2005-01-01

    There is a growing number of web-based and web-assisted course development tools and products that can be used to create on-line learning environments. The utility of these products, however, varies greatly depending on their feasibility, prerequisite infrastructure, technical features, interface, and course development and management tools. WebCT…

  9. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector Collaborations of the HERA Experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include Distributed Mass Storage Management Systems in a Distributed and Heterogeneous Computing Environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology, with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA Experiments at DESY, including Symmetric Multiprocessor Machines (84 Processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focussing on the High Performance Interconnect technology, details will be provided about the HIPPI-based 'Backplane' configured around a 20 Gigabit/s Multi Media Router and the performance and efficiency of the related computer interfaces

  10. How computational methods and relativistic effects influence the study of chemical reactions involving Ru-NO complexes?

    Science.gov (United States)

    Orenha, Renato Pereira; Santiago, Régis Tadeu; Haiduke, Roberto Luiz Andrade; Galembeck, Sérgio Emanuel

    2017-05-05

    Two treatments of relativistic effects, namely effective core potentials (ECP) and all-electron scalar relativistic effects (DKH2), are used to obtain geometries and chemical reaction energies for a series of ruthenium complexes in B3LYP/def2-TZVP calculations. Specifically, the reaction energies of reduction (A-F), isomerization (G-I), and the negative trans influence of Cl− relative to NH3 (J-L) are considered. The ECP and DKH2 approaches provided geometric parameters close to experimental data and the same ordering for the energy changes of reactions A-L. From geometries optimized with ECP, the electronic energies were also determined by means of the same ECP and basis set combined with the computational methods MP2, M06, BP86 and its derivatives, as well as B2PLYP, LC-wPBE, and CCSD(T) (the reference method). For reactions A-I, B2PLYP provides the best agreement with the CCSD(T) results. Additionally, B3LYP gave the smallest error for the energies of reactions J-L. © 2017 Wiley Periodicals, Inc.

  11. All-oxide Raman-active traps for light and matter: probing redox homeostasis model reactions in aqueous environment.

    Science.gov (United States)

    Alessandri, Ivano; Depero, L E

    2014-04-09

    Core-shell colloidal crystals can act as very efficient traps for light and analytes. Here it is shown that Raman-active probes can be achieved using SiO2-TiO2 core-shell beads. These systems are successfully tested in monitoring the glutathione redox cycle at physiological concentration in an aqueous environment, without the need for any interfering enhancers. These materials represent a promising alternative to conventional, metal-based SERS probes for investigating chemical and biochemical reactions under real working conditions. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. What is Eating Ozone? Thermal Reactions between SO2 And O3: Implications for Icy Environments

    Science.gov (United States)

    Loeffler, Mark J.; Hudson, Reggie L.

    2016-01-01

    Laboratory studies are presented, showing for the first time that thermally driven reactions in solid H2O+SO2+O3 mixtures can occur below 150 K, with the main sulfur-containing product being bisulfate (HSO4(-)). Using a technique not previously applied to the low-temperature kinetics of either interstellar or solar system ice analogs, we estimate an activation energy of 32 kJ per mol for HSO4(-) formation. These results show that at the temperatures of the Jovian satellites, SO2 and O3 will react efficiently, making detection of these molecules in the same vicinity unlikely. Our results also explain why O3 has not been detected on Callisto and why the SO2 concentration on Callisto appears to be highest on that world's leading hemisphere. Furthermore, our results predict that the SO2 concentration on Ganymede will be lowest in the trailing hemisphere, where the concentration of O3 is the highest. Our work suggests that thermal reactions in ices play a much more important role in surface and sub-surface chemistry than generally appreciated, possibly explaining the low abundance of sulfur-containing molecules and the lack of ozone observed in comets and interstellar ices.

  13. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C.

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program, and specifically procrastination behavior in relation to performance, through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing the data by means of association rules, a mining technique. The association rules were obtained and then filtered through two selection criteria: (1) rules must have an accuracy over 0.8, and (2) they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: The negative impact of procrastination on learning outcomes has been observed again, but in virtual learning environments, where the practical implications and the approaches to prevention and intervention differ from class-based learning. These aspects are discussed to help resolve student difficulties at various ages. PMID:28883801
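
    The two filtering criteria amount to intersecting the rule sets mined from the two sub-samples and thresholding on accuracy. A minimal sketch of that filtering step follows; the rule representation and the example rules are hypothetical, not those mined by the authors.

        # Each rule maps (antecedent, consequent) to its accuracy; data are illustrative.
        sample_a = {("late_submission", "low_grade"): 0.86,
                    ("few_logins", "low_grade"): 0.91,
                    ("night_activity", "low_grade"): 0.75}
        sample_b = {("late_submission", "low_grade"): 0.83,
                    ("few_logins", "low_grade"): 0.68}

        # Keep rules with accuracy over 0.8 that are present in BOTH sub-samples.
        kept = [rule for rule, acc in sample_a.items()
                if acc > 0.8 and sample_b.get(rule, 0.0) > 0.8]
        print(kept)   # -> [('late_submission', 'low_grade')]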

  14. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle.

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program, and specifically procrastination behavior in relation to performance, through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing the data by means of association rules, a mining technique. The association rules were obtained and then filtered through two selection criteria: (1) rules must have an accuracy over 0.8, and (2) they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: The negative impact of procrastination on learning outcomes has been observed again, but in virtual learning environments, where the practical implications and the approaches to prevention and intervention differ from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.

  15. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Directory of Open Access Journals (Sweden)

    Rebeca Cerezo

    2017-08-01

    Full Text Available Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program, and specifically procrastination behavior in relation to performance, through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing the data by means of association rules, a mining technique. The association rules were obtained and then filtered through two selection criteria: (1) rules must have an accuracy over 0.8, and (2) they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: The negative impact of procrastination on learning outcomes has been observed again, but in virtual learning environments, where the practical implications and the approaches to prevention and intervention differ from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.

  16. Hydride Transfer versus Deprotonation Kinetics in the Isobutane–Propene Alkylation Reaction: A Computational Study

    Science.gov (United States)

    2017-01-01

    The alkylation of isobutane with light alkenes plays an essential role in modern petrochemical processes for the production of high-octane gasoline. In this study we have employed periodic DFT calculations combined with microkinetic simulations to investigate the complex reaction mechanism of isobutane–propene alkylation catalyzed by zeolitic solid acids. Particular emphasis was given to addressing the selectivity of alkylate formation versus alkene formation, which requires a high rate of hydride transfer in comparison to the competitive oligomerization and deprotonation reactions that result in catalyst deactivation. Our calculations reveal that hydride transfer from isobutane to a carbenium ion occurs via a concerted C–C bond formation between a tert-butyl fragment and an additional olefin, or via deprotonation of the tert-butyl fragment to generate isobutene. A combination of a high isobutane concentration and a low propene concentration at the reaction center favors selective alkylation. The key reaction step that has to be suppressed to increase the catalyst lifetime is the deprotonation of carbenium intermediates that are part of the hydride transfer reaction cycle. PMID:29226012
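
    The selectivity argument can be made concrete with a toy rate competition for the carbenium intermediate: hydride transfer from isobutane (giving alkylate) competes with addition to propene (giving oligomers). The sketch below computes the resulting selectivity as the feed ratio changes; the rate constants are hypothetical placeholders, not values from the microkinetic model.

        # Toy competition at a carbenium ion; rate constants are illustrative.
        k_ht, k_olig = 1.0, 50.0    # arbitrary units

        def alkylate_selectivity(c_isobutane, c_propene):
            r_ht = k_ht * c_isobutane       # hydride transfer -> alkylate
            r_olig = k_olig * c_propene     # olefin addition -> oligomers
            return r_ht / (r_ht + r_olig)

        for ratio in (2, 10, 100):          # isobutane : propene at the reaction center
            print(f"ratio {ratio:3d}:1 -> selectivity {alkylate_selectivity(ratio, 1.0):.3f}")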

  17. Hydride Transfer versus Deprotonation Kinetics in the Isobutane-Propene Alkylation Reaction: A Computational Study.

    Science.gov (United States)

    Liu, Chong; van Santen, Rutger A; Poursaeidesfahani, Ali; Vlugt, Thijs J H; Pidko, Evgeny A; Hensen, Emiel J M

    2017-12-01

    The alkylation of isobutane with light alkenes plays an essential role in modern petrochemical processes for the production of high-octane gasoline. In this study we have employed periodic DFT calculations combined with microkinetic simulations to investigate the complex reaction mechanism of isobutane-propene alkylation catalyzed by zeolitic solid acids. Particular emphasis was given to addressing the selectivity of alkylate formation versus alkene formation, which requires a high rate of hydride transfer in comparison to the competitive oligomerization and deprotonation reactions that result in catalyst deactivation. Our calculations reveal that hydride transfer from isobutane to a carbenium ion occurs via a concerted C-C bond formation between a tert-butyl fragment and an additional olefin, or via deprotonation of the tert-butyl fragment to generate isobutene. A combination of a high isobutane concentration and a low propene concentration at the reaction center favors selective alkylation. The key reaction step that has to be suppressed to increase the catalyst lifetime is the deprotonation of carbenium intermediates that are part of the hydride transfer reaction cycle.

  18. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the distributed computing environment based on DIRAC for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources made available by their collaborations. In order to minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This makes DIRAC a service for the community of those experiments. To support multi-VO operation, the system architecture of BESDIRAC was adjusted for scalability. The VOMS and DIRAC servers were reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system is designed for users' massive job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered, to ease system administration and the accounting of VO-related resource usage. (paper)

  19. The Effect of the Equatorial Environment on Oxo-Group Silylation of the Uranyl Dication: A Computational Study

    International Nuclear Information System (INIS)

    Yahia, A.; Maron, L.; Yahia, A.; Arnold, P.L.; Love, J.B.

    2010-01-01

    A theoretical investigation of the reductive oxo-group silylation reaction of the uranyl dication held in a Pacman macrocyclic environment has been carried out. The effect of the modeling of the Pacman ligand on the reaction profiles is found to be important, with the dipotassiation of a single oxo group identified as a key component in promoting the reaction between the Si-X and uranium-oxo bonds. This reductive silylation reaction is also proposed to occur in an aqueous environment but was found not to operate on bare ions; in this latter case, substitution of a ligand in the equatorial plane was the most likely reaction. These results demonstrate the importance of the presence but not the identity of the equatorial ligands upon the silylation of the uranyl U-O bond. (authors)

  20. Reactions and Surface Transformations of a Bone-Bioactive Material in a Simulated Microgravity Environment

    Science.gov (United States)

    Radin, S.; Ducheyne, P.; Ayyaswamy, P. S.

    1999-01-01

    A comprehensive program to investigate the expeditious in vitro formation of three-dimensional bone-like tissue is currently underway at the University of Pennsylvania. The study reported here forms a part of that program. Three-dimensional bone-like tissue structures may be grown under the simulated microgravity conditions of NASA-designed Rotating Wall Bioreactor Vessels (RWV's). Such tissue growth will have wide clinical applications. In addition, an understanding of the fundamental changes that occur to bone cells under simulated microgravity would yield important information that will help in preventing or minimizing astronaut bone loss, a major health issue with travel or stay in space over long periods of time. The growth of three-dimensional bone-like tissue structures in RWV's is facilitated by the use of microcarriers, which provide structural support. If the microcarrier material additionally promotes bone cell growth, then it is particularly advantageous to employ such microcarriers. We have found that reactive, bone-bioactive glass (BBG) is an attractive candidate for use as microcarrier material. Specifically, it has been found that BBG containing Ca- and P-oxides upregulates osteoprogenitor cells to osteoblasts. This effect on cells is preceded by BBG reactions in solution which result in the formation of a Ca-P surface layer. This surface further transforms to a bone-like mineral (i.e., carbonated crystalline hydroxyapatite (c-HA)). At normal gravity, time-dependent, immersion-induced BBG reactions and transformations are greatly affected both by variations in the composition of the milieu in which the glass is immersed and by the immersion conditions. However, the nature of BBG reactions and phase transformations under the simulated microgravity conditions of RWV's is unknown, and must be understood in order to successfully use BBG as microcarrier material in RWV's. In this paper, we report some of our recent findings in this regard using

  1. Pulsed fusion space propulsion: Computational Magneto-Hydro Dynamics of a multi-coil parabolic reaction chamber

    Science.gov (United States)

    Romanelli, Gherardo; Mignone, Andrea; Cervone, Angelo

    2017-10-01

    Pulsed fusion propulsion might finally revolutionise manned space exploration by providing affordable and relatively fast access to interplanetary destinations. However, such systems are still in an early development phase, and one of the key areas requiring further investigation is the operation of the magnetic nozzle, the device meant to exploit the fusion energy and generate thrust. One of the latest pulsed fusion magnetic nozzle designs is the so-called multi-coil parabolic reaction chamber: the reaction is ignited at the focus of an open parabolic chamber, enclosed by a series of coaxial superconducting coils that apply a magnetic field. The field, besides confining the reaction and preventing any contact between the hot fusion plasma and the chamber structure, is also meant to reflect the explosion and push the plasma out of the rocket. Reflection is attained thanks to electric currents induced in conductive skin layers that cover each of the coils; the change in plasma axial momentum generates thrust in reaction. This working principle has yet to be extensively verified, and computational Magneto-Hydro Dynamics (MHD) is a viable option for doing so. This work is one of the first detailed ideal-MHD analyses of a multi-coil parabolic reaction chamber of this kind and was completed employing PLUTO, a freely distributed computational code developed at the Physics Department of the University of Turin. The results are thus a preliminary verification of the chamber's performance. Nonetheless, plasma leakage through the chamber structure has been highlighted. Therefore, further investigations are required to validate the chamber design. Implementing a more accurate physical model (e.g. Hall-MHD or relativistic-MHD) is thus mandatory, and PLUTO has the capabilities to achieve that.

  2. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are “archiveable”, transferable and easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the cost of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.

  3. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    International Nuclear Information System (INIS)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach
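
    The core bookkeeping described above (a first-year committed dose plus a dose integrated over a selected number of years, with radioactive decay of the deposited activity) can be illustrated with a small numeric integration. This is a sketch only; the decay constant and unit dose rate below are hypothetical, not values from the PABLM data libraries.

        import math

        half_life_y = 30.0                   # years, hypothetical radionuclide
        lam = math.log(2) / half_life_y      # decay constant, 1/y
        dose_rate_0 = 0.12                   # mrem/y per unit deposition (illustrative)

        def accumulated_dose(years, steps_per_year=365):
            dt = 1.0 / steps_per_year
            total = 0.0
            for step in range(int(years * steps_per_year)):
                t = step * dt
                total += dose_rate_0 * math.exp(-lam * t) * dt   # decay-corrected increment
            return total

        print(f"first-year dose      = {accumulated_dose(1):.4f} mrem")
        print(f"50-year accumulation = {accumulated_dose(50):.2f} mrem")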

  4. A multi target approach to control chemical reactions in their inhomogeneous solvent environment

    International Nuclear Information System (INIS)

    Keefer, Daniel; Thallmair, Sebastian; Zauleck, Julius P P; Vivie-Riedle, Regina de

    2015-01-01

    Shaped laser pulses offer a powerful tool to manipulate molecular quantum systems. Their application to chemical reactions in solution is a promising concept to redesign chemical synthesis. Along this road, theoretical developments to include the solvent surrounding are necessary. An appropriate theoretical treatment is helpful to understand the underlying mechanisms. In our approach we simulate the solvent by randomly selected snapshots from molecular dynamics trajectories. We use multi target optimal control theory to optimize pulses for the various arrangements of explicit solvent molecules simultaneously. This constitutes a major challenge for the control algorithm, as the solvent configurations introduce a large inhomogeneity to the potential surfaces. We investigate how the algorithm handles the new challenges and how well the controllability of the system is preserved with increasing complexity. Additionally, we introduce a way to statistically estimate the efficiency of the optimized laser pulses in the complete thermodynamical ensemble. (paper)

  5. Subseafloor seawater-basalt-microbe reactions: Continuous sampling of borehole fluids in a ridge flank environment

    Science.gov (United States)

    Wheat, C. Geoffrey; Jannasch, Hans W.; Fisher, Andrew T.; Becker, Keir; Sharkey, Jessica; Hulme, Samuel

    2010-07-01

    Integrated Ocean Drilling Program (IODP) Hole 1301A was drilled, cased, and instrumented with a long-term, subseafloor observatory (CORK) on the eastern flank of the Juan de Fuca Ridge in summer 2004. This borehole is located 1 km south of ODP Hole 1026B and 5 km north of Baby Bare outcrop. Hole 1301A penetrates 262 m of sediment and 108 m of the uppermost 3.5 Ma basaltic basement in an area of warm (64°C) hydrothermal circulation. The borehole was instrumented, and those instruments were recovered 4 years later. Here we report chemical data from two continuous fluid samplers (OsmoSamplers) and temperature recording tools that monitored changes in the state of borehole (formation) fluids. These changes document the effects of drilling, fluid overpressure and flow, seawater-basalt interactions, and microbial metabolic activity. Initially, bottom seawater flowed into the borehole through a leak between concentric CORK casing strings. Eventually, the direction of flow reversed, and warm, altered formation fluid flowed into the borehole and discharged at the seafloor. This reversal occurred during 1 week in September 2007, 3 years after drilling operations ceased. The composition of the formation fluid around Hole 1301A generally lies within bounds defined by springs on Baby Bare outcrop (to the south) and fluids that discharged from Hole 1026B (to the north); deviations likely result from reactions with drilling products. Simple conservative mixing of two end-member fluids reveals reactions occurring within the crust, including nitrate reduction, presumably by denitrifying microbes. The observed changes in borehole fluid composition provide the foundation for a conceptual model of chemical and microbial change during recharge of a warm ridge-flank hydrothermal system. This model can be tested through future scientific ocean drilling experiments.

  6. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report was aimed at structuring the design of the architecture and at measuring the performance of a parallel computing environment that uses Monte Carlo simulation for particle therapy planning on a high performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28-fold speedup over a single-thread architecture, combined with improved stability. A study of methods for optimizing the system operations also indicated lower cost. (author)

  7. Computational Methods to Predict the Regioselectivity of Electrophilic Aromatic Substitution Reactions of Heteroaromatic Systems

    DEFF Research Database (Denmark)

    Kruszyk, Monika; Jessing, Mikkel; Kristensen, Jesper L

    2016-01-01

    The validity of calculated NMR shifts to predict the outcome of electrophilic aromatic substitution reactions on different heterocyclic compounds has been examined. Based on an analysis of >130 literature examples it was found that the lowest calculated 13C and/or 1H chemical shift of a heterocycle...... correlates qualitatively with the regiochemical outcome of halogenation reactions in >80% of the investigated cases. In the remaining cases, the site of electrophilic aromatic substitution can be explained by the calculated HOMO orbitals obtained using density functional theory. Using a combination...

  8. Attitudes and gender differences of high school seniors within one-to-one computing environments in South Dakota

    Science.gov (United States)

    Nelson, Mathew

    In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to that gap. This study utilized a survey to investigate gender differences in computing self-efficacy and computer usage, as well as the environmental factors of exposure, personal interests, and parental influence, among high school students within a one-to-one computing environment in South Dakota. The population who completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The data from the survey were analyzed using descriptive and inferential statistics for the determined variables. From the review of literature and the data analysis, several conclusions were drawn. Among them are that, overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females utilized computers similarly, but males spent more time using their computers to play online games. Early exposure to computers, or the age at which the student was first exposed to a computer, and the number of computers present in the home (computer ownership) impacted computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study found that both mothers and fathers encouraged their male children more than their female children to work with computing and to pursue careers in computer science fields.

  9. Scheduling Method of Data-Intensive Applications in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Xiong Fu

    2015-01-01

    Full Text Available Virtualization in cloud computing improves the utilization of resources and energy, and a cloud user can deploy his/her own applications and related data on a pay-as-you-go basis. The communications between an application and a data storage node, as well as within the application, have a great impact on the execution efficiency of the application. The locations of the subtasks of an application and of the data transferred between those subtasks are the main sources of communication delay, which can affect the completion time of the application. In this paper, we take into account the data transmission time and the communications between subtasks and propose a heuristic optimal virtual machine (VM) placement algorithm. Simulations demonstrate that this algorithm can reduce the completion time of user tasks and ensure the feasibility and effectiveness of the overall network performance of applications running in a cloud computing environment.
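
    A minimal version of a communication-aware placement heuristic assigns each subtask greedily to the host that minimizes its estimated transfer time, both to the data storage node and to already-placed peer subtasks. The sketch below illustrates only that idea; the paper's actual algorithm and cost model are not reproduced, and all names and numbers are hypothetical.

        # Greedy, communication-aware VM placement (illustrative only).
        # bandwidth[h1][h2]: MB/s between hosts; data_mb[t]: data a subtask pulls
        # from the storage host; comm_mb[(t1, t2)]: traffic between subtasks.
        def place(tasks, hosts, storage_host, bandwidth, data_mb, comm_mb):
            placement = {}
            for t in tasks:
                def cost(h):
                    c = data_mb[t] / bandwidth[h][storage_host]
                    for other, h2 in placement.items():
                        c += comm_mb.get((t, other), 0) / bandwidth[h][h2]
                    return c
                placement[t] = min(hosts, key=cost)
            return placement

        bw = {"h1": {"h1": 1000, "h2": 100, "s": 50},
              "h2": {"h1": 100, "h2": 1000, "s": 500}}
        print(place(["t1", "t2"], ["h1", "h2"], "s", bw,
                    {"t1": 800, "t2": 100}, {("t2", "t1"): 400}))
        # -> {'t1': 'h2', 't2': 'h2'}: both land near the storage node.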

  10. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  11. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved. PMID:24078906

  12. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.

  13. Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments

    Directory of Open Access Journals (Sweden)

    Jose M. Moya

    2012-08-01

    Full Text Available Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting the highest economic and environmental impact, due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure as well as the total execution time.

  14. Ubiquitous green computing techniques for high demand applications in Smart environments.

    Science.gov (United States)

    Zapater, Marina; Sanchez, Cesar; Ayala, Jose L; Moya, Jose M; Risco-Martín, José L

    2012-01-01

    Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting the highest economic and environmental impact, due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure as well as the total execution time.

  15. Integration of a browser based operator manual in the system environment of a process computer system

    International Nuclear Information System (INIS)

    Weber, Andreas; Erfle, Robert; Feinkohl, Dirk

    2012-01-01

    The integration of a browser-based operator manual in the system environment of a process computer system optimizes the operating procedure in the control room and enhances safety through faster and error-free access to the manual contents. Several requirements by the authorities have to be fulfilled: the operating manual has to be available as a hard copy, the format has to be true to the original, protection against manipulation has to be provided, the manual content of the browser-based version and the hard copy have to be identical, and the display presentation has to be consistent with ergonomic principles. The integration of the on-line manual in the surveillance process computer system provides the operator with the comments relevant to the surveillance signal. The described integration of the on-line manual thus optimizes the operator's everyday job with respect to ergonomics and safety (human performance).

  16. Aeroflex Single Board Computers and Instrument Circuit Cards for Nuclear Environments Measuring and Monitoring

    International Nuclear Information System (INIS)

    Stratton, Sam; Stevenson, Dave; Magnifico, Mateo

    2013-06-01

    A Single Board Computer (SBC) is an entire computer, including all of the required components and I/O interfaces, built on a single circuit board. SBC's are used across numerous industrial, military and space flight applications. In the case of military and space implementations, SBC's employ advanced high reliability processors designed for rugged thermal, mechanical and even radiation environments. These processors, in turn, rely on equally advanced support components such as memory, interface, and digital logic. When all of these components are put together on a printed circuit card, the result is a highly reliable Single Board Computer that can perform a wide variety of tasks in very harsh environments. In the area of instrumentation, peripheral circuit cards can be developed that directly interface to the SBC and to various radiation measuring devices and systems. Designers use signal conditioning and high reliability Analog to Digital Converters (ADC's) to convert the measuring device signals to digital data suitable for a microprocessor. The data can then be sent to the SBC via high speed communication protocols such as Ethernet or a similar type of serial bus. Data received by the SBC can then be manipulated and processed into a form readily available to users. Recent events are causing some in the NPP industry to consider devices and systems with better radiation and temperature performance capability. Systems designed for space applications are built for the harsh environment of space, which under certain conditions would be similar to what the electronics will see during a severe nuclear reactor event. The NPP industry should be considering higher reliability electronics for certain critical applications. (authors)

  17. One–pot synthesis and electrochemical properties of polyaniline nanofibers through simply tuning acid–base environment of reaction medium

    International Nuclear Information System (INIS)

    Li, Tao; Zhou, Yi; Liang, Banglei; Jin, Dandan; Liu, Na; Qin, Zongyi; Zhu, Meifang

    2017-01-01

    Highlights: •Presenting a facile one–pot approach to prepare polyaniline nanofibers through simply tuning the acid–base environment of the reaction medium. •Determining the role that aniline oligomers play in the formation of polyaniline nanofibers. •Demonstrating the feasibility of polyaniline nanofibers as high–performance electrode materials for supercapacitors. -- Abstract: A facile and efficient one–pot approach is presented for preparing polyaniline (PANi) nanofibers through simply tuning the acid–base environment of the reaction medium, without the assistance of templates or the use of organic solvents; aniline oligomers formed in the alkaline solution are used as “seeds” for the oriented growth of PANi chains under acidic conditions. The as–prepared PANi nanofibers were investigated by field–emission scanning electron microscopy, ultraviolet–visible spectroscopy, Fourier transform infrared spectroscopy and X–ray diffraction. Furthermore, the electrochemical properties were evaluated by cyclic voltammetry, galvanostatic charge–discharge tests, and electrochemical impedance spectroscopy. Particular attention was paid to the influence of the aniline concentrations in the alkaline and acidic reaction media on the morphology, microstructure and properties of the PANi nanofibers. It was found that the aniline concentration in the alkaline medium has a stronger impact on the electrical and electrochemical properties of the final products, whereas their morphologies depend mainly on the aniline concentration in the acidic solution. Moreover, PANi nanofibers prepared at aniline concentrations of 48 mM in the alkaline medium and 0.2 M in the acidic medium exhibit the largest specific capacitance, 857.2 F g⁻¹ at a scan rate of 5 mV s⁻¹, with capacitance retention of 63.8% after 500 cycles. It is demonstrated that such a one–pot approach can provide a low-cost and environmentally friendly route to fabricate PANi nanofibers in fully aqueous solution as high

  18. Computing molecular fluctuations in biochemical reaction systems based on a mechanistic, statistical theory of irreversible processes.

    Science.gov (United States)

    Kulasiri, Don

    2011-01-01

    We discuss the quantification of molecular fluctuations in biochemical reaction systems within the context of intracellular processes associated with gene expression. We use the molecular reactions pertaining to circadian rhythms to develop models of molecular fluctuations in this chapter. There are a significant number of studies on stochastic fluctuations in intracellular genetic regulatory networks based on single-cell-level experiments. In order to understand the fluctuations associated with gene expression in circadian rhythm networks, it is important to model the interactions of transcription factors with the E-boxes in the promoter regions of some of the genes. The pertinent aspects of a near-equilibrium theory that integrates the thermodynamic and particle-dynamic characteristics of intracellular molecular fluctuations are discussed, and the theory is extended using the theory of stochastic differential equations. We then model the fluctuations associated with the promoter regions in a general mathematical setting. We implemented the widely used Gillespie algorithm, which simulates stochasticity in biochemical networks, for each of the motifs. Both the theory and the Gillespie simulations gave the same results in terms of the time evolution of the means and variances of molecular numbers. As biochemical reactions occur far from equilibrium (hence the use of the Gillespie algorithm), these results suggest that the near-equilibrium theory should be a good approximation for some of the biochemical reactions. © 2011 Elsevier Inc. All rights reserved.
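
    For readers unfamiliar with it, the Gillespie stochastic simulation algorithm draws an exponential waiting time from the total propensity and then selects a reaction with probability proportional to its share of that propensity. The minimal sketch below applies it to a generic birth-death gene-expression motif; this is an illustrative stand-in, with made-up rate constants, not one of the chapter's circadian-rhythm motifs.

        import math, random

        # Birth-death motif: mRNA produced at rate k, degraded at rate g per molecule.
        def gillespie_birth_death(k=2.0, g=0.1, m0=0, t_end=100.0, seed=1):
            random.seed(seed)
            t, m = 0.0, m0
            trajectory = [(t, m)]
            while t < t_end:
                a_prod, a_deg = k, g * m          # reaction propensities
                a_total = a_prod + a_deg
                if a_total == 0.0:
                    break
                t += -math.log(1.0 - random.random()) / a_total   # waiting time
                m += 1 if random.random() * a_total < a_prod else -1
                trajectory.append((t, m))
            return trajectory

        # Copy number should fluctuate around the steady-state mean k/g = 20.
        print(gillespie_birth_death()[-1])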

  19. Hydride transfer versus deprotonation kinetics in the isobutane−propene alkylation reaction : A computational study

    NARCIS (Netherlands)

    Liu, Chong; Van Santen, Rutger A.; Poursaeidesfahani, A.; Vlugt, T.J.H.; Pidko, Evgeny A.; Hensen, Emiel J M

    2017-01-01

    The alkylation of isobutane with light alkenes plays an essential role in modern petrochemical processes for the production of high-octane gasoline. In this study we have employed periodic DFT calculations combined with microkinetic simulations to investigate the complex reaction mechanism of

  20. Young children reorient by computing layout geometry, not by matching images of the environment.

    Science.gov (United States)

    Lee, Sang Ah; Spelke, Elizabeth S

    2011-02-01

    Disoriented animals from ants to humans reorient in accord with the shape of the surrounding surface layout: a behavioral pattern long taken as evidence for sensitivity to layout geometry. Recent computational models suggest, however, that the reorientation process may not depend on geometrical analyses but instead on the matching of brightness contours in 2D images of the environment. Here we test this suggestion by investigating young children's reorientation in enclosed environments. Children reoriented by extremely subtle geometric properties of the 3D layout: bumps and ridges that protruded only slightly off the floor, producing edges with low contrast. Moreover, children failed to reorient by prominent brightness contours in continuous layouts with no distinctive 3D structure. The findings provide evidence that geometric layout representations support children's reorientation.

  1. Educational Game Design. Bridging the gap between computer based learning and experimental learning environments

    DEFF Research Database (Denmark)

    Andersen, Kristine

    2007-01-01

    Considering the rapidly growing amount of digital educational materials, only few of them bridge the gap between experimental learning environments and computer-based learning environments (Gardner, 1991). Observations from two cases in primary school and lower secondary school in the subject...... with a prototype of a MOO storyline. The aim of the MOO storyline is to challenge the potential of dialogue, user involvement, and learning responsibility and to use children's natural curiosity and motivation for game playing, especially when digital games involve other children. The paper proposes a model......, based on the narrative approach for experimental learning subjects, relying on ideas from Csikszentmihalyi's notion of flow (Csikszentmihalyi, 1991), storyline-pedagogy (Meldgaard, 1994) and ideas from Howard Gardner (Gardner, 1991). The model forms the basis for educational games to be used in home...

  2. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
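
    REMS itself is built on the commercial IGRIP/Deneb software, so the following is only an illustrative sketch of the accumulation step the record describes: integrating a dose rate over a timestamped worker trajectory. The point-source inverse-square model, the shielding transmission factor, and all names and values are hypothetical.

```python
import math

def dose_rate(source_strength, distance_m, shielding_factor):
    """Illustrative point-source dose rate: inverse-square falloff
    attenuated by a dimensionless shielding transmission factor."""
    return source_strength * shielding_factor / (4.0 * math.pi * distance_m ** 2)

def accumulate_dose(trajectory, source_strength=5.0e-3):
    """Integrate dose along a worker trajectory sampled at fixed timesteps.
    trajectory: list of (dt_seconds, distance_m, shielding_factor)."""
    return sum(dose_rate(source_strength, d, s) * dt for dt, d, s in trajectory)

# A worker approaches the source, pauses behind shielding, then retreats.
path = [(10.0, 5.0, 1.0), (30.0, 2.0, 0.2), (10.0, 5.0, 1.0)]
print(f"accumulated dose: {accumulate_dose(path):.3e} (arbitrary units)")
```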

  3. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.
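
    The record does not describe NELS internals, but the vector-product ranking it refers to is the standard cosine similarity between term vectors; the sketch below uses a toy term-frequency weighting to rank documents against a natural language query.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Vector-product similarity between two bag-of-words term vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = {
    "d1": "shuttle thermal protection tiles",
    "d2": "ranked retrieval of library documents",
    "d3": "natural language query ranking for document retrieval",
}
query = Counter("ranked document retrieval".split())
vectors = {name: Counter(text.split()) for name, text in docs.items()}
ranked = sorted(((n, cosine(query, v)) for n, v in vectors.items()),
                key=lambda x: -x[1])
for name, score in ranked:
    print(name, round(score, 3))   # most likely relevant documents first
```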

  4. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  5. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in a cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
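
    The paper's SASOS operators are not reproduced here; the sketch below shows only the simulated-annealing ingredient it adds, applied to a candidate task-to-VM mapping with makespan as the fitness. The mapping representation, task lengths, VM capacities, and cooling schedule are assumptions for illustration.

```python
import math
import random

def makespan(mapping, task_lengths, vm_speeds):
    """Completion time of the busiest VM under a task -> VM assignment."""
    finish = [0.0] * len(vm_speeds)
    for task, vm in enumerate(mapping):
        finish[vm] += task_lengths[task] / vm_speeds[vm]
    return max(finish)

def sa_accept(current, candidate, temperature, rng):
    """Metropolis rule: always accept improvements, and accept worse
    solutions with probability exp(-delta/T), the local-refinement
    (exploitation) step SA contributes to SOS."""
    if candidate <= current:
        return True
    return rng.random() < math.exp(-(candidate - current) / temperature)

rng = random.Random(7)
lengths = [400, 250, 900, 120, 640, 380]   # task lengths (e.g., million instructions)
speeds = [1.0, 2.0, 1.5]                   # relative VM capacities
mapping = [rng.randrange(len(speeds)) for _ in lengths]
cost, temp = makespan(mapping, lengths, speeds), 100.0
for _ in range(2000):                      # local refinement around a solution
    cand = mapping[:]
    cand[rng.randrange(len(cand))] = rng.randrange(len(speeds))
    c = makespan(cand, lengths, speeds)
    if sa_accept(cost, c, temp, rng):
        mapping, cost = cand, c
    temp *= 0.995                          # geometric cooling
print("final mapping:", mapping, " makespan:", round(cost, 1))
```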

  6. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

    Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed on a pay-as-you-go basis. Task scheduling is one of the significant research challenges in cloud computing environments. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic features of cloud resources make optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in a cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and quality of solution of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Results of simulation showed that hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  7. Predictor - Predictive Reaction Design via Informatics, Computation and Theories of Reactivity

    Science.gov (United States)

    2017-10-10

    RPPR Final Report as of 24-Nov-2017. Principal Investigator: Dean J. Tantillo (djtantillo@ucdavis.edu). Distribution Statement: Approved for public release. ...meaningful queries is finding a balance between the amount of detail in the metadata and computed results stored in the database vs. writing data...

  8. A simple interface to computational fluid dynamics programs for building environment simulations

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, III, C R; Chen, Q [Massachusetts Institute of Technology, Cambridge, MA (United States)

    2000-07-01

    It is becoming a popular practice for architects and HVAC engineers to simulate airflow in and around buildings by computational fluid dynamics (CFD) methods in order to predict indoor and outdoor environment. However, many CFD programs are crippled by a historically poor and inefficient user interface system, particularly for users with little training in numerical simulation. This investigation endeavors to create a simplified CFD interface (SCI) that allows architects and buildings engineers to use CFD without excessive training. The SCI can be easily integrated into new CFD programs. (author)

  9. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods, allowing computer modelling of huge machinery in industrial spaces. The programs in question are Odeon 3.0 Industrial and Odeon 3.0 Combined, which allow the modelling of point sources, surface sources and line...... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with ones measured in real rooms. However, when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point-like, as they can be distributed over a large space...

  10. Addition reaction of adamantylideneadamantane with Br2 and 2Br2: a computational study.

    Science.gov (United States)

    Islam, Shahidul M; Poirier, Raymond A

    2008-01-10

    Ab initio calculations were carried out for the reaction of adamantylideneadamantane (Ad=Ad) with Br2 and 2Br2. Geometries of the reactants, transition states, intermediates, and products were optimized at HF and B3LYP levels of theory using the 6-31G(d) basis set. Energies were also obtained using single point calculations at the MP2/6-31G(d)//HF/6-31G(d), MP2/6-31G(d)//B3LYP/6-31G(d), and B3LYP/6-31+G(d)//B3LYP/6-31G(d) levels of theory. Intrinsic reaction coordinate (IRC) calculations were performed to characterize the transition states on the potential energy surface. Only one pathway was found for the reaction of Ad=Ad with one Br2 producing a bromonium/bromide ion pair. Three mechanisms for the reaction of Ad=Ad with 2Br2 were found, leading to three different structural forms of the bromonium/Br3- ion pair. Activation energies, free energies, and enthalpies of activation along with the relative stability of products for each reaction pathway were calculated. The reaction of Ad=Ad with 2Br2 was strongly favored over the reaction with only one Br2. According to B3LYP/6-31G(d) and single point calculations at MP2, the most stable bromonium/Br3- ion pair would form spontaneously. The most stable of the three bromonium/Br3- ion pairs has a structure very similar to the observed X-ray structure. Free energies of activation and relative stabilities of reactants and products in CCl4 and CH2ClCH2Cl were also calculated with PCM using the united atom (UA0) cavity model and, in general, results similar to the gas phase were obtained. An optimized structure for the trans-1,2-dibromo product was also found at all levels of theory both in gas phase and in solution, but no transition state leading to the trans-1,2-dibromo product was obtained.

  11. Effect of micellar environment on Marcus correlation curves for photoinduced bimolecular electron transfer reactions

    Science.gov (United States)

    Kumbhakar, Manoj; Nath, Sukhendu; Mukherjee, Tulsi; Pal, Haridas

    2005-07-01

    Photoinduced electron transfer (ET) between coumarin dyes and aromatic amine has been investigated in two cationic micelles, namely, cetyltrimethyl ammonium bromide (CTAB) and dodecyltrimethyl ammonium bromide (DTAB), and the results have been compared with those observed earlier in sodium dodecyl sulphate (SDS) and triton-X-100 (TX-100) micelles for similar donor-acceptor pairs. Due to a reasonably high effective concentration of the amines in the micellar Stern layer, the steady-state fluorescence results show significant static quenching. In the time-resolved (TR) measurements with subnanosecond time resolution, contribution from static quenching is avoided. Correlations of the dynamic quenching constants (kqTR), as estimated from the TR measurements, show the typical bell-shaped curves with the free-energy changes (ΔG0) of the ET reactions, as predicted by the Marcus outersphere ET theory. Comparing present results with those obtained earlier for similar coumarin-amine systems in SDS and TX-100 micelles, it is seen that the inversion in the present micelles occurs at an exergonicity (-ΔG0>˜1.2-1.3eV) much higher than that observed in SDS and TX-100 micelles (-ΔG0>˜0.7eV), which has been rationalized based on the relative propensities of the ET and solvation rates in different micelles. In CTAB and DTAB micelles, the kqTR values are lower than the solvation rates, which result in the full contribution of the solvent reorganization energy (λs) towards the activation barrier for the ET reaction. Contrary to this, in SDS and TX-100 micelles, kqTR values are either higher or comparable with the solvation rates, causing only a partial contribution of λs in these cases. Thus, Marcus inversion in present cationic micelles is inferred to be the true inversion, whereas that in the anionic SDS and neutral TX-100 micelles are understood to be the apparent inversion, as envisaged from two-dimensional ET theory.
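
    The bell-shaped Marcus correlation discussed above follows from the classical rate expression k_ET ∝ exp[-(λ + ΔG0)²/(4λkBT)]. The sketch below evaluates it for a placeholder reorganization energy λ to show the rate peaking at -ΔG0 ≈ λ and then falling in the inverted region; the prefactor and λ are illustrative, not fitted to the micellar data.

```python
import math

KB_T = 0.0257  # eV at ~298 K

def marcus_rate(dG0_eV, lam_eV, prefactor=1.0e12):
    """Classical Marcus expression: the rate peaks when -dG0 equals the
    reorganization energy lam, then decreases in the inverted region."""
    barrier = (lam_eV + dG0_eV) ** 2 / (4.0 * lam_eV)
    return prefactor * math.exp(-barrier / KB_T)

lam = 1.2  # eV, placeholder for full solvent reorganization in a micelle
for dG0 in (-0.4, -0.8, -1.2, -1.6, -2.0):
    print(f"dG0 = {dG0:+.1f} eV  ->  k_ET = {marcus_rate(dG0, lam):.2e} s^-1")
# The printed rates rise to a maximum near -dG0 ~ lam = 1.2 eV and then
# fall off: the 'true' inversion behavior inferred for the cationic micelles.
```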

  12. A mixed-methods exploration of an environment for learning computer programming

    Directory of Open Access Journals (Sweden)

    Richard Mather

    2015-08-01

    Full Text Available A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches used for the requirements engineering of computing systems are combined with questionnaire-based feedback and skill tests. These are applied to the ‘Ceebot’ animated 3D learning environment. Video analysis with workplace observation allowed detailed inspection of problem solving and tacit behaviours. Questionnaires and knowledge tests provided broad sample coverage with insights into subject understanding and overall response to the learning environment. Although relatively low scores in programming tests seemingly contradicted the perception that Ceebot had enhanced understanding of programming, this perception was nevertheless found to be correlated with greater test performance. Video analysis corroborated findings that the learning environment and Ceebot animations were engaging and encouraged constructive collaborative behaviours. Ethnographic observations clearly captured Ceebot's value in providing visual cues for problem-solving discussions and for progress through sharing discoveries. Notably, performance in tests was most highly correlated with greater programming practice (p≤0.01. It was apparent that although students had appropriated technology for collaborative working and benefitted from visual and tacit cues provided by Ceebot, they had not necessarily deeply learned the lessons intended. The key value of the ‘mixed-methods’ approach was that ethnographic observations captured the authenticity of learning behaviours, and thereby strengthened confidence in the interpretation of questionnaire and test findings.

  13. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, accurate location estimates are required. A location can be estimated from three distances to three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, locations in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP) in a sustainable indoor computing environment. Diverse types of received signal strength indications (RSSIs) are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterize RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both an RSSI filter and a Kalman filter were respectively used for noise elimination to comparatively evaluate the performance of the latter for this specific application. The Kalman filter was found to reduce the accumulated errors by 8...
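
    A one-dimensional sketch of the two components the framework combines: a scalar Kalman filter to denoise raw RSSI readings, and the log-distance path-loss model inverted to give a distance. The reference RSSI at 1 m, the path-loss exponent, and the noise covariances are illustrative assumptions, not the paper's calibrated values.

```python
import random

def kalman_1d(measurements, q=0.05, r=4.0):
    """Scalar Kalman filter with a constant-signal model: process noise q,
    measurement noise r; used here to denoise a stream of RSSI readings."""
    x, p = measurements[0], 1.0
    out = []
    for z in measurements:
        p += q                      # predict step inflates the variance
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with measurement z
        p *= (1.0 - k)
        out.append(x)
    return out

def distance_from_rssi(rssi_dbm, rssi_at_1m=-59.0, path_loss_n=2.0):
    """Invert the log-distance path-loss model:
    RSSI(d) = RSSI(1m) - 10*n*log10(d)  =>  d = 10^((RSSI(1m) - RSSI)/(10n))."""
    return 10.0 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_n))

rng = random.Random(0)
true_rssi = -69.0                               # beacon roughly 3.2 m away
raw = [true_rssi + rng.gauss(0, 3) for _ in range(50)]
smoothed = kalman_1d(raw)
print(f"raw last estimate:  {distance_from_rssi(raw[-1]):.2f} m")
print(f"filtered estimate:  {distance_from_rssi(smoothed[-1]):.2f} m")
```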

  14. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
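
    The R-E model couples an eikonal solve for activation times to prescribed membrane kinetics; the sketch below shows only the eikonal half, approximated with Dijkstra's algorithm on grid edges as a crude stand-in for a proper fast-marching solver. The grid, spacing, and conduction velocity are illustrative.

```python
import heapq
import math

def eikonal_activation(nx, ny, velocity, source):
    """Dijkstra approximation of the eikonal equation |grad T| = 1/v:
    earliest activation time T at each node of an nx-by-ny grid."""
    h = 1.0  # grid spacing (mm)
    T = {(i, j): math.inf for i in range(nx) for j in range(ny)}
    T[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if t > T[(i, j)]:
            continue                       # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1),
                       (1, 1), (1, -1), (-1, 1), (-1, -1)):
            nb = (i + di, j + dj)
            if nb in T:
                nt = t + math.hypot(di, dj) * h / velocity
                if nt < T[nb]:
                    T[nb] = nt
                    heapq.heappush(heap, (nt, nb))
    return T

# Activation spreading at 0.6 mm/ms from a corner "pacing site".
T = eikonal_activation(20, 20, velocity=0.6, source=(0, 0))
print(f"latest activation: {max(T.values()):.1f} ms")
# In an R-E model these times would then trigger a prescribed action
# potential at each node instead of solving reaction-diffusion directly.
```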

  15. Hybrid quantum and classical methods for computing kinetic isotope effects of chemical reactions in solutions and in enzymes.

    Science.gov (United States)

    Gao, Jiali; Major, Dan T; Fan, Yao; Lin, Yen-Lin; Ma, Shuhua; Wong, Kin-Yiu

    2008-01-01

    A method for incorporating quantum mechanics into enzyme kinetics modeling is presented. Three aspects are emphasized: 1) combined quantum mechanical and molecular mechanical methods are used to represent the potential energy surface for modeling bond forming and breaking processes, 2) instantaneous normal mode analyses are used to incorporate quantum vibrational free energies to the classical potential of mean force, and 3) multidimensional tunneling methods are used to estimate quantum effects on the reaction coordinate motion. Centroid path integral simulations are described to make quantum corrections to the classical potential of mean force. In this method, the nuclear quantum vibrational and tunneling contributions are not separable. An integrated centroid path integral-free energy perturbation and umbrella sampling (PI-FEP/UM) method along with a bisection sampling procedure was summarized, which provides an accurate, easily convergent method for computing kinetic isotope effects for chemical reactions in solution and in enzymes. In the ensemble-averaged variational transition state theory with multidimensional tunneling (EA-VTST/MT), these three aspects of quantum mechanical effects can be individually treated, providing useful insights into the mechanism of enzymatic reactions. These methods are illustrated by applications to a model process in the gas phase, the decarboxylation reaction of N-methyl picolinate in water, and the proton abstraction and reprotonation process catalyzed by alanine racemase. These examples show that the incorporation of quantum mechanical effects is essential for enzyme kinetics simulations.

  16. Hybrid Quantum Mechanics/Molecular Mechanics Solvation Scheme for Computing Free Energies of Reactions at Metal-Water Interfaces.

    Science.gov (United States)

    Faheem, Muhammad; Heyden, Andreas

    2014-08-12

    We report the development of a quantum mechanics/molecular mechanics free energy perturbation (QM/MM-FEP) method for modeling chemical reactions at metal-water interfaces. This novel solvation scheme combines plane-wave density functional theory (DFT), periodic electrostatic embedded cluster method (PEECM) calculations using Gaussian-type orbitals, and classical molecular dynamics (MD) simulations to obtain a free energy description of a complex metal-water system. We derive a potential of mean force (PMF) of the reaction system within the QM/MM framework. A fixed-size, finite ensemble of MM conformations is used to permit precise evaluation of the PMF of QM coordinates and its gradient defined within this ensemble. Local conformations of adsorbed reaction moieties are optimized using sequential MD-sampling and QM-optimization steps. An approximate reaction coordinate is constructed using a number of interpolated states, and the free energy difference between adjacent states is calculated using the QM/MM-FEP method. By avoiding on-the-fly QM calculations and by circumventing the challenges associated with statistical averaging during MD sampling, a computational speedup of multiple orders of magnitude is realized. The method is systematically validated against the results of ab initio QM calculations and demonstrated for C-C cleavage in double-dehydrogenated ethylene glycol on a Pt(111) model surface.
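
    Independent of the QM/MM specifics, the free-energy difference between two adjacent interpolated states in any FEP scheme reduces to the Zwanzig exponential average over an ensemble sampled in the reference state. The sketch below applies that estimator to synthetic energy differences standing in for the QM/MM values.

```python
import math
import random

def fep_delta_A(delta_U_samples, kT=0.596):  # kT in kcal/mol at ~300 K
    """Zwanzig free energy perturbation estimator:
    dA = -kT * ln < exp(-dU/kT) >_reference."""
    n = len(delta_U_samples)
    avg = sum(math.exp(-dU / kT) for dU in delta_U_samples) / n
    return -kT * math.log(avg)

rng = random.Random(42)
# Synthetic dU = U(next state) - U(current state) over MM conformations,
# kept narrow so the exponential average converges with modest sampling.
samples = [0.8 + rng.gauss(0, 0.3) for _ in range(5000)]
dA = fep_delta_A(samples)
print(f"dA for this window: {dA:.3f} kcal/mol")
# Summing dA over all interpolated states along the approximate reaction
# coordinate yields the overall free energy profile of the reaction.
```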

  17. Dealing Collectively with Critical Incident Stress Reactions in High Risk Work Environments

    DEFF Research Database (Denmark)

    Müller-Leonhardt, Alice; Strøbæk, Pernille Solveig; Vogt, joachim

    2015-01-01

    The aim of this paper is to shift the representation of coping patterns within high risk occupations to an existential part of cultural pattern and social structure, which characterises high reliability organisations. Drawing upon the specific peer model of critical incident stress management (CISM), in which qualified operational peers support colleagues who experienced critical incident stress, the paper discusses critical incident stress management in air traffic control. Our study revealed coping patterns that co-vary with the culture that the CISM programme fostered within this specific high...... organisations. Indeed, we found that the CISM programme, once integrated within the socio-cultural patterns of this specific working environment, enhanced not only individual feelings of being supported but also organisational safety culture. Keywords: coping; safety culture; critical incident stress management

  18. VAT: a computational framework to functionally annotate variants in personal genomes within a cloud-computing environment.

    Science.gov (United States)

    Habegger, Lukas; Balasubramanian, Suganthi; Chen, David Z; Khurana, Ekta; Sboner, Andrea; Harmanci, Arif; Rozowsky, Joel; Clarke, Declan; Snyder, Michael; Gerstein, Mark

    2012-09-01

    The functional annotation of variants obtained through sequencing projects is generally assumed to be a simple intersection of genomic coordinates with genomic features. However, complexities arise for several reasons, including the differential effects of a variant on alternatively spliced transcripts, as well as the difficulty in assessing the impact of small insertions/deletions and large structural variants. Taking these factors into consideration, we developed the Variant Annotation Tool (VAT) to functionally annotate variants from multiple personal genomes at the transcript level as well as obtain summary statistics across genes and individuals. VAT also allows visualization of the effects of different variants, integrates allele frequencies and genotype data from the underlying individuals and facilitates comparative analysis between different groups of individuals. VAT can either be run through a command-line interface or as a web application. Finally, in order to enable on-demand access and to minimize unnecessary transfers of large data files, VAT can be run as a virtual machine in a cloud-computing environment. VAT is implemented in C and PHP. The VAT web service, Amazon Machine Image, source code and detailed documentation are available at vat.gersteinlab.org.
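
    VAT is implemented in C and handles many variant classes; the sketch below reduces its core transcript-level idea to the simplest case, intersecting a single-nucleotide variant's coordinate with per-transcript exon intervals, using a hypothetical two-isoform gene to show why transcript-level annotation differs from a plain gene-level intersection.

```python
def annotate(variant_pos, transcripts):
    """Report, per transcript, whether a single-nucleotide variant falls
    in an exon; a variant can be exonic in one isoform but not another."""
    hits = {}
    for name, exons in transcripts.items():
        hits[name] = any(start <= variant_pos < end for start, end in exons)
    return hits

# Hypothetical gene with two alternatively spliced transcripts.
transcripts = {
    "isoform_A": [(100, 200), (300, 400), (500, 600)],
    "isoform_B": [(100, 200), (500, 600)],   # skips the middle exon
}
for pos in (150, 350, 450):
    print(pos, annotate(pos, transcripts))
# Position 350 is exonic only in isoform_A: exactly the transcript-level
# distinction a simple coordinate-vs-gene intersection would miss.
```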

  19. Development of computer code on sodium-water reaction products transport

    International Nuclear Information System (INIS)

    Arikawa, H.; Yoshioka, N.; Suemori, M.; Nishida, K.

    1988-01-01

    The LMFBR concept eliminating the secondary sodium system has been considered one of the most promising concepts for offering cost reductions. In this reactor concept, evaluating the effects of sodium-water reaction products (SWRPs) on the reactor core during a sodium-water reaction at the primary steam generator becomes one of the major safety issues. In this study, a calculation code was developed as the first step in establishing an evaluation method for SWRP effects. The calculation code, called SPROUT, simulates SWRP transport and distribution in the primary sodium system using the system geometry, thermal-hydraulic data and sodium-water reaction conditions as input. This code principally models SWRP behavior. The paper describes the models for SWRP behaviors, including dissolution, precipitation and deposition, and presents the results and discussion of a demonstration calculation for a typical FBR plant eliminating the secondary sodium system

  20. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    .... The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  1. Hydride Transfer versus Deprotonation Kinetics in the Isobutane–Propene Alkylation Reaction: A Computational Study

    OpenAIRE

    Liu, Chong; van Santen, Rutger A.; Poursaeidesfahani, Ali; Vlugt, Thijs J. H.; Pidko, Evgeny A.; Hensen, Emiel J. M.

    2017-01-01

    The alkylation of isobutane with light alkenes plays an essential role in modern petrochemical processes for the production of high-octane gasoline. In this study we have employed periodic DFT calculations combined with microkinetic simulations to investigate the complex reaction mechanism of isobutane–propene alkylation catalyzed by zeolitic solid acids. Particular emphasis was given to addressing the selectivity of the alkylate formation versus alkene formation, which requires a high rate o...

  2. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-01-01

    In the fog computing environment, encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server does. Since fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource-constrained end users. Compared to existing schemes supporting only either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and the index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute-based encryption (CP-ABE) primitive and share the same key pair; thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource-constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of the decryption task to fog nodes, and a mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment. PMID:28629131

  3. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room”, and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to support activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report from an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent contradiction between maintaining the privacy of medical data and presenting it in a public display environment can be mitigated by the use of CLINICAL SURFACES.

  4. Characteristics of Israeli School Teachers in Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Noga Magen-Nagar

    2013-01-01

    Full Text Available The purpose of this research is to investigate whether there are differences in the level of computer literacy, the amount of implementation of ICT in teaching and learning-assessment processes, and the attitudes of teachers from computerized schools in comparison to teachers in non-computerized schools. In addition, the research investigates the characteristics of Israeli school teachers in a 21st century computer-based learning environment. A quantitative research methodology was used. The research sample included 811 elementary school teachers from the Jewish sector, of whom 402 were from the computerized school sample and 409 from the non-computerized school sample. The research findings show that teachers from the computerized school sample are more familiar with ICT, tend to use ICT more, and have a more positive attitude towards ICT than teachers in the non-computerized school sample. The main conclusion which can be drawn from this research is that positive attitudes of teachers towards ICT are not sufficient for the integration of technology to occur. Future emphasis on new teaching skills of collective Technological Pedagogical Content Knowledge is necessary to promote the implementation of optimal pedagogy in innovative environments.

  5. Development of a computational environment for the General Curvilinear Ocean Model

    International Nuclear Information System (INIS)

    Thomas, Mary P; Castillo, Jose E

    2009-01-01

    The General Curvilinear Ocean Model (GCOM) differs significantly from the traditional approach, where the use of Cartesian coordinates forces the model to simulate terrain as a series of steps. GCOM utilizes a full three-dimensional curvilinear transformation, which has been shown to have greater accuracy than similar models and to achieve results more efficiently. The GCOM model has been validated for several types of water bodies, different coastlines and bottom shapes, including the Alarcon Seamount, Southern California Coastal Region, the Valencia Lake in Venezuela, and more recently the Monterey Bay. In this paper, enhancements to the GCOM model and an overview of the computational environment (GCOM-CE) are presented. Model improvements include migration from F77 to F90; approach to a component design; and initial steps towards parallelization of the model. Through the use of the component design, new models are being incorporated including biogeochemical, pollution, and sediment transport. The computational environment is designed to allow various client interactions via secure Web applications (portal, Web services, and Web 2.0 gadgets). Features include building jobs, managing and interacting with long running jobs; managing input and output files; quick visualization of results; publishing of Web services to be used by other systems such as larger climate models. The CE is based mainly on Python tools including a grid-enabled Pylons Web application Framework for Web services, pyWSRF (python-Web Services-Resource Framework), pyGlobus based web services, SciPy, and Google code tools.

  6. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment.

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-06-17

    In the fog computing environment, encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server does. Since fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource-constrained end users. Compared to existing schemes supporting only either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and the index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute-based encryption (CP-ABE) primitive and share the same key pair; thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource-constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of the decryption task to fog nodes, and a mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment.

  7. Computer simulation of the formation of tweed and modulated structures in decomposition reactions

    International Nuclear Information System (INIS)

    Chen, S.; Morris, J.W. Jr.; Khachaturyan, A.G.

    1979-03-01

    A model of coarsening in a heterogeneous cubic alloy with cubic or tetragonal precipitates is proposed. According to the model the coarsening is controlled by the relaxation of the elastic strain energy. The computer simulation of coarsening demonstrates good agreement with electron microscopic observation of the structure and diffraction pattern

  8. A Computational Experiment of the Endo versus Exo Preference in a Diels-Alder Reaction

    Science.gov (United States)

    Rowley, Christopher N.; Woo, Tom K.

    2009-01-01

    We have developed and tested a computational laboratory that investigates an endo versus exo Diels-Alder cycloaddition. This laboratory employed density functional theory (DFT) calculations to study the cycloaddition of N-phenylmaleimide to furan. The endo and exo stereoisomers of the product were distinguished by building the two isomers in a…
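
    A worked step such a computational laboratory typically ends with: converting the DFT free-energy difference between the endo and exo transition states into a predicted kinetic product ratio through the Boltzmann relation. The 1.0 kcal/mol value below is a placeholder, not the paper's result.

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol K)

def product_ratio(ddG_kcal, T=298.15):
    """Kinetic endo:exo ratio implied by a difference in activation free
    energies: ratio = exp(-ddG/RT), with ddG = dG(endo TS) - dG(exo TS)."""
    return math.exp(-ddG_kcal / (R * T))

ddG = -1.0  # placeholder: endo transition state assumed 1.0 kcal/mol lower
r = product_ratio(ddG)
print(f"endo:exo ~ {r:.1f} : 1  ({100 * r / (1 + r):.0f}% endo)")
```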

  9. Computer simulation of the formation of ''tweed'' and modulated structures in decomposition reactions

    International Nuclear Information System (INIS)

    Chen, S.; Morris, J.W. Jr.; Khachaturyan, A.G.

    1979-01-01

    A model of coarsening in a heterogeneous cubic alloy with cubic or tetragonal precipitates is proposed. According to the model the coarsening is controlled by the relaxation of the elastic strain energy. The computer simulation of coarsening demonstrates good agreement with electron microscopic observation of the structure and diffraction pattern

  10. A computational cognitive model for political positioning and reactions in web media

    NARCIS (Netherlands)

    Fernandes de Mello Araujo, E.; Klein, Michel

    2017-01-01

    This paper presents a computational cognitive model of political positioning inspired by recent insights from neuroscience and psychology. We describe a model that takes into consideration the individual structures of the brain and the environmental influences that may interfere with how a

  11. Kinetics of the high-temperature combustion reactions of dibutylether using composite computational methods

    KAUST Repository

    Rachidi, Mariam El; Davis, Alexander C.; Sarathy, Mani

    2015-01-01

    ...The energetics of H-abstraction by OH radicals is also studied. All rates are determined computationally using the CBS-QB3 and G4 composite methods in conjunction with conventional transition state theory. The B3LYP/6-311++G(2df,2pd) method is used to optimize...

  12. Aerosol transport simulations in indoor and outdoor environments using computational fluid dynamics (CFD)

    Science.gov (United States)

    Landazuri, Andrea C.

    This dissertation focuses on aerosol transport modeling in occupational environments and mining sites in Arizona using computational fluid dynamics (CFD). The impacts of human exposure in both environments are explored with emphasis on turbulence, wind speed, wind direction and particle sizes. Final emissions simulations involved digitizing available elevation contour plots of one of the mining sites to account for realistic topographical features. The digital elevation map (DEM) of one of the sites was imported to COMSOL MULTIPHYSICS® for subsequent turbulence and particle simulations. Simulation results that include realistic topography show considerable deviations of wind direction. Inter-element correlation results using metal and metalloid size-resolved concentration data from a Micro-Orifice Uniform Deposit Impactor (MOUDI) under given wind speeds and directions provided guidance on groups of metals that coexist throughout mining activities. Groups between Fe-Mg, Cr-Fe, Al-Sc, Sc-Fe, and Mg-Al are strongly correlated for unrestricted wind directions and speeds, suggesting that the source may be of soil origin (e.g. ore and tailings); also, groups of elements where Cu is present, in the coarse fraction range, may come from mechanical-action mining activities and the saltation phenomenon. Besides, MOUDI data under low wind speeds (...) showed that Computational Fluid Dynamics can be used as a source apportionment tool to identify areas that have an effect on specific sampling points and susceptible regions under certain meteorological conditions, and these conclusions can be supported with inter-element correlation matrices and lead isotope analysis, especially since there is limited access to the mining sites. Additional results concluded that grid adaptation is a powerful tool that allows refining specific regions that require a lot of detail and therefore better resolves flow detail, and provides a higher number of locations with monotonic convergence than the...

  13. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and who must make proper judgments, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum. It can be said that information education for nurse administrators has become a pressing issue. Consequently, in this study, we conducted a survey of participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain actual conditions, such as the information environments that nurse administrators work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in the attributes of participants taking the course, such as computer anxiety.

  14. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks.

    Science.gov (United States)

    Devi, D Chitra; Uthariaraj, V Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint, and hence tasks have to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arriving jobs consist of multiple interdependent tasks, and they may execute the independent tasks in multiple VMs or in the same VM's multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling, making cloud computing more efficient and thus improving user satisfaction. The objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM), the task length of each requested job, and the interdependency of multiple tasks. The performance of the proposed algorithm is studied by comparing it with the existing methods.
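
    The record's exact algorithm is not reproduced here; the sketch below is a plain weighted placement in the same spirit, assigning each arriving nonpreemptive task to the VM with the earliest weighted finish time given its MIPS capacity. Task interdependency handling is omitted, and all numbers are illustrative.

```python
def weighted_assign(task_lengths, vm_mips):
    """Greedy weighted placement: each task (length in million instructions)
    goes to the VM whose current finish time plus this task's runtime
    (length / capacity) is smallest. Nonpreemptive: no migration afterwards."""
    finish = [0.0] * len(vm_mips)
    placement = []
    for length in task_lengths:
        best = min(range(len(vm_mips)),
                   key=lambda v: finish[v] + length / vm_mips[v])
        finish[best] += length / vm_mips[best]
        placement.append(best)
    return placement, finish

tasks = [500, 1200, 300, 900, 700, 400]      # task lengths (MI)
vms = [1000.0, 2000.0, 1500.0]               # MIPS ratings act as weights
placement, finish = weighted_assign(tasks, vms)
print("placement:", placement, " makespan:", round(max(finish), 2), "s")
# Degree of imbalance can then be checked as (max - min) / average finish time.
```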

  15. The Case for Higher Computational Density in the Memory-Bound FDTD Method within Multicore Environments

    Directory of Open Access Journals (Sweden)

    Mohammed F. Hadi

    2012-01-01

    Full Text Available It is argued here that more accurate though more compute-intensive alternate algorithms to certain computational methods which are deemed too inefficient and wasteful when implemented within serial codes can be more efficient and cost-effective when implemented in parallel codes designed to run on today's multicore and many-core environments. This argument is most germane to methods that involve large data sets with relatively limited computational density—in other words, algorithms with small ratios of floating point operations to memory accesses. The examples chosen here to support this argument represent a variety of high-order finite-difference time-domain algorithms. It will be demonstrated that a three- to eightfold increase in floating-point operations due to higher-order finite-differences will translate to only two- to threefold increases in actual run times using either graphical or central processing units of today. It is hoped that this argument will convince researchers to revisit certain numerical techniques that have long been shelved and reevaluate them for multicore usability.
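
    To make the computational-density argument concrete, compare a 2nd-order and a 4th-order central-difference first derivative: the wider stencil does roughly three times the floating-point work while touching only two extra, likely cached, array elements per output point. The coefficients are the standard ones; the test function is synthetic.

```python
def d1_order2(f, i, h):
    """2nd-order central first derivative: 1 subtraction, 1 division."""
    return (f[i + 1] - f[i - 1]) / (2.0 * h)

def d1_order4(f, i, h):
    """4th-order central first derivative: wider stencil, ~3x the flops
    for only two extra (likely cached) memory reads per output point."""
    return (-f[i + 2] + 8.0 * f[i + 1] - 8.0 * f[i - 1] + f[i - 2]) / (12.0 * h)

h = 0.1
xs = [h * i for i in range(8)]
f = [x ** 3 for x in xs]                 # f(x) = x^3, so f'(x) = 3x^2
i = 4
exact = 3.0 * xs[i] ** 2
print("order-2 error:", abs(d1_order2(f, i, h) - exact))
print("order-4 error:", abs(d1_order4(f, i, h) - exact))
# The extra accuracy per memory access is what makes higher-order stencils
# attractive on memory-bound multicore and GPU hardware.
```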

  16. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    Directory of Open Access Journals (Sweden)

    D. Chitra Devi

    2016-01-01

    Full Text Available Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of the nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint and hence it has to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arrived jobs consist of multiple interdependent tasks and they may execute the independent tasks in multiple VMs or in the same VM’s multiple cores. Also, the jobs arrive during the run time of the server in varying random intervals under various load conditions. The participating heterogeneous resources are managed by allocating the tasks to appropriate resources by static or dynamic scheduling to make the cloud computing more efficient and thus it improves the user satisfaction. Objective of this work is to introduce and evaluate the proposed scheduling and load balancing algorithm by considering the capabilities of each virtual machine (VM, the task length of each requested job, and the interdependency of multiple tasks. Performance of the proposed algorithm is studied by comparing with the existing methods.

  17. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB, for scheduling scientific workflows with the aim to minimize financial cost of leasing Virtual Machines (VMs under a user-defined deadline constraint. The proposed model groups the workflow into Bag of Tasks (BoTs based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method’s objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real world applications, demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates to meet deadlines and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.

  18. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.

  19. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one, half-hour overview-type presentations and three exhibits by vendors.

  20. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results

  1. A computational study of the Diels Alder reactions involving acenes: reactivity and aromaticity

    Science.gov (United States)

    Cheng, Mei-Fun; Li, Wai-Kee

    2003-01-01

    Ab initio and DFT methods have been used to study the Diels-Alder reactivity and the aromaticity of four linear acenes, namely, naphthalene, anthracene, tetracene and pentacene. In total, eight addition pathways between ethylene and the four acenes have been studied, and all of them are concerted and exothermic reactions. It is found that the most reactive sites on the acenes are the center ring's meso-carbons. Also, reactivity decreases along the series pentacene > tetracene > anthracene > naphthalene. In addition, the NICS results indicate that the most reactive rings in the acenes are those with the highest aromaticity. These results are consistent with those of other theoretical studies and experiments.

  2. Competitive cation binding computations of proton balance for reactions of the phosphagen and glycolytic energy systems within skeletal muscle

    Science.gov (United States)

    2017-01-01

    Limited research and data have been published for the H+ coefficients of the metabolites and reactions involved in non-mitochondrial energy metabolism. The purpose of this investigation was to compute the fractional binding of H+, K+, Na+ and Mg2+ to 21 metabolites of skeletal muscle non-mitochondrial energy metabolism, resulting in 104 different metabolite-cation complexes. Fractional binding of H+ to these metabolite-cation complexes was applied to 17 reactions of skeletal muscle non-mitochondrial energy metabolism, and 8 conditions of the glycolytic pathway based on the source of substrate (glycogen vs. glucose), completeness of glycolytic flux, and the end-point of pyruvate vs. lactate. For pH conditions of 6.0 and 7.0, respectively, H+ coefficients (-ve values = H+ release) for the creatine kinase, adenylate kinase, AMP deaminase and ATPase reactions were 0.8 and 0.97, -0.13 and -0.02, 1.2 and 1.09, and -0.01 and -0.66, respectively. The glycolytic pathway is net H+ releasing, regardless of lactate production, which consumes 1 H+. For glycolysis fueled by glycogen and ending in either pyruvate or lactate, H+ coefficients for pH 6.0 and 7.0 were -3.97 and -2.01 (pyruvate), and -1.96 and -0.01 (lactate), respectively. When starting with glucose, the same conditions result in H+ coefficients of -3.98 and -2.67, and -1.97 and -0.67, respectively. The most H+ releasing reaction of glycolysis is the glyceraldehyde-3-phosphate dehydrogenase reaction, with H+ coefficients for pH 6.0 and 7.0 of -1.58 and -0.76, respectively. Incomplete flux of substrate through glycolysis would increase net H+ release due to the absence of the pyruvate kinase and lactate dehydrogenase reactions, which collectively result in H+ coefficients for pH 6.0 and 7.0 of 1.35 and 1.88, respectively. The data presented provide an extensive reference source for academics and researchers to accurately profile the balance of protons for all metabolites and reactions of non-mitochondrial energy...
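
    The study's competitive-binding computation spans four cations and 21 metabolites; the sketch below shows only the single-cation core: the fractional protonation of an ionizable group from its pKa via the Henderson-Hasselbalch relation, combined into a net proton coefficient for a hypothetical reaction. The pKa values are illustrative, not the study's constants.

```python
def h_fraction(pH, pKa):
    """Fraction of an acid group still protonated at a given pH
    (Henderson-Hasselbalch): 1 / (1 + 10^(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

def reaction_h_coefficient(pH, product_pKas, reactant_pKas):
    """Net protons bound by products minus reactants; negative values
    mean the reaction releases H+ at this pH."""
    return (sum(h_fraction(pH, p) for p in product_pKas)
            - sum(h_fraction(pH, p) for p in reactant_pKas))

# Illustrative pKa values for the ionizable groups on each side of a
# hypothetical reaction (not the study's fitted constants).
for pH in (6.0, 7.0):
    coeff = reaction_h_coefficient(pH, product_pKas=[6.8],
                                   reactant_pKas=[6.1, 7.2])
    print(f"pH {pH}: H+ coefficient = {coeff:+.2f}")
```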

  3. Competitive cation binding computations of proton balance for reactions of the phosphagen and glycolytic energy systems within skeletal muscle.

    Science.gov (United States)

    Robergs, Robert Andrew

    2017-01-01

    Limited research and data have been published for the H+ coefficients of the metabolites and reactions involved in non-mitochondrial energy metabolism. The purpose of this investigation was to compute the fractional binding of H+, K+, Na+ and Mg2+ to 21 metabolites of skeletal muscle non-mitochondrial energy metabolism, resulting in 104 different metabolite-cation complexes. Fractional binding of H+ to these metabolite-cation complexes was applied to 17 reactions of skeletal muscle non-mitochondrial energy metabolism, and 8 conditions of the glycolytic pathway based on the source of substrate (glycogen vs. glucose), completeness of glycolytic flux, and the end-point of pyruvate vs. lactate. For pH conditions of 6.0 and 7.0, respectively, H+ coefficients (-ve values = H+ release) for the creatine kinase, adenylate kinase, AMP deaminase and ATPase reactions were 0.8 and 0.97, -0.13 and -0.02, 1.2 and 1.09, and -0.01 and -0.66, respectively. The glycolytic pathway is net H+ releasing, regardless of lactate production, which consumes 1 H+. For glycolysis fueled by glycogen and ending in either pyruvate or lactate, H+ coefficients for pH 6.0 and 7.0 were -3.97 and -2.01 (pyruvate), and -1.96 and -0.01 (lactate), respectively. When starting with glucose, the same conditions result in H+ coefficients of -3.98 and -2.67, and -1.97 and -0.67, respectively. The most H+ releasing reaction of glycolysis is the glyceraldehyde-3-phosphate dehydrogenase reaction, with H+ coefficients for pH 6.0 and 7.0 of -1.58 and -0.76, respectively. Incomplete flux of substrate through glycolysis would increase net H+ release due to the absence of the pyruvate kinase and lactate dehydrogenase reactions, which collectively result in H+ coefficients for pH 6.0 and 7.0 of 1.35 and 1.88, respectively. The data presented provide an extensive reference source for academics and researchers to accurately profile the balance of protons for all metabolites and reactions of non-mitochondrial energy...

  4. Competitive cation binding computations of proton balance for reactions of the phosphagen and glycolytic energy systems within skeletal muscle.

    Directory of Open Access Journals (Sweden)

    Robert Andrew Robergs

    Full Text Available Limited research and data have been published for the H+ coefficients for the metabolites and reactions involved in non-mitochondrial energy metabolism. The purpose of this investigation was to compute the fractional binding of H+, K+, Na+ and Mg2+ to 21 metabolites of skeletal muscle non-mitochondrial energy metabolism, resulting in 104 different metabolite-cation complexes. Fractional binding of H+ to these metabolite-cation complexes was applied to 17 reactions of skeletal muscle non-mitochondrial energy metabolism, and 8 conditions of the glycolytic pathway based on the source of substrate (glycogen vs. glucose), completeness of glycolytic flux, and the end-point of pyruvate vs. lactate. For pH conditions of 6.0 and 7.0, respectively, H+ coefficients (negative values = H+ release) for the creatine kinase, adenylate kinase, AMP deaminase and ATPase reactions were 0.8 and 0.97, -0.13 and -0.02, 1.2 and 1.09, and -0.01 and -0.66, respectively. The glycolytic pathway is net H+ releasing, regardless of lactate production, which consumes 1 H+. For glycolysis fueled by glycogen and ending in either pyruvate or lactate, H+ coefficients for pH 6.0 and 7.0 were -3.97 and -2.01 (pyruvate), and -1.96 and -0.01 (lactate), respectively. When starting with glucose, the same conditions result in H+ coefficients of -3.98 and -2.67, and -1.97 and -0.67, respectively. The most H+ releasing reaction of glycolysis is the glyceraldehyde-3-phosphate dehydrogenase reaction, with H+ coefficients for pH 6.0 and 7.0 of -1.58 and -0.76, respectively. Incomplete flux of substrate through glycolysis would increase net H+ release due to the absence of the pyruvate kinase and lactate dehydrogenase reactions, which collectively result in H+ coefficients for pH 6.0 and 7.0 of 1.35 and 1.88, respectively. The data presented provide an extensive reference source for academics and researchers to accurately profile the balance of protons for all metabolites and reactions of non-mitochondrial energy metabolism.
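
    A minimal sketch of the competitive-binding arithmetic behind such coefficients: for a single metabolite binding site with independent 1:1 dissociation constants for H+, K+, Na+ and Mg2+, the fraction bound to each cation follows from a binding polynomial. The constants and concentrations below are illustrative placeholders, not the paper's values (Python):

      # Hypothetical dissociation constants (mol/L) for one metabolite site;
      # real values differ per metabolite, temperature and ionic strength.
      K_DISS = {"H": 10**-6.8, "K": 10**-1.1, "Na": 10**-0.8, "Mg": 10**-3.2}

      def fractional_binding(pH, conc):
          """Fraction of the metabolite bound to each cation, plus free fraction.

          conc: free cation concentrations (mol/L), e.g. {"K": 0.1, ...};
          H+ is taken from pH. Assumes independent competitive 1:1 binding.
          """
          conc = dict(conc, H=10**-pH)
          terms = {ion: conc[ion] / K_DISS[ion] for ion in K_DISS}
          poly = 1.0 + sum(terms.values())          # binding polynomial
          fractions = {ion: t / poly for ion, t in terms.items()}
          fractions["free"] = 1.0 / poly
          return fractions

      print(fractional_binding(7.0, {"K": 0.1, "Na": 0.01, "Mg": 0.001}))

    Applying such fractions across the reactants and products of a reaction at two pH values is what yields pH-dependent H+ coefficients like those quoted above.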

  5. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software...
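
    The FX idea itself is compact: Fourier-transform (F) each station's voltage stream, then cross-multiply (X) and accumulate spectra per baseline. A toy single-baseline sketch in Python/NumPy (array sizes and names are illustrative, not DiFX's data layout):

      import numpy as np

      def fx_correlate(v1, v2, nchan=256):
          """Toy FX correlation of two stations' real-sampled voltage streams."""
          nseg = len(v1) // (2 * nchan)
          vis = np.zeros(nchan, dtype=complex)
          for k in range(nseg):
              s = slice(k * 2 * nchan, (k + 1) * 2 * nchan)
              S1 = np.fft.rfft(v1[s])[:nchan]   # F step: station spectra
              S2 = np.fft.rfft(v2[s])[:nchan]
              vis += S1 * np.conj(S2)           # X step: cross-multiply, accumulate
          return vis / nseg

      rng = np.random.default_rng(0)
      common = rng.standard_normal(2**16)       # signal seen by both stations
      v1 = common + 0.5 * rng.standard_normal(2**16)
      v2 = common + 0.5 * rng.standard_normal(2**16)
      print(np.abs(fx_correlate(v1, v2)).mean())

    In a cluster deployment, segments would be distributed across nodes and the per-baseline accumulations combined afterwards.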

  6. Effect of reaction environments on the reactivity of PCB (2-chlorobiphenyl) over activated carbon impregnated with palladized iron

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hyeok [Department of Civil Engineering, University of Texas at Arlington, 416 Yates Drive, Arlington, TX 76019-0308 (United States); Al-Abed, Souhail R., E-mail: al-abed.souhail@epa.gov [National Risk Management Research Laboratory, U.S. Environmental Protection Agency, 26 West Martin Luther King Drive, Cincinnati, OH 45268 (United States)

    2010-07-15

    Reactive activated carbon (RAC) impregnated with palladized iron nanoparticles has been developed to treat polychlorinated biphenyls (PCBs). In this study, we evaluated the effects of various reaction environments on the adsorption-mediated dechlorination of 2-chlorobiphenyl (2-ClBP) in the RAC system. The results are discussed in close connection with the implementation of the RAC system for the remediation of sites contaminated with PCBs. Adsorption of 2-ClBP onto RAC limited the overall performance at 2-ClBP/RAC mass ratios below 1.0 x 10^-4; above this ratio, dechlorination of 2-ClBP adsorbed to RAC was the rate-determining step. Acidic and basic conditions were harmful to 2-ClBP adsorption and iron stability, while neutral pH showed the highest adsorption-promoted dechlorination of 2-ClBP and negligible metal leaching. Coexisting natural organic matter (NOM) slightly inhibited 2-ClBP adsorption onto RAC due to the partial partitioning of 2-ClBP into NOM in the liquid phase, while the 2-ClBP absorbed into NOM, which also tended to adsorb onto RAC, was less available for the dechlorination reaction. Common anions slowed down 2-ClBP adsorption, but adsorbed 2-ClBP was almost simultaneously dechlorinated. Exceptions included the strong inhibitory effect of carbonate species on 2-ClBP adsorption and the severe detrimental effect of sulfite on 2-ClBP dechlorination. Results on treatment of 2-ClBP spiked into actual sediment supernatants implied site-specific reactivity of RAC.

  7. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  8. The New Learning Ecology of One-to-One Computing Environments: Preparing Teachers for Shifting Dynamics and Relationships

    Science.gov (United States)

    Spires, Hiller A.; Oliver, Kevin; Corn, Jenifer

    2012-01-01

    Despite growing research and evaluation results on one-to-one computing environments, how these environments affect learning in schools remains underexamined. The purpose of this article is twofold: (a) to use a theoretical lens, namely a new learning ecology, to frame the dynamic changes as well as challenges that are introduced by a one-to-one…

  9. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, using either simple EC2 instances or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  10. Large Reactional Osteogenesis in Maxillary Sinus Associated with Secondary Root Canal Infection Detected Using Cone-beam Computed Tomography.

    Science.gov (United States)

    Estrela, Carlos; Porto, Olavo César Lyra; Costa, Nádia Lago; Garrote, Marcel da Silva; Decurcio, Daniel Almeida; Bueno, Mike R; Silva, Brunno Santos de Freitas

    2015-12-01

    Inflammatory injuries in the maxillary sinus may originate from root canal infections and lead to bone resorption or regeneration. This report describes the radiographic findings of 4 asymptomatic clinical cases of large reactional osteogenesis in the maxillary sinus (MS) associated with secondary root canal infection detected using cone-beam computed tomographic (CBCT) imaging. Apical periodontitis, a consequence of root canal infection, may lead to a periosteal reaction in the MS and osteogenesis seen as a radiopaque structure on imaging scans. The use of a map-reading strategy for the longitudinal and sequential slices of CBCT images may contribute to the definition of diagnoses and treatment plans. Root canal infections may lead to reactional osteogenesis in the MS. High-resolution CBCT images may reveal changes that go unnoticed when using conventional imaging. Findings may help define initial diagnoses and therapeutic plans, but only histopathology provides a definitive diagnosis. Surgical enucleation of the periapical lesion is recommended if nonsurgical root canal treatment fails to control apical periodontitis. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  12. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, an approximation to (b) the potential surface, (c) its gradients, and (d) a Shannon information theory based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to compute (a) potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene) where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which application to hydrogen-transfer reactions and hydrogen
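
    A one-dimensional caricature of the idea: weight candidate grid points by the product of wavepacket density and potential-gradient magnitude (a stand-in for the paper's four criteria), keep the highest-weight points, and rebuild the surface by Hermite interpolation. The model potential and weighting are assumptions (Python/SciPy):

      import numpy as np
      from scipy.interpolate import CubicHermiteSpline

      V  = lambda x: 0.5 * x**2 + 0.1 * x**4      # model potential (assumed)
      dV = lambda x: x + 0.4 * x**3               # its analytic gradient

      x = np.linspace(-3, 3, 61)                  # candidate grid
      psi2 = np.exp(-(x - 1.0)**2)                # wavepacket density |psi|^2
      weight = psi2 * (np.abs(dV(x)) + 1e-3)      # stand-in sampling measure

      idx = np.sort(np.argsort(weight)[-15:])     # keep the 15 best points
      fit = CubicHermiteSpline(x[idx], V(x[idx]), dV(x[idx]))

      xf = np.linspace(x[idx][0], x[idx][-1], 200)
      print("max |error|:", np.abs(fit(xf) - V(xf)).max())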

  13. Environment

    DEFF Research Database (Denmark)

    Valentini, Chiara

    2017-01-01

    The term environment refers to the internal and external context in which organizations operate. For some scholars, environment is defined as an arrangement of political, economic, social and cultural factors existing in a given context that have an impact on organizational processes and structures. For others, environment is a generic term describing a large variety of stakeholders and how these interact and act upon organizations. Organizations and their environment are mutually interdependent, and organizational communications are highly affected by the environment. This entry examines the origin and development of organization-environment interdependence, the nature of the concept of environment and its relevance for communication scholarship and activities.

  14. Patient identity management for secondary use of biomedical research data in a distributed computing environment.

    Science.gov (United States)

    Nitzlnader, Michael; Schreier, Günter

    2014-01-01

    Dealing with data from different source domains is of increasing importance in today's large scale biomedical research endeavours. Within the European Network for Cancer research in Children and Adolescents (ENCCA) a solution to share such data for secondary use will be established. In this paper the solution arising from the aims of the ENCCA project and regulatory requirements concerning data protection and privacy is presented. Since the details of secondary biomedical dataset utilisation are often not known in advance, data protection regulations are met with an identity management concept that facilitates context-specific pseudonymisation and a way of data aggregation using a hidden reference table later on. Phonetic hashing is proposed to prevent duplicated patient registration and re-identification of patients is possible via a trusted third party only. Finally, the solution architecture allows for implementation in a distributed computing environment, including cloud-based elements.

  15. Event heap: a coordination infrastructure for dynamic heterogeneous application interactions in ubiquitous computing environments

    Science.gov (United States)

    Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.

    2010-04-20

    An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
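
    A minimal in-memory sketch of tuplespace-style posting and template matching as described (the API names are illustrative, not the actual Event Heap client interface; Python):

      class EventHeap:
          """Toy event heap: events are dicts of named fields; consumers
          retrieve events by matching a template of required field values."""

          def __init__(self):
              self._events = []               # oldest first, per FIFO ordering

          def post(self, **fields):
              self._events.append(fields)

          def take(self, **template):
              """Return and remove the oldest event matching the template."""
              for i, ev in enumerate(self._events):
                  if all(ev.get(k) == v for k, v in template.items()):
                      return self._events.pop(i)
              return None

      heap = EventHeap()
      heap.post(type="ButtonPress", source="podium", button="next")
      heap.post(type="SlideChange", source="laptop-3", slide=12)
      print(heap.take(type="ButtonPress"))      # content-based retrieval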

  16. Virtual environment and computer-aided technologies used for system prototyping and requirements development

    Science.gov (United States)

    Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk

    1993-01-01

    The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in the Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for developing operational procedures and system hardware and software requirements. The project objectives, hardware and software technologies used, data gained, current activities, and future development and training objectives are discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules is discussed along with project pitfalls.

  17. Computer modeling, characterization, and applications of Gallium Arsenide Gunn diodes in radiation environments

    Energy Technology Data Exchange (ETDEWEB)

    El-Basit, Wafaa Abd; El-Ghanam, Safaa Mohamed; Kamh, Sanaa Abd El-Tawab [Electronics Research Laboratory, Physics Department, Faculty of Women for Arts, Science and Education, Ain-Shams University, Cairo (Egypt); Abdel-Maksood, Ashraf Mosleh; Soliman, Fouad Abd El-Moniem Saad [Nuclear Materials Authority, Cairo (Egypt)

    2016-10-15

    The present paper reports on a trial to shed further light on the characterization, applications, and operation of radar speed guns, or Gunn diodes, in different radiation environments of neutron or γ fields. To this end, theoretical and experimental investigations of a microwave oscillating system for outer-space applications were carried out. Radiation effects on the transient parameters and electrical properties of the proposed devices have been studied in detail with the application of computer programming. Also, the oscillation parameters, power characteristics, and bias current were plotted under the influence of different γ and neutron irradiation levels. Finally, shelf or oven annealing processes were shown to be satisfactory techniques for recovering the initial characteristics of the irradiated devices.

  18. SU-E-J-253: The Radiomics Toolbox in the Computational Environment for Radiological Research (CERR)

    Energy Technology Data Exchange (ETDEWEB)

    Apte, A; Veeraraghavan, H; Oh, J; Kijewski, P; Deasy, J [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2015-06-15

    Purpose: To present an open source and free platform to facilitate radiomics research: the “Radiomics toolbox” in CERR. Method: There is a scarcity of open source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and data management are implemented in Matlab for ease of development and readability of code for a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example Java-based DCM4CHE for import of DICOM and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open source, GNU-copyrighted software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC. The analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
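
    Two of the feature families mentioned are easy to illustrate outside Matlab: first-order intensity statistics and a gray-level co-occurrence matrix (GLCM) contrast feature. A NumPy sketch of the underlying math (not the toolbox's code):

      import numpy as np

      def first_order_stats(roi):
          """First-order statistics of voxel intensities in a region."""
          return {"mean": roi.mean(), "std": roi.std(),
                  "min": roi.min(), "max": roi.max()}

      def glcm_contrast(img, levels=8):
          """Contrast from a horizontal-offset gray-level co-occurrence matrix."""
          q = np.floor(img / img.max() * (levels - 1)).astype(int)   # quantize
          glcm = np.zeros((levels, levels))
          for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):      # offset (0, 1)
              glcm[a, b] += 1
          p = glcm / glcm.sum()
          i, j = np.indices(p.shape)
          return ((i - j)**2 * p).sum()

      roi = np.random.default_rng(1).random((32, 32))
      print(first_order_stats(roi), glcm_contrast(roi))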

  19. Integration of Computational and Preparative Techniques to Demonstrate Physical Organic Concepts in Synthetic Organic Chemistry: An Example Using Diels-Alder Reaction

    Science.gov (United States)

    Palmer, David R. J.

    2004-01-01

    The Diels-Alder reaction is used as an example to show how computational and preparative techniques can be integrated to demonstrate physical organic concepts in synthetic organic chemistry. These experiments show that students should not accept computational results without questioning them, and in many Diels-Alder…

  20. Parallel Computing Characteristics of CUPID code under MPI and Hybrid environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeon, Byoung Jin; Choi, Hyoung Gwon [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)

    2014-05-15

    simulation with diagonal preconditioning shows better speedup. The MPI library was used for node-to-node communication among the partitioned subdomains, and OpenMP threads were activated within each node in a multi-core computing environment. The results of hybrid computing show good performance compared with pure MPI parallel computing.
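
    The node-to-node pattern described can be illustrated with mpi4py standing in for CUPID's MPI calls: partition a domain across ranks, compute locally (where OpenMP-style threads would work within a node), and combine with a reduction. The computation below is a placeholder, not CUPID's physics:

      # Run with e.g.: mpirun -n 4 python demo.py
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      N = 10_000_000
      lo, hi = rank * N // size, (rank + 1) * N // size   # this rank's subdomain
      x = (np.arange(lo, hi) + 0.5) / N
      local = np.sum(4.0 / (1.0 + x**2)) / N              # local partial result

      pi = comm.allreduce(local, op=MPI.SUM)              # node-to-node reduction
      if rank == 0:
          print("pi ~", pi)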

  1. Comparative phyloinformatics of virus genes at micro and macro levels in a distributed computing environment.

    Science.gov (United States)

    Singh, Dadabhai T; Trehan, Rahul; Schmidt, Bertil; Bretschneider, Timo

    2008-01-01

    Preparedness for a possible global pandemic caused by viruses such as the highly pathogenic influenza A subtype H5N1 has become a global priority. In particular, it is critical to monitor the appearance of any new emerging subtypes. Comparative phyloinformatics can be used to monitor, analyze, and possibly predict the evolution of viruses. However, in order to utilize the full functionality of available analysis packages for large-scale phyloinformatics studies, a team of computer scientists, biostatisticians and virologists is needed--a requirement which cannot be fulfilled in many cases. Furthermore, the time complexities of many of the algorithms involved lead to prohibitive runtimes on sequential computer platforms. This has so far hindered the use of comparative phyloinformatics as a commonly applied tool in this area. In this paper the graphical-oriented workflow design system called Quascade and its efficient usage for comparative phyloinformatics are presented. In particular, we focus on how this task can be performed effectively in a distributed computing environment. As a proof of concept, the designed workflows are used for the phylogenetic analysis of neuraminidase of H5N1 isolates (micro level) and influenza viruses (macro level). The results of this paper are hence twofold. Firstly, the paper demonstrates the usefulness of a graphical user interface system for designing and executing complex distributed workflows for large-scale phyloinformatics studies of virus genes. Secondly, the analysis of neuraminidase at different levels of complexity provides valuable insights into this virus's tendency for geographically based clustering in the phylogenetic tree and shows the importance of glycan sites in its molecular evolution. The current study demonstrates the efficiency and utility of workflow systems providing a biologist-friendly approach to complex biological dataset analysis using high performance computing. In particular, the utility of the platform Quascade

  2. Proposed Network Intrusion Detection System Based on Fuzzy c Mean Algorithm in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Nowadays cloud computing has become an integral part of the IT industry; it provides a working environment that allows users to share data and resources over the internet. Because cloud computing is a virtual grouping of resources offered over the internet, it raises issues related to security and privacy. Intrusion detection is therefore very important, in order to detect outsider and insider intruders with a high detection rate and a low false-positive alarm rate in the cloud environment. This work proposes a network intrusion detection module using the fuzzy c-means algorithm. The KDD99 dataset was used for the experiments; the proposed system is characterized by a high detection rate with a low false-positive alarm rate.
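
    For reference, the fuzzy c-means iteration at the heart of such a module is short; a NumPy sketch (KDD99 preprocessing and alarm logic omitted; features assumed numeric):

      import numpy as np

      def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
          """Fuzzy c-means: soft memberships U (n x c) and cluster centers."""
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)       # membership rows sum to 1
          for _ in range(iters):
              W = U**m
              centers = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
              inv = d**(-2.0 / (m - 1.0))
              U = inv / inv.sum(axis=1, keepdims=True)
          return U, centers

      X = np.vstack([np.random.default_rng(2).normal(0, 1, (50, 3)),
                     np.random.default_rng(3).normal(5, 1, (50, 3))])
      U, centers = fuzzy_c_means(X)
      labels = U.argmax(axis=1)                   # crisp labels if needed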

  3. Using animation quality metric to improve efficiency of global illumination computation for dynamic environments

    Science.gov (United States)

    Myszkowski, Karol; Tawara, Takehiro; Seidel, Hans-Peter

    2002-06-01

    In this paper, we consider applications of perception-based video quality metrics to improve the performance of global lighting computations for dynamic environments. For this purpose we extend the Visible Difference Predictor (VDP) developed by Daly to handle computer animations. We incorporate into the VDP the spatio-velocity CSF model developed by Kelly. The CSF model requires data on the velocity of moving patterns across the image plane. We use the 3D image warping technique to compensate for the camera motion, and we conservatively assume that the motion of animated objects (usually strong attractors of the visual attention) is fully compensated by the smooth pursuit eye motion. Our global illumination solution is based on stochastic photon tracing and takes advantage of temporal coherence of lighting distribution, by processing photons both in the spatial and temporal domains. The VDP is used to keep noise inherent in stochastic methods below the sensitivity level of the human observer. As a result a perceptually-consistent quality across all animation frames is obtained.

  4. Efficient frequent pattern mining algorithm based on node sets in cloud computing environment

    Science.gov (United States)

    Billa, V. N. Vinay Kumar; Lakshmanna, K.; Rajesh, K.; Reddy, M. Praveen Kumar; Nagaraja, G.; Sudheer, K.

    2017-11-01

    The ultimate goal of data mining is to extract hidden information that is useful for decision making from the large databases collected by an organization. Mining frequent itemsets is one of the most important tasks for transactional databases. These databases contain data at very large scale, and mining them consumes physical memory and time in proportion to the size of the database. A frequent pattern mining algorithm is efficient only if it consumes little memory and time to mine the frequent itemsets from a given large database. With these points in mind, this thesis proposes a system which mines frequent itemsets in a way optimized for memory and time, using cloud computing to parallelize the process and providing the application as a service. The complete framework uses a proven efficient algorithm called FIN, which works on node sets and a pre-order coding (POC) tree. To evaluate the performance of the system, we conduct experiments comparing the efficiency of the same algorithm applied standalone and in a cloud computing environment on a real-world dataset of traffic accidents. The results show that the memory consumption and execution time of the proposed system are much lower than those of the standalone system.
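
    FIN's node-set machinery is too involved for a short sketch, but the task it accelerates, counting itemsets that reach a support threshold, can be illustrated with a simple level-wise (Apriori-style) miner; this is not the FIN algorithm itself:

      from collections import Counter

      def frequent_itemsets(transactions, min_support):
          """Level-wise frequent-itemset mining (Apriori-style, not FIN)."""
          result = {}
          candidates = {frozenset([i]) for t in transactions for i in t}
          k = 1
          while candidates:
              counts = Counter()
              for t in transactions:
                  ts = frozenset(t)
                  for cand in candidates:
                      if cand <= ts:
                          counts[cand] += 1
              level = {c: n for c, n in counts.items() if n >= min_support}
              result.update(level)
              candidates = {a | b for a in level for b in level
                            if len(a | b) == k + 1}     # next-level candidates
              k += 1
          return result

      db = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
      print(frequent_itemsets(db, min_support=3))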

  5. Formative questioning in computer learning environments: a course for pre-service mathematics teachers

    Science.gov (United States)

    Akkoç, Hatice

    2015-11-01

    This paper focuses on a specific aspect of formative assessment, namely questioning. Given that computers have gained widespread use in learning and teaching, specific attention should be paid to organizing formative assessment in computer learning environments (CLEs). A course including various workshops was designed to develop knowledge and skills of questioning in CLEs. This study investigates how pre-service mathematics teachers used formative questioning with technological tools such as Geogebra and Graphic Calculus software. The participants were 35 pre-service mathematics teachers. To analyse formative questioning, two types of questions are investigated: mathematical questions and technical questions. Data were collected through lesson plans, teaching notes, interviews and observations. Descriptive statistics of the number of questions in the lesson plans before and after the workshops are presented. Examples of the two types of questions are discussed using the theoretical framework. One pre-service teacher was selected, and the way he used questioning during his three lessons was analysed in greater depth. The findings indicated an improvement in using technical questions for formative purposes and showed that the course provided a guideline for planning and using mathematical and technical questions in CLEs.

  6. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • The code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation takes only 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  7. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients, frequently assuming unrealistic hypotheses. This paper adopts the view that for the appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab®, is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.
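
    In the spirit of the tool's feature-then-threshold pipeline, a toy example: extract a moving-window feature from an EEG-like signal and raise an alarm when an optimized threshold is crossed. The feature choice, sampling rate and threshold here are illustrative only:

      import numpy as np

      def windowed_energy(x, fs, win_s=5.0):
          """Mean signal energy in non-overlapping windows of win_s seconds."""
          n = int(fs * win_s)
          nwin = len(x) // n
          return (x[:nwin * n].reshape(nwin, n)**2).mean(axis=1)

      fs = 256                                  # Hz, assumed sampling rate
      t = np.arange(0, 600 * fs) / fs           # ten minutes of synthetic signal
      x = np.random.default_rng(4).standard_normal(t.size)
      x[t > 480] *= 3.0                         # energy rise in last two minutes

      feature = windowed_energy(x, fs)
      alarms = feature > 4.0                    # "optimized" threshold (assumed)
      print(np.nonzero(alarms)[0])              # indices of alarmed windows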

  8. Interpolation Environment of Tensor Mathematics at the Corpuscular Stage of Computational Experiments in Hydromechanics

    Science.gov (United States)

    Bogdanov, Alexander; Degtyarev, Alexander; Khramushin, Vasily; Shichkina, Yulia

    2018-02-01

    The stages of direct computational experiments in hydromechanics based on tensor mathematics tools are represented by conditionally independent mathematical models, so that the calculations can be separated in accordance with the physical processes. The continual stage of numerical modeling is constructed on a small time interval in a stationary grid space, where the continuity conditions and energy conservation are coordinated. Then, at the subsequent corpuscular stage of the computational experiment, the kinematic parameters of mass centers and the surface stresses at the boundaries of the grid cells are used to model the free unsteady motion of volume cells treated as independent particles. These particles can be subject to vortex and discontinuous interactions when restructuring of free boundaries and internal rheological states takes place. The transition from one stage to another is provided by the interpolation operations of tensor mathematics. This interpolation environment formalizes the use of physical laws in the modeling of continuous media, and provides control of the rheological state and of the conditions for the existence of discontinuous solutions: rigid and free boundaries, vortex layers, and their turbulent or empirical generalizations.

  9. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    Science.gov (United States)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business process (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. The performance prediction and optimization for CBP-based SOEN is very complex. To meet these challenges, one of the key points is to reduce the number of physical services an abstract service must wait for. This paper introduces a probability-based determination method (PBDM) for an abstract service's waiting number, M_i, and time span, τ_i, for its physical services. M_i and τ_i are determined from the physical services' arrival rule and the distribution functions of their overall performance. In PBDM, the arrival probability of the physical services with the best overall performance value is a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arrival rule and performance distribution functions, which improves the computational efficiency of scheme design and performance optimization for collaborative business processes in service-oriented computing environments.
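
    Under the simplest reading of this scheme, each arriving physical service independently has probability p of carrying the best overall performance value, and one waits until such a service has arrived with reliability R; the waiting number M then follows from 1 - (1 - p)^M >= R. A sketch with illustrative p and R (Python):

      import math

      def waiting_number(p, reliability):
          """Smallest M with P(at least one best-performing arrival) >= reliability,
          assuming independent arrivals with per-service probability p."""
          return math.ceil(math.log(1.0 - reliability) / math.log(1.0 - p))

      def time_span(M, arrival_rate):
          """Expected time to observe M arrivals at a given rate (per second)."""
          return M / arrival_rate

      M = waiting_number(p=0.2, reliability=0.95)
      print(M, time_span(M, arrival_rate=0.5))   # 14 services, 28.0 seconds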

  10. Using Python as a first programming environment for computational physics in developing countries

    Science.gov (United States)

    Akpojotor, Godfrey; Ehwerhemuepha, Louis; Echenim, Myron; Akpojotor, Famous

    2011-03-01

    Python's unique features, such as its interpreted, multiplatform and object-oriented nature, together with its being free and open source software, mean that any user connected to the internet can download the entire package onto any platform, install it and immediately begin to use it. Python is thus gaining a reputation as a preferred environment for introducing students and new beginners to programming. In Africa, therefore, the Python African Tour project has been launched, and we are coordinating its use in computational science. We examine here the challenges and prospects of using Python for computational physics (CP) education in developing countries (DC). We then present our project on using Python to simulate and aid the learning of laboratory experiments, illustrated here by a model of the simple pendulum, and to visualize phenomena in physics, illustrated here by the wave motion of a particle in a varying potential. This project, which trains both teachers and students in CP using Python, can easily be adopted in other DC.
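
    The pendulum model mentioned above fits in a few lines; a sketch integrating theta'' = -(g/L) sin(theta) with a semi-implicit Euler step (NumPy):

      import numpy as np

      g, L = 9.81, 1.0                       # gravity (m/s^2), length (m)
      dt, steps = 0.001, 5000
      theta, omega = np.radians(30.0), 0.0   # initial angle, angular velocity

      angles = []
      for _ in range(steps):
          omega += -(g / L) * np.sin(theta) * dt   # update velocity first,
          theta += omega * dt                      # then position (stable)
          angles.append(np.degrees(theta))

      print("small-angle period ~", 2 * np.pi * np.sqrt(L / g), "s")
      print("angle after 5 s:", angles[-1], "deg")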

  11. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    Science.gov (United States)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops across multiple hardware platforms, O/S versions and vintages. Growing the computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service and realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these were the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, and some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  12. Computational studies of atmospherically-relevant chemical reactions in water clusters and on liquid water and ice surfaces.

    Science.gov (United States)

    Gerber, R Benny; Varner, Mychel E; Hammerich, Audrey D; Riikonen, Sampsa; Murdachaew, Garold; Shemesh, Dorit; Finlayson-Pitts, Barbara J

    2015-02-17

    CONSPECTUS: Reactions on water and ice surfaces and in other aqueous media are ubiquitous in the atmosphere, but the microscopic mechanisms of most of these processes are as yet unknown. This Account examines recent progress in atomistic simulations of such reactions and the insights provided into mechanisms and interpretation of experiments. Illustrative examples are discussed. The main computational approaches employed are classical trajectory simulations using interaction potentials derived from quantum chemical methods. This comprises both ab initio molecular dynamics (AIMD) and semiempirical molecular dynamics (SEMD), the latter referring to semiempirical quantum chemical methods. Presented examples are as follows: (i) Reaction of the (NO(+))(NO3(-)) ion pair with a water cluster to produce the atmospherically important HONO and HNO3. The simulations show that a cluster with four water molecules describes the reaction. This provides a hydrogen-bonding network supporting the transition state. The reaction is triggered by thermal structural fluctuations, and ultrafast changes in atomic partial charges play a key role. This is an example where a reaction in a small cluster can provide a model for a corresponding bulk process. The results support the proposed mechanism for production of HONO by hydrolysis of NO2 (N2O4). (ii) The reactions of gaseous HCl with N2O4 and N2O5 on liquid water surfaces. Ionization of HCl at the water/air interface is followed by nucleophilic attack of Cl(-) on N2O4 or N2O5. Both reactions proceed by an SN2 mechanism. The products are ClNO and ClNO2, precursors of atmospheric atomic chlorine. Because this mechanism cannot result from a cluster too small for HCl ionization, an extended water film model was simulated. The results explain ClNO formation experiments. Predicted ClNO2 formation is less efficient. (iii) Ionization of acids at ice surfaces. No ionization is found on ideal crystalline surfaces, but the process is efficient on

  13. Toward a Computer Vision-based Wayfinding Aid for Blind Persons to Access Unfamiliar Indoor Environments.

    Science.gov (United States)

    Tian, Yingli; Yang, Xiaodong; Yi, Chucai; Arditi, Aries

    2013-04-01

    Independent travel is a well-known challenge for blind and visually impaired persons. In this paper, we propose a proof-of-concept computer vision-based wayfinding aid for blind people to independently access unfamiliar indoor environments. In order to find different rooms (e.g. an office, a lab, or a bathroom) and other building amenities (e.g. an exit or an elevator), we incorporate object detection with text recognition. First we develop a robust and efficient algorithm to detect doors, elevators, and cabinets based on their general geometric shape, by combining edges and corners. The algorithm is general enough to handle large intra-class variations of objects with different appearances among different indoor environments, as well as small inter-class differences between different objects such as doors and door-like cabinets. Next, in order to distinguish intra-class objects (e.g. an office door from a bathroom door), we extract and recognize text information associated with the detected objects. For text recognition, we first extract text regions from signs with multiple colors and possibly complex backgrounds, and then apply character localization and topological analysis to filter out background interference. The extracted text is recognized using off-the-shelf optical character recognition (OCR) software products. The object type, orientation, location, and text information are presented to the blind traveler as speech.

  14. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student's learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  15. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    2017-10-01

    Full Text Available The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility of computing the health application using resources from available devices inside the body area network of the user. This framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study to validate our proposal, which consists of monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices present a clear social objective: being able to predict not only situations of sudden death but also possible injuries.
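
    A toy version of the monitoring logic, streaming heart-rate samples from a body-area sensor through a rolling-average alert; the limits and window below are assumptions, not the paper's parameters (Python):

      from collections import deque

      class HeartRateMonitor:
          """Rolling-average heart-rate alerting for one player."""

          def __init__(self, low=40, high=190, window=5):
              self.low, self.high = low, high      # bpm alert limits (assumed)
              self.samples = deque(maxlen=window)

          def push(self, bpm):
              self.samples.append(bpm)
              avg = sum(self.samples) / len(self.samples)
              if not (self.low <= avg <= self.high):
                  return f"ALERT: average {avg:.0f} bpm outside safe range"
              return None

      monitor = HeartRateMonitor()
      for bpm in [120, 150, 185, 200, 210, 215]:   # simulated match data
          message = monitor.push(bpm)
          if message:
              print(message)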

  16. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    Directory of Open Access Journals (Sweden)

    Guohua Fang

    2016-09-01

    Full Text Available To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and output sources of the National Economic Production Department. Secondly, an extended Social Accounting Matrix (SAM) of Jiangsu province is developed to simulate various scenarios. By changing the values of the discharge fees (increased by 50%, 100% and 150%), three scenarios are simulated to examine their influence on the overall economy and each industry. The simulation results show that an increased fee will have a negative impact on Gross Domestic Product (GDP). However, waste water may be effectively controlled. The study also demonstrates that, along with the economic costs, the increase of the discharge fee will lead to the upgrading of industrial structures from heavy pollution to light pollution, which is beneficial to the sustainable development of the economy and the protection of the environment.

  17. Monitoring the Microgravity Environment Quality On-Board the International Space Station Using Soft Computing Techniques

    Science.gov (United States)

    Jules, Kenol; Lin, Paul P.

    2001-01-01

    This paper presents an artificial intelligence monitoring system developed by the NASA Glenn Principal Investigator Microgravity Services project to help the principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment in time, on-board the International Space Station, which might impact the microgravity environment their experiments are exposed to. From the Principal Investigator Microgravity Services' web site, the principal investigator teams can monitor via a graphical display, in near real time, which event(s) is/are on, such as crew activities, pumps, fans, centrifuges, compressor, crew exercise, platform structural modes, etc., and decide whether or not to run their experiments based on the acceleration environment associated with a specific event. This monitoring system is focused primarily on detecting the vibratory disturbance sources, but could be used as well to detect some of the transient disturbance sources, depending on the events duration. The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
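
    Of the soft-computing techniques listed, the Self-Organizing Feature Map is the simplest to sketch: each training vector pulls its best-matching unit, and that unit's grid neighbours, toward itself. A NumPy caricature on made-up feature vectors (not the actual monitoring system's code):

      import numpy as np

      def train_som(X, grid=8, iters=2000, lr0=0.5, sigma0=2.0, seed=5):
          """1-D self-organizing map: returns the grid's weight vectors."""
          rng = np.random.default_rng(seed)
          W = rng.random((grid, X.shape[1]))
          for t in range(iters):
              x = X[rng.integers(len(X))]
              bmu = np.argmin(((W - x)**2).sum(axis=1))   # best-matching unit
              lr = lr0 * (1 - t / iters)                  # decaying learning rate
              sigma = sigma0 * (1 - t / iters) + 0.5      # shrinking neighbourhood
              dist = np.abs(np.arange(grid) - bmu)
              h = np.exp(-dist**2 / (2 * sigma**2))       # neighbourhood kernel
              W += lr * h[:, None] * (x - W)
          return W

      X = np.random.default_rng(6).random((500, 3))       # e.g. acceleration features
      print(train_som(X).round(2))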

  18. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    Science.gov (United States)

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility of computing the health application using resources from available devices inside the body area network of the user. This framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study to validate our proposal, which consists of monitoring footballers' heart rates during a football match. The real-time data acquired by these devices present a clear social objective: being able to predict not only situations of sudden death but also possible injuries.

  19. The critical role of culture and environment as determinants of women's participation in computer science

    Science.gov (United States)

    Frieze, Carol

    This thesis proposes the need for, and illustrates, a new approach to how we think about, and act on, issues relating to women's participation, or lack of participation, in computer science (CS). This approach is based on a cultural perspective arguing that many of the reasons for women entering---or not entering---CS programs have little to do with gender and a lot to do with environment and culture. Evidence for this approach comes primarily from a qualitative research study, which shows the effects of changes in the micro-culture on CS undergraduates at Carnegie Mellon, and from studies of other cultural contexts that illustrate a "Women-CS fit". We also discuss the interventions that have been crucial to the evolution of this specific micro-culture. Our argument goes against the grain of many gender and CS studies which conclude that the reasons for women's low participation in CS are based in gender--and particularly in gender differences in how men and women relate to the field. Such studies tend to focus on gender differences and recommend accommodating (what are perceived to be) women's different ways of relating to CS. This is often interpreted as contextualizing the curriculum to make it "female-friendly". The CS curriculum at Carnegie Mellon was not contextualized to be "female-friendly". Nevertheless, over the past few years, the school has attracted and graduated well above the US national average for women in undergraduate CS programs. We argue that this is due in large part to changes in the culture and environment of the department. As the environment has shifted from an unbalanced to a more balanced environment (balanced in terms of gender, breadth of student personalities, and professional support for women) the way has been opened for a range of students, including a significant number of women, to participate, and be successful, in the CS major. Our research shows that as men and women inhabit, and participate in, a more balanced environment

  20. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 4

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Daveler, S.A.

    1992-10-09

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, and also solves "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.
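
    The flavor of a reaction path step, advancing an irreversible reaction by a small progress increment and then re-solving the equilibrium speciation, can be shown with a toy strong-acid titration in which each step re-solves the water equilibrium by bisection. This is a caricature of EQ6's far more general solver (Python):

      import math

      def pH_after_acid(ca, Kw=1e-14):
          """pH of water with ca mol/L strong acid: solve the charge balance
          h - Kw/h - ca = 0 for h = [H+] by bisection."""
          lo, hi = 1e-14, 1.0
          for _ in range(200):
              mid = (lo + hi) / 2
              if mid - Kw / mid - ca > 0:
                  hi = mid
              else:
                  lo = mid
          return -math.log10((lo + hi) / 2)

      # "Reaction path": add acid in small irreversible increments and
      # re-equilibrate after each one, as EQ6 does for general reactions.
      ca = 0.0
      for step in range(5):
          ca += 2e-4                     # progress-variable increment (mol/L)
          print(f"xi = {ca:.1e} mol/L  ->  pH = {pH_after_acid(ca):.2f}")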

  1. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.; Daveler, S.A.

    1992-01-01

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, and also solves "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.

  2. Computational modeling of chemical reactions and interstitial growth and remodeling involving charged solutes and solid-bound molecules.

    Science.gov (United States)

    Ateshian, Gerard A; Nims, Robert J; Maas, Steve; Weiss, Jeffrey A

    2014-10-01

    Mechanobiological processes are rooted in mechanics and chemistry, and such processes may be modeled in a framework that couples their governing equations starting from fundamental principles. In many biological applications, the reactants and products of chemical reactions may be electrically charged, and these charge effects may produce driving forces and constraints that significantly influence outcomes. In this study, a novel formulation and computational implementation are presented for modeling chemical reactions in biological tissues that involve charged solutes and solid-bound molecules within a deformable porous hydrated solid matrix, coupling mechanics with chemistry while accounting for electric charges. The deposition or removal of solid-bound molecules contributes to the growth and remodeling of the solid matrix; in particular, volumetric growth may be driven by Donnan osmotic swelling, resulting from charged molecular species fixed to the solid matrix. This formulation incorporates the state of strain as a state variable in the production rate of chemical reactions, explicitly tying chemistry with mechanics for the purpose of modeling mechanobiology. To achieve these objectives, this treatment identifies the specific theoretical and computational challenges faced in modeling complex systems of interacting neutral and charged constituents while accommodating any number of simultaneous reactions where reactants and products may be modeled explicitly or implicitly. Several finite element verification problems are shown to agree with closed-form analytical solutions. An illustrative tissue engineering analysis demonstrates tissue growth and swelling resulting from the deposition of chondroitin sulfate, a charged solid-bound molecular species. This implementation is released in the open-source program FEBio (www.febio.org). The availability of this framework may be particularly beneficial to optimizing tissue engineering culture systems by examining the
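
    As one concrete piece of the physics described above, the ideal Donnan osmotic pressure produced by a fixed charge density can be written in closed form. The following Python sketch uses the standard ideal-solution result from cartilage mechanics; it is not the FEBio implementation, and all parameter values are illustrative.

    ```python
    import math

    # Ideal Donnan osmotic pressure for a tissue with fixed charge density c_F
    # bathed in a monovalent salt solution of concentration c_star (a standard
    # result used in cartilage mechanics; a sketch, not the FEBio implementation).
    R = 8.314        # gas constant, J/(mol K)
    T = 310.0        # absolute temperature, K (body temperature)

    def donnan_pressure(c_F, c_star):
        """Osmotic pressure difference (Pa); concentrations in mol/m^3."""
        # Interstitial ion concentrations satisfy c+ * c- = c_star^2 and
        # electroneutrality c+ - c- = c_F, so c+ + c- = sqrt(c_F^2 + 4 c_star^2).
        return R * T * (math.sqrt(c_F**2 + 4.0 * c_star**2) - 2.0 * c_star)

    # Example: c_F = 200 mEq/L = 200 mol/m^3, physiological saline c* = 150 mol/m^3
    print(f"Donnan pressure ~ {donnan_pressure(200.0, 150.0) / 1e3:.1f} kPa")
    ```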

  3. Functional high-resolution computed tomography of pulmonary vascular and airway reactions

    International Nuclear Information System (INIS)

    Herold, C.J.; Johns Hopkins Medical Institutions, Baltimore, MD; Brown, R.H.; Johns Hopkins Medical Institutions, Baltimore, MD; Johns Hopkins Medical Institutions, Baltimore, MD; Wetzel, R.C.; Herold, S.M.; Zeerhouni, E.A.

    1993-01-01

    We describe the use of high-resolution computed tomography (HRCT) for assessment of the function of pulmonary vessels and airways. With its excellent spatial resolution, HRCT is able to demonstrate pulmonary structures as small as 300 μm and can be used to monitor changes following various stimuli. HRCT also provides information about structures smaller than 300 μm through measurement of parenchymal background density. To date, sequential, spiral and ultrafast HRCT techniques have been used in a variety of challenges to gather information about the anatomical correlates of traditional physiological measurements, thus making anatomical-physiological correlation possible. HRCT of bronchial reactivity can demonstrate the location and time course of aerosol-induced bronchoconstriction and may show changes not apparent on spirometry. HRCT of the pulmonary vascular system visualizes adaptations of vessels during hypoxia and intravascular volume loading and elucidates cardiorespiratory interactions. Experimental studies provide a basis for potential clinical applications of this method. (orig.)

  4. Hydrodynamics and Water Quality forecasting over a Cloud Computing environment: INDIGO-DataCloud

    Science.gov (United States)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; García, Daniel; Monteoliva, Agustín

    2017-04-01

    Algal blooms due to eutrophication are an extended problem for water reservoirs and lakes that impacts directly on water quality. A bloom can create a dead zone that lacks enough oxygen to support life, and it can also be harmful to humans, so it must be controlled in water masses used for supply, bathing or other purposes. Hydrodynamic and water quality modelling can contribute to forecasting the status of the water system in order to alert authorities before an algal bloom event occurs. It can be used to predict scenarios and find solutions to reduce the harmful impact of the blooms. High-resolution models need to process a large amount of data using a sufficiently robust computing infrastructure. INDIGO-DataCloud (https://www.indigo-datacloud.eu/) is a European Commission-funded project that aims at developing a data and computing platform targeting scientific communities, deployable on multiple hardware platforms and provisioned over hybrid (private or public) e-infrastructures. The project addresses the development of solutions for different case studies using different Cloud-based alternatives. In the first INDIGO software release, a set of components is ready to manage the deployment of services to perform N Delft3D simulations (for calibration or scenario definition) over a Cloud computing environment, using Docker technology: TOSCA requirement description, Docker repository, Orchestrator, AAI (Authorization, Authentication) and OneData (Distributed Storage System). Moreover, the Future Gateway portal, based on Liferay, provides a user-friendly interface where the user can configure the simulations. Due to the data approach of INDIGO, the developed solutions can contribute to managing the full data life cycle of a project, thanks to different tools to manage datasets or even metadata. Furthermore, the cloud environment provides a dynamic, scalable and easy-to-use framework for non-IT-expert users. This framework is potentially capable of automating the processing of

  5. Using the computer-driven VR environment to promote experiences of natural world immersion

    Science.gov (United States)

    Frank, Lisa A.

    2013-03-01

    In December 2011, over 800 people experienced the exhibit :"der"//pattern for a virtual environment, created for the fully immersive CAVE™ at the University of Wisconsin-Madison. This exhibition took my nature-based photographic work and reinterpreted it for virtual reality (VR). Varied responses such as: "It's like a moment of joy," or "I had to see it twice," or "I'm still thinking about it weeks later" were common. Although an implied goal of my 2D artwork is to create a connection that makes viewers more aware of what it means to be a part of the natural world, these six VR environments opened up an unexpected area of inquiry that my 2D work has not. Even as the experience was mediated by machines, there was a softening at the interface between technology and human sensibility. Somehow, for some people, through the unlikely auspices of a computer-driven environment, the project spoke to a human essence that they connected with in a way that went beyond all expectations and felt completely out of my hands. Other interesting behaviors were noted: in some scenarios some spoke of intense anxiety, acrophobia, claustrophobia, even fear of death when the scene took them underground. These environments were believable enough to cause extreme responses and disorientation for some people; were fun, pleasant and wonder-filled for most; and were liberating, poetic and meditative for many others. The exhibition seemed to promote imaginative skills, creativity, emotional insight, and environmental sensitivity. It also revealed the CAVE™ to be a powerful tool that can encourage uniquely productive experiences. Quite by accident, I watched as these nature-based environments revealed and articulated an essential relationship between the human spirit and the physical world. The CAVE™ is certainly not a natural space, but there is clear potential to explore virtual environments as a path to better and deeper connections between people and nature. We've long associated contact

  6. Computing the cross sections of nuclear reactions with nuclear clusters emission for proton energies between 30 MeV and 2.6 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Korovin, Yu. A.; Maksimushkina, A. V., E-mail: AVMaksimushkina@mephi.ru; Frolova, T. A. [Obninsk Institute for Nuclear Power Engineering, National Research Nuclear University MEPhI (Moscow Engineering Physics Institute) (Russian Federation)

    2016-12-15

    The cross sections of nuclear reactions involving emission of clusters of light nuclei in proton collisions with a heavy-metal target are computed for incident-proton energies between 30 MeV and 2.6 GeV. The calculation relies on the ALICE/ASH and CASCADE/INPE computer codes. The parameters determining the pre-equilibrium cluster emission are varied in the computation.

  7. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., a stroke patient with limited hand, finger or arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw-top lids and spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle); a sketch of this idea follows below. An ethogram provides a record of behaviour feature values that forms the basis of a functional registry for handicapped players, supporting gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and
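
    A minimal sketch of the episodic ethogram idea, assuming a simple hit/miss event log and an illustrative adaptation rule (this is not the authors' system):

    ```python
    from dataclasses import dataclass, field

    # Sketch of the episodic ethogram idea described above (illustrative only):
    # gaming events are logged per episode and summarised into behaviour
    # features that an adaptive game could use to adjust difficulty.

    @dataclass
    class Episode:
        events: list = field(default_factory=list)  # "hit" / "miss" records

        def record(self, event: str):
            self.events.append(event)

        def hit_rate(self) -> float:
            return self.events.count("hit") / len(self.events) if self.events else 0.0

    episode = Episode()
    for e in ["hit", "miss", "hit", "hit", "miss"]:
        episode.record(e)

    # A simple adaptation rule: speed the target up when the player succeeds often.
    ball_speed = 1.0
    ball_speed *= 1.1 if episode.hit_rate() > 0.5 else 0.9
    print(f"hit rate {episode.hit_rate():.2f} -> ball speed {ball_speed:.2f}")
    ```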

  8. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: This was an IRB-approved, HIPAA-compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or the computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later, participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups' written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups' scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to that of the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees who underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test to those trained with hands-on high-fidelity simulation.

  9. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: This was an IRB-approved, HIPAA-compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or the computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later, participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups' written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups' scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to that of the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees who underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test to those trained with hands-on high-fidelity simulation.

  10. Optimizing the Use of Storage Systems Provided by Cloud Computing Environments

    Science.gov (United States)

    Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.

    2013-12-01

    Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many Cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon Cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: a software tool that emulates a traditional file system to store data in S3 performs poorly compared to storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be accessed directly, without relying on a specific software tool. To provide a hierarchical organization to the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and
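
    The catalog scheme described above can be pictured with a short sketch; the element names, key strings and file names below are hypothetical, not the actual NODC/OPeNDAP schema:

    ```python
    import xml.etree.ElementTree as ET

    # Sketch of the 'catalog' idea described above: an XML file mapping dataset
    # names to S3 access keys, with nested references to other catalogs providing
    # the hierarchy that S3's flat keyspace lacks. All names are illustrative.

    catalog = ET.Element("catalog", name="sst")
    ET.SubElement(catalog, "dataset", name="sst_2013_01.nc",
                  s3key="noaa-archive/sst/2013/sst_2013_01.nc")
    ET.SubElement(catalog, "dataset", name="sst_2013_02.nc",
                  s3key="noaa-archive/sst/2013/sst_2013_02.nc")
    # Like a directory, a catalog may point at child catalogs:
    ET.SubElement(catalog, "catalogRef", href="sst/2014/catalog.xml")

    ET.ElementTree(catalog).write("catalog.xml", xml_declaration=True, encoding="utf-8")
    print(ET.tostring(catalog, encoding="unicode"))
    ```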

  11. CDF-XL: computing cumulative distribution functions of reaction time data in Excel.

    Science.gov (United States)

    Houghton, George; Grange, James A

    2011-12-01

    In experimental psychology, central tendencies of reaction time (RT) distributions are used to compare different experimental conditions. This emphasis on the central tendency ignores additional information that may be derived from the RT distribution itself. One method for analysing RT distributions is to construct cumulative distribution frequency plots (CDFs; Ratcliff, Psychological Bulletin 86:446-461, 1979). However, this method is difficult to implement in widely available software, severely restricting its use. In this report, we present an Excel-based program, CDF-XL, for constructing and analysing CDFs, with the aim of making such techniques more readily accessible to researchers, including students (CDF-XL can be downloaded free of charge from the Psychonomic Society's online archive). CDF-XL functions as an Excel workbook and starts from the raw experimental data, organised into three columns (Subject, Condition, and RT) on an Input Data worksheet (a point-and-click utility is provided for achieving this format from a broader data set). No further preprocessing or sorting of the data is required. With one click of a button, CDF-XL will generate two forms of cumulative analysis: (1) "standard" CDFs, based on percentiles of participant RT distributions (by condition), and (2) a related analysis employing the participant means of rank-ordered RT bins. Both analyses involve partitioning the data in similar ways, but the first uses a "median"-type measure at the participant level, while the latter uses the mean. The results are presented in three formats: (i) by participants, suitable for entry into further statistical analysis; (ii) grand means by condition; and (iii) completed CDF plots in Excel charts.
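
    To make the two analyses concrete, here is a minimal Python sketch (not the Excel workbook itself) of the percentile-based CDF computation on the Subject/Condition/RT layout described above; the simulated data and percentile choices are illustrative.

    ```python
    import numpy as np
    import pandas as pd

    # Simulated data in the three-column layout described above.
    rng = np.random.default_rng(0)
    data = pd.DataFrame({
        "Subject":   np.repeat([1, 2, 3], 100),
        "Condition": np.tile(np.repeat(["congruent", "incongruent"], 50), 3),
        "RT":        rng.gamma(shape=5.0, scale=80.0, size=300) + 200,
    })

    percentiles = [10, 30, 50, 70, 90]

    def participant_cdf(rt):
        """Percentiles of one participant's RT distribution in one condition."""
        return pd.Series(np.percentile(rt, percentiles), index=percentiles)

    # (i) by participants, suitable for entry into further statistical analysis
    by_participant = (data.groupby(["Subject", "Condition"])["RT"]
                          .apply(participant_cdf))
    by_participant.index.names = ["Subject", "Condition", "Percentile"]

    # (ii) grand means by condition: average the participant-level percentiles
    grand_means = by_participant.groupby(level=["Condition", "Percentile"]).mean()
    print(grand_means.unstack("Percentile"))
    ```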

  12. Computational study of ethanol adsorption and reaction over rutile TiO2 (110) surfaces

    KAUST Repository

    Muir, J. N.

    2012-01-01

    Studies of the modes of adsorption and the associated changes in electronic structure of renewable organic compounds are needed in order to understand the fundamentals behind surface reactions of catalysts for future energies. Using plane-wave density functional theory (DFT) calculations, the adsorption of ethanol on perfect and O-defected TiO₂ rutile (110) surfaces was examined. On both surfaces the dissociative adsorption mode on five-fold coordinated Ti cations (Ti⁴⁺₅c) was found to be more favourable than the molecular adsorption mode. On the stoichiometric surface, E_ads was found to be 0.85 eV for the ethoxide mode and 0.76 eV for the molecular mode. These energies slightly increased when adsorption occurred on the Ti⁴⁺₅c closest to the O-defected site. However, both considerably increased when adsorption occurred at the removed bridging surface O, interacting with Ti³⁺ cations. In this case the dissociative adsorption becomes strongly favoured (E_ads = 1.28 eV for molecular adsorption and 2.27 eV for dissociative adsorption). The geometry and electronic structure of adsorbed ethanol were analysed in detail on the stoichiometric surface. Ethanol does not undergo major changes in its structure upon adsorption, with its C-O bond rotating nearly freely on the surface. Bonding to surface Ti atoms is a σ-type donation from the O 2p orbital of the ethanol/ethoxide species. Both ethanol and ethoxide present potential hole traps on O lone pairs. Charge density and work function analyses also suggest charge transfer from the adsorbate to the surface, with the dissociative adsorption mode showing a larger charge transfer than the molecular mode.
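
    For reference, the adsorption energies quoted above are consistent with the standard sign convention in which a larger positive E_ads means stronger binding; a plausible definition (assumed here, not quoted from the paper) is:

    ```latex
    % Standard adsorption-energy convention consistent with the positive values
    % quoted above (larger E_ads = stronger binding); assumed, not quoted:
    E_{\mathrm{ads}} = E_{\mathrm{slab}} + E_{\mathrm{EtOH}} - E_{\mathrm{slab+EtOH}}
    ```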

  13. SABER: a computational method for identifying active sites for new reactions.

    Science.gov (United States)

    Nosrati, Geoffrey R; Houk, K N

    2012-05-01

    A software suite, SABER (Selection of Active/Binding sites for Enzyme Redesign), has been developed for the analysis of atomic geometries in protein structures, using a geometric hashing algorithm (Barker and Thornton, Bioinformatics 2003;19:1644-1649). SABER is used to explore the Protein Data Bank (PDB) to locate proteins with a specific 3D arrangement of catalytic groups, to identify active sites that might be redesigned to catalyze new reactions. As a proof-of-principle test, SABER was used to identify enzymes that have the same catalytic group arrangement present in o-succinyl benzoate synthase (OSBS). Among the highest-scoring scaffolds identified by the SABER search for enzymes with the same catalytic group arrangement as OSBS were L-Ala D/L-Glu epimerase (AEE) and muconate lactonizing enzyme II (MLE), both of which have been redesigned to become effective OSBS catalysts, as demonstrated experimentally. Next, we used SABER to search for naturally existing active sites in the PDB with catalytic groups similar to those present in the designed Kemp elimination enzyme KE07. From over 2000 geometric matches to the KE07 active site, SABER identified 23 matches that corresponded to residues from known active sites. The best of these matches, with a 0.28 Å catalytic atom RMSD to KE07, was then redesigned to be compatible with the Kemp elimination using RosettaDesign. We also used SABER to search for potential Kemp eliminases using a theozyme predicted to provide a greater rate acceleration than the active site of KE07, and used Rosetta to create a design based on the proteins identified.
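
    The catalytic-atom RMSD that SABER scores matches on can be computed by optimal superposition; below is a Python sketch using the Kabsch algorithm with made-up coordinates, not SABER's geometric hashing code:

    ```python
    import numpy as np

    # Sketch of the measure SABER reports (catalytic-atom RMSD after optimal
    # superposition), using the Kabsch algorithm; coordinates are illustrative.

    def kabsch_rmsd(P, Q):
        """RMSD between two (N, 3) coordinate sets after optimal rotation."""
        P = P - P.mean(axis=0)                   # centre both atom sets
        Q = Q - Q.mean(axis=0)
        U, S, Vt = np.linalg.svd(P.T @ Q)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid an improper rotation
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        P_rot = P @ R.T
        return float(np.sqrt(np.mean(np.sum((P_rot - Q) ** 2, axis=1))))

    site_a = np.array([[0.0, 0.0, 0.0], [1.5, 0.1, 0.0], [0.2, 1.4, 0.3]])
    site_b = site_a + np.random.default_rng(1).normal(0, 0.2, site_a.shape)
    print(f"catalytic-atom RMSD: {kabsch_rmsd(site_a, site_b):.2f} Å")
    ```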

  14. Cosmic reionization on computers. II. Reionization history and its back-reaction on early galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Gnedin, Nickolay Y. [Particle Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kaurov, Alexander A., E-mail: gnedin@fnal.gov, E-mail: kaurov@uchicago.edu [Department of Astronomy and Astrophysics, The University of Chicago, Chicago, IL 60637 (United States)

    2014-09-20

    We compare the results from several sets of cosmological simulations of cosmic reionization, produced under the Cosmic Reionization On Computers project, with existing observational data on the high-redshift Lyα forest and the abundance of Lyα emitters. We find good consistency with the observational measurements and previous simulation work. By virtue of having several independent realizations for each set of numerical parameters, we are able to explore the effect of cosmic variance on observable quantities. One unexpected conclusion we are forced into is that cosmic variance is unusually large at z > 6, with both our simulations and, most likely, observational measurements still not fully converged for even such basic quantities as the average Gunn-Peterson optical depth or the volume-weighted neutral fraction. We also find that reionization has little effect on the early galaxies or on global cosmic star formation history, because galaxies whose gas content is affected by photoionization contain no molecular (i.e., star-forming) gas in the first place. In particular, measurements of the faint end of the galaxy luminosity function by the James Webb Space Telescope are unlikely to provide a useful constraint on reionization.

  15. Molecular modeling and computational simulation of the photosystem-II reaction center to address isoproturon resistance in Phalaris minor.

    Science.gov (United States)

    Singh, Durg Vijay; Agarwal, Shikha; Kesharwani, Rajesh Kumar; Misra, Krishna

    2012-08-01

    Isoproturon is the only herbicide that can control Phalaris minor, a competitive weed of wheat that developed resistance in 1992. Resistance against isoproturon was reported to be due to a mutation in the psbA gene, which encodes the isoproturon-binding D1 protein. Previously in our laboratory, a triazole derivative of isoproturon (TDI) was synthesized and found to be active against both susceptible and resistant biotypes at 0.5 kg/ha, but showed poor specificity. In the present study, the susceptible D1(S), resistant D1(R) and D2 proteins of the PS-II reaction center of P. minor were modeled and simulated, selecting the crystal structure of PS-II from Thermosynechococcus elongatus (2AXT.pdb) as template. Loop regions were refined, and the complete D1/D2 reaction center was simulated with GROMACS in a lipid (1-palmitoyl-2-oleoylglycero-3-phosphoglycerol, POPG) environment along with ligands and cofactor. Both the S and R models were energy-minimized using steepest descent, equilibrated with isotropic pressure coupling and temperature coupling using a Berendsen protocol, and subjected to 1,000 ps of MD simulation. As a result of the MD simulation, the best model obtained in the lipid environment had five chlorophylls, two plastoquinones, two pheophytins and a bicarbonate ion, along with the Fe cofactor and the oxygen-evolving center (OEC). The triazole derivative of isoproturon was used as the lead molecule for docking. The best docked conformation of TDI was chosen for receptor-based de novo ligand design. In silico designed molecules were screened and, as a result, only those molecules that showed higher docking and binding energies in comparison to isoproturon and its triazole derivative were proposed for synthesis, in order to obtain more potent, non-resistant and more selective TDI analogs.

  16. Being Bullied in Virtual Environments: Experiences and Reactions of Male and Female Students to a Male or Female Oppressor

    Directory of Open Access Journals (Sweden)

    Nicole Krämer

    2018-03-01

    Full Text Available Bullying is a pressing societal problem. As such, it is important to gain a better understanding of the mechanisms involved in bullying and of resilience factors which might protect victims. Moreover, it is necessary to provide tools that can train potential victims to strengthen their resilience. To facilitate both of these goals, the current study tests a recently developed virtual environment that puts participants in the role of a victim who is being oppressed by a superior. In a 2 × 2 between-subjects experiment (N = 81), we measured the effects of gender of the oppressor and gender of the participant on psychophysiological reactions, subjective experiences and willingness to report the event. The results reveal that even when a male and a female bully show the exact same behavior, the male bully is perceived as more threatening. In terms of gender of the victim, the only difference that emerged was a more pronounced increase in heart rate in males. The results were moderated by the personality factors social gender, neuroticism, and need to belong, while self-esteem did not show any moderating influence.

  17. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN) is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies, and is thus applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.

  18. Integrated Computational study of Material Lifetime in a Fusion Reactor Environment

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, M.; Dudarev, S.; Packer, L.; Zheng, S.; Sublet, J.-C., E-mail: mark.gilbert@ccfe.ac.uk [EURATOM/CCFE Fusion Association, Culham Centre for Fusion Energy, Abingdon (United Kingdom)

    2012-09-15

    Full text: The high-energy, high-intensity neutron fluxes produced by the fusion plasma will have a significant life-limiting impact on reactor components in both experimental and commercial fusion devices. Not only do the neutrons bombarding the materials induce atomic displacement cascades, leading to the accumulation of structural defects, but they also initiate nuclear reactions, which cause transmutation of the elemental atoms. Understanding the implications associated with the resulting compositional changes is one of the key outstanding issues related to fusion energy research. Several complementary computational techniques have been used to investigate the problem. Firstly, neutron-transport simulations, performed on a reference design for the demonstration fusion power plant (DEMO), quantify the variation in neutron irradiation conditions as a function of geometry. The resulting neutron fluxes and spectra are then used as input into inventory calculations, which allow the compositional changes of a material to be tracked in time. These calculations reveal that the production of helium (He) gas atoms, whose presence in a material is of particular concern because it can accumulate and cause swelling and embrittlement, will vary significantly, even within the same component of a reactor. Lastly, a density-functional-based model for He-induced grain-boundary embrittlement has been developed to predict the life-limiting consequences associated with relatively low concentrations of He in materials situated at various locations in the DEMO structure. The results suggest that some important fusion materials may be significantly more susceptible to this type of failure than others. (author)

  19. A study of the advantages & disadvantages of mobile cloud computing versus native environment

    OpenAIRE

    Almrot, Emil; Andersson, Sebastian

    2013-01-01

    The advent of cloud computing has enabled the possibility of moving complex calculations and device operations to the “cloud” in an effective way. With cloud computing being applied to mobile devices some of the primary constraints of mobile computing, such as battery life and hardware with less computational power, could be resolved by moving complex operations and computations to the cloud. This thesis aims to identify advantages and disadvantages associated with running cloud based applica...

  20. Enclosure environment characterization testing for the base line validation of computer fire simulation codes

    International Nuclear Information System (INIS)

    Nowlen, S.P.

    1987-03-01

    This report describes a series of fire tests conducted under the direction of Sandia National Laboratories for the US Nuclear Regulatory Commission. The primary purpose of these tests was to provide data against which to validate computer fire environment simulation models to be used in the analysis of nuclear power plant enclosure fire situations. Examples of the data gathered during three of the tests are presented, though the primary objective of this report is to provide a timely description of the test effort itself. These tests were conducted in an enclosure measuring 60 × 40 × 20 feet, constructed at the Factory Mutual Research Corporation fire test facility in Rhode Island. All of the tests utilized forced ventilation conditions. The ventilation system was designed to simulate typical nuclear power plant installation practices and ventilation rates. A total of 22 tests using simple gas burner, heptane pool, methanol pool, and PMMA solid fires were conducted. Four of these tests were conducted with a full-scale control room mockup in place. Parameters varied during testing were fire intensity, enclosure ventilation rate, and fire location. Data gathered include air temperatures, air velocities, radiative and convective heat flux levels, optical smoke densities, inner and outer enclosure surface temperatures, enclosure surface heat flux levels, and gas concentrations within the enclosure and in the exhaust stream.

  1. a New Initiative for Tiling, Stitching and Processing Geospatial Big Data in Distributed Computing Environments

    Science.gov (United States)

    Olasz, A.; Nguyen Thai, B.; Kristóf, D.

    2016-06-01

    Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.
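
    As a sketch of the divide-and-conquer tiling and stitching step described above (illustrative Python, not the IQLib implementation):

    ```python
    import numpy as np

    # Split a country-wide raster into fixed-size tiles that could be dispatched
    # to processing nodes, then stitch the processed tiles back together.

    def tile(raster, size):
        """Yield ((row, col), tile_array) chunks covering the raster."""
        rows, cols = raster.shape
        for r in range(0, rows, size):
            for c in range(0, cols, size):
                yield (r, c), raster[r:r + size, c:c + size]

    raster = np.random.rand(1000, 1500).astype(np.float32)
    result = np.empty_like(raster)

    for (r, c), chunk in tile(raster, 512):
        processed = np.sqrt(chunk)  # stand-in for a per-tile operation on a worker
        result[r:r + chunk.shape[0], c:c + chunk.shape[1]] = processed  # stitch

    assert np.allclose(result, np.sqrt(raster))
    ```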

  2. A NEW INITIATIVE FOR TILING, STITCHING AND PROCESSING GEOSPATIAL BIG DATA IN DISTRIBUTED COMPUTING ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2016-06-01

    Full Text Available Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing the lack of well-established distributed processing solutions tailored to the amount and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the concept of divide and conquer. Nevertheless, in the context of geospatial processing, most of the distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The above-mentioned prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigations on algorithmic and implementation details are in focus for the near future.

  3. A Novel Query Method for Spatial Data in Mobile Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Guangsheng Chen

    2018-01-01

    Full Text Available With the development of network communication and a 1000-fold increase in traffic demand from 4G to 5G, it is critical to provide an efficient and fast spatial data access interface for applications in mobile environments. In view of the low I/O efficiency and high latency of existing methods, this paper presents a memory-based spatial data query method that uses the distributed memory file system Alluxio to store data and builds a two-level index based on the Alluxio key-value structure, aiming to overcome the low efficiency of traditional methods. According to the characteristics of the Spark computing framework, a data input format for spatial data queries is proposed, which can selectively read file data and reduce data I/O. The comparative experiments show that the memory-based file system Alluxio has better I/O performance than a disk file system; compared with the traditional distributed query method, the method we propose greatly reduces retrieval time.
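
    The two-level index can be sketched as follows; a plain dictionary stands in for the Alluxio key-value structure, and the cell sizes and keys are illustrative:

    ```python
    # Two-level spatial grid index over a key-value store. Level 1 maps a coarse
    # grid cell to its fine cells; level 2 maps a fine cell to stored features,
    # so a window query touches only a few keys.

    COARSE, FINE = 10.0, 1.0
    kv = {}   # key -> value, as a key-value store would hold them

    def insert(feature_id, x, y):
        coarse = ("L1", int(x // COARSE), int(y // COARSE))
        fine = ("L2", int(x // FINE), int(y // FINE))
        kv.setdefault(coarse, set()).add(fine)
        kv.setdefault(fine, []).append((feature_id, x, y))

    def query(xmin, ymin, xmax, ymax):
        hits = []
        for cx in range(int(xmin // COARSE), int(xmax // COARSE) + 1):
            for cy in range(int(ymin // COARSE), int(ymax // COARSE) + 1):
                for fine in kv.get(("L1", cx, cy), ()):
                    for fid, x, y in kv.get(fine, ()):
                        if xmin <= x <= xmax and ymin <= y <= ymax:
                            hits.append(fid)
        return hits

    insert("road-1", 3.5, 12.2)
    insert("lake-7", 41.0, 8.9)
    print(query(0, 0, 20, 20))   # -> ['road-1']
    ```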

  4. Parallel sort with a ranged, partitioned key-value store in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.

    2016-01-26

    Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
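
    The sorting pipeline described above can be sketched compactly; threads and plain lists stand in for reader threads and range servers (illustrative Python, not the patented implementation):

    ```python
    import bisect
    from concurrent.futures import ThreadPoolExecutor

    # Reader threads locally sort their input and split it into ranges; "range
    # servers" (here: plain buckets) sort what they receive; concatenating the
    # sorted ranges yields the globally sorted result.

    inputs = [[17, 3, 42], [8, 25, 1], [30, 12, 49]]   # unsorted key lists
    splitters = [16, 33]                               # range boundaries -> 3 ranges
    buckets = [[], [], []]                             # one per "range server"

    def read_and_partition(data):
        local = sorted(data)                           # local sort by the reader
        for key in local:
            buckets[bisect.bisect_right(splitters, key)].append(key)

    with ThreadPoolExecutor() as pool:                 # one reader per input file
        list(pool.map(read_and_partition, inputs))

    globally_sorted = []
    for bucket in buckets:                             # each range server sorts its range
        globally_sorted.extend(sorted(bucket))
    print(globally_sorted)                             # [1, 3, 8, 12, 17, 25, 30, 42, 49]
    ```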

  5. Recommendations for Integrating a P300-Based Brain Computer Interface in Virtual Reality Environments for Gaming

    Directory of Open Access Journals (Sweden)

    Grégoire Cattan

    2018-05-01

    Full Text Available The integration of a P300-based brain–computer interface (BCI) into virtual reality (VR) environments is promising for the video games industry. However, it faces several limitations, mainly due to hardware constraints and constraints engendered by the stimulation needed by the BCI. The main limitation is still the low transfer rate that can be achieved by current BCI technology. The goal of this paper is to review current limitations and to provide application creators with design recommendations in order to overcome them. We also overview current VR and BCI commercial products in relation to the design of video games. An essential recommendation is to use the BCI only for non-complex and non-critical tasks in the game. Also, the BCI should be used to control actions that are naturally integrated into the virtual world. Finally, adventure and simulation games, especially if cooperative (multi-user), appear to be the best candidates for designing an effective VR game enriched by BCI technology.

  6. Integrating CAD modules in a PACS environment using a wide computing infrastructure.

    Science.gov (United States)

    Suárez-Cuenca, Jorge J; Tilve, Amara; López, Ricardo; Ferro, Gonzalo; Quiles, Javier; Souto, Miguel

    2017-04-01

    The aim of this paper is to describe a project designed to achieve the total integration of different CAD algorithms into the PACS environment by using a wide computing infrastructure, building a system for the entire region of Galicia, Spain, that makes CAD accessible to multiple hospitals employing different PACSs and clinical workstations. The new CAD model seeks to connect different devices (CAD systems, acquisition modalities, workstations and PACS) by means of networking based on a platform that will offer different CAD services. This paper describes some aspects related to the health services of the region where the project was developed, the CAD algorithms that were either employed or selected for inclusion in the project, and several technical aspects and results. We have built a standards-based platform with which users can request a CAD service and receive the results in their local PACS. The process runs through a web interface that allows sending data to the different CAD services. A DICOM SR object is received with the results of the algorithms, stored inside the original study in the proper folder with the original images. As a result, a homogeneous service will be offered to the different hospitals of the region. End users will benefit from a homogeneous workflow and a standardised integration model to request and obtain results from CAD systems in any modality, not dependent on commercial integration models. This new solution will foster the deployment of these technologies in the entire region of Galicia.

  7. Self-paced brain-computer interface control of ambulation in a virtual reality environment

    Science.gov (United States)

    Wang, Po T.; King, Christine E.; Chui, Luis A.; Do, An H.; Nenadic, Zoran

    2012-10-01

    Objective. Spinal cord injury (SCI) often leaves affected individuals unable to ambulate. Electroencephalogram (EEG)-based brain-computer interface (BCI) controlled lower extremity prostheses may restore intuitive and able-body-like ambulation after SCI. To test its feasibility, the authors developed and tested a novel EEG-based, data-driven BCI system for intuitive and self-paced control of the ambulation of an avatar within a virtual reality environment (VRE). Approach. Eight able-bodied subjects and one with SCI underwent the following 10-min training session: subjects alternated between idling and walking kinaesthetic motor imageries (KMI) while their EEG were recorded and analysed to generate subject-specific decoding models. Subjects then performed a goal-oriented online task, repeated over five sessions, in which they utilized the KMI to control the linear ambulation of an avatar and make ten sequential stops at designated points within the VRE. Main results. The average offline training performance across subjects was 77.2±11.0%, ranging from 64.3% (p = 0.00176) to 94.5% (p = 6.26×10⁻²³), with chance performance being 50%. The average online performance was 8.5±1.1 (out of 10) successful stops and 303±53 s completion time (perfect = 211 s). All subjects achieved performances significantly different from those of random walk, suggesting that BCI-controlled lower extremity prosthesis systems may be feasible.

  8. Gesture recognition based on computer vision and glove sensor for remote working environments

    Energy Technology Data Exchange (ETDEWEB)

    Chien, Sung Il; Kim, In Chul; Baek, Yung Mok; Kim, Dong Su; Jeong, Jee Won; Shin, Kug [Kyungpook National University, Taegu (Korea)

    1998-04-01

    In this research, we defined a gesture set needed for remote monitoring and control of an unmanned system in nuclear power station environments. Here, we define a command as the locus of a gesture. We aim at the development of an algorithm using a vision sensor and glove sensors in order to implement the gesture recognition system. The gesture recognition system based on computer vision tracks a hand by using cross-correlation of PDOE images. To recognize a gesture word, the 8-direction code is employed as the input symbol for a discrete HMM. The sensor-based gesture recognition system uses a Pinch glove and a Polhemus sensor as input devices. The features extracted through preprocessing act as the input signal of the recognizer. For recognizing the 3D loci from the Polhemus sensor, a discrete HMM is also adopted. An alternative approach to the two foregoing recognition systems uses the vision and glove sensors together. The extracted mesh feature and the 8-direction code from the locus tracking are introduced to further enhance recognition performance. An MLP trained by backpropagation is also introduced, and its performance is compared to that of the discrete HMM. (author). 32 refs., 44 figs., 21 tabs.
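
    The 8-direction code can be sketched as a chain-coding step that quantises the tracked locus into observation symbols for a discrete HMM; the following Python is illustrative, not the authors' code (a y-up coordinate convention is assumed):

    ```python
    import math

    # Quantise the tracked hand locus into eight direction symbols, which become
    # the observation sequence for a discrete HMM.

    def chain_code(points):
        symbols = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
            symbols.append(int((angle + math.pi / 8) / (math.pi / 4)) % 8)
        return symbols  # 0 = east, 2 = north, 4 = west, 6 = south

    # A hand moving right, then up, then left:
    locus = [(0, 0), (5, 0), (10, 1), (10, 6), (9, 11), (4, 11)]
    print(chain_code(locus))   # -> [0, 0, 2, 2, 4]
    ```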

  9. Virtual memory support for distributed computing environments using a shared data object model

    Science.gov (United States)

    Huang, F.; Bacon, J.; Mapp, G.

    1995-12-01

    Conventional storage management systems provide one interface for accessing memory segments and another for accessing secondary storage objects. This hinders application programming and affects overall system performance due to mandatory data copying and user/kernel boundary crossings, which in the microkernel case may involve context switches. Memory-mapping techniques may be used to provide programmers with a unified view of the storage system. This paper extends such techniques to support a shared data object model for distributed computing environments in which good support for coherence and synchronization is essential. The approach is based on a microkernel, typed memory objects, and integrated coherence control. A microkernel architecture is used to support multiple coherence protocols and the addition of new protocols. Memory objects are typed and applications can choose the most suitable protocols for different types of object to avoid protocol mismatch. Low-level coherence control is integrated with high-level concurrency control so that the number of messages required to maintain memory coherence is reduced and system-wide synchronization is realized without severely impacting the system performance. These features together constitute a novel approach to supporting flexible coherence under application control.

  10. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  11. Proceedings of the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology, Volume 1

    Science.gov (United States)

    Hyde, Patricia R.; Loftin, R. Bowen

    1993-01-01

    These proceedings are organized in the same manner as the conference's contributed sessions, with the papers grouped by topic area. These areas are as follows: VE (virtual environment) Training for Space Flight, Virtual Environment Hardware, Knowledge Acquisition for ICAT (Intelligent Computer-Aided Training) & VE, Multimedia in ICAT Systems, VE in Training & Education (1 & 2), Virtual Environment Software (1 & 2), Models in ICAT Systems, ICAT Commercial Applications, ICAT Architectures & Authoring Systems, ICAT Education & Medical Applications, Assessing VE for Training, VE & Human Systems (1 & 2), ICAT Theory & Natural Language, ICAT Applications in the Military, VE Applications in Engineering, Knowledge Acquisition for ICAT, and ICAT Applications in Aerospace.

  12. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-01-01

    The conventional visualization of simulation results in a post-processing step is thus transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation.

  13. Exploring the Effects of Web-Mediated Computational Thinking on Developing Students' Computing Skills in a Ubiquitous Learning Environment

    Science.gov (United States)

    Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu

    2017-01-01

    Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…

  14. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate the partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10⁻⁷. Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
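
    The Map/Reduce decomposition described above has a simple structure, sketched below in plain Python (a stand-in for the Hadoop implementation; the filtering and backprojection steps are reduced to placeholders):

    ```python
    import numpy as np
    from functools import reduce

    # Each Map call filters and backprojects a subset of projections into a
    # partial volume; Reduce sums the partial volumes into the reconstruction.

    VOLUME_SHAPE = (64, 64, 64)

    def map_backproject(projection_subset):
        partial = np.zeros(VOLUME_SHAPE, dtype=np.float32)
        for proj in projection_subset:
            filtered = proj              # placeholder for ramp filtering
            partial += filtered.mean()   # placeholder for the backprojection step
        return partial

    def reduce_sum(vol_a, vol_b):
        return vol_a + vol_b             # aggregation is a voxel-wise sum

    projections = np.random.rand(360, 128, 128).astype(np.float32)
    subsets = np.array_split(projections, 8)        # one subset per Map task
    partials = map(map_backproject, subsets)        # would run in parallel on Hadoop
    volume = reduce(reduce_sum, partials)
    print(volume.shape)
    ```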

  15. A computational study of the Diels-Alder reactions between 2,3-dibromo-1,3-butadiene and maleic anhydride

    Science.gov (United States)

    Rivero, Uxía; Meuwly, Markus; Willitsch, Stefan

    2017-09-01

    The neutral and cationic Diels-Alder-type reactions between 2,3-dibromo-1,3-butadiene and maleic anhydride have been computationally explored as the first step of a combined experimental and theoretical study. Density functional theory calculations show that the neutral reaction is concerted while the cationic reaction can be either concerted or stepwise. Further isomerizations of the Diels-Alder products have been studied in order to predict possible fragmentation pathways in gas-phase experiments. Rice-Ramsperger-Kassel-Marcus (RRKM) calculations suggest that under single-collision experimental conditions the neutral product may reform the reactants and the cationic product will most likely eliminate CO2.

  16. Energy Consumption and Indoor Environment Predicted by a Combination of Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm

    2003-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics (CFD) program for room air distribution is introduced for improved prediction of both energy consumption and the indoor environment. The article describes a calculation

  17. Examining the Roles of Blended Learning Approaches in Computer-Supported Collaborative Learning (CSCL) Environments: A Delphi Study

    Science.gov (United States)

    So, Hyo-Jeong; Bonk, Curtis J.

    2010-01-01

    In this study, a Delphi method was used to identify and predict the roles of blended learning approaches in computer-supported collaborative learning (CSCL) environments. The Delphi panel consisted of experts in online learning from different geographic regions of the world. This study discusses findings related to (a) pros and cons of blended…

  18. Virtual reality exposure treatment of agoraphobia: a comparison of computer automatic virtual environment and head-mounted display

    NARCIS (Netherlands)

    Meyerbröker, K.; Morina, N.; Kerkhof, G.; Emmelkamp, P.M.G.; Wiederhold, B.K.; Bouchard, S.; Riva, G.

    2011-01-01

    In this study the effects of virtual reality exposure therapy (VRET) were investigated in patients with panic disorder and agoraphobia. The level of presence in VRET was compared between using either a head-mounted display (HMD) or a computer automatic virtual environment (CAVE). Results indicate

  19. The Plant-Window System: A framework for an integrated computing environment at advanced nuclear power plants

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1997-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The extensive use of computer technology in advanced reactor designs provides the opportunity to greatly expand the capability to obtain, analyze, and present data about the plant to station personnel. However, to support highly efficient and increasingly safe operation of nuclear power plants, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed On Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications within a common computing environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces so as to define a flexible computing environment for both current generation nuclear power plants and advanced reactor designs.

  20. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    Science.gov (United States)

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…