WorldWideScience

Sample records for reaction environments computational

  1. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    Science.gov (United States)

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time of human subjects processing information presented in the visual channel, under both direct vision and a virtual rehabilitation environment, while walking. The visual stimulus comprised eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and participants' reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program in a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments, and reaction time differs between direct vision and a virtual environment.

  2. Modelling human behaviours and reactions under dangerous environment

    OpenAIRE

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification on the conscious/subconscious behaviors and reactions...

  3. Modeling human behaviors and reactions under dangerous environment.

    Science.gov (United States)

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers, in real-time, in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures with the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the characters' perceptions, decision making, movements and interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, safety planning in chemical factories, and the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence; the accurate modeling of human vision, smell, touch and hearing; and the diversity and effects of emotion and personality in decision making. There are three types of software platform that could be employed to realize motion and intelligence within one system, and their advantages and disadvantages are discussed.

  4. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, among which the BES environment, supporting high energy physics (HEP) computing, is the main part. Following the course of improving and extending the BES environment, the authors outline the development of these computing environments from the viewpoint of HEP environment establishment. The direction of development towards distributed computing in the IHEP computing environment, based on current trends in distributed computing, is also presented

  5. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
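    The logged-event model in this record lends itself to a small sketch. Everything below (class names, the dict-based "environment") is hypothetical, assuming only that each logged event carries its own undo action:

```python
# A minimal sketch of a computing-environment logbook with search and undo,
# assuming each logged event carries its own undo action (names hypothetical).
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Event:
    description: str          # human-readable history entry
    undo: Callable[[], None]  # action that reverses this event

@dataclass
class Logbook:
    history: List[Event] = field(default_factory=list)

    def log(self, description: str, undo: Callable[[], None]) -> None:
        self.history.append(Event(description, undo))

    def search(self, term: str) -> List[Event]:
        # search the history of past events for a matching description
        return [e for e in self.history if term in e.description]

    def undo_events(self, events: List[Event]) -> None:
        # undo the selected past events, most recent first
        for e in sorted(events, key=self.history.index, reverse=True):
            e.undo()
            self.history.remove(e)

# usage: track edits to a dict-based "environment"
env = {}
book = Logbook()
env["x"] = 1
book.log("set x=1", undo=lambda: env.pop("x"))
env["y"] = 2
book.log("set y=2", undo=lambda: env.pop("y"))
book.undo_events(book.search("x"))
print(sorted(env))  # the x edit is reversed; only y remains
```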

  6. The reaction environment in a filter-press laboratory reactor: the FM01-LC flow cell

    International Nuclear Information System (INIS)

    Rivera, Fernando F.; León, Carlos Ponce de; Walsh, Frank C.; Nava, José L.

    2015-01-01

    A parallel plate cell facilitating controlled flow in a rectangular channel and capable of incorporating a wide range of electrode materials is important in studies of electrode reactions prior to process development and scale-up. The FM01-LC is a versatile laboratory-scale, plane parallel, filter-press type electrochemical cell (having a projected electrode area of 64 cm²) based on the larger FM21-SP electrolyser (2100 cm² area). Many laboratories have used this type of reactor to quantify the importance of the reaction environment in fundamental studies and to prepare for industrial applications. A number of papers have concerned the experimental characterization and computational modelling of its reaction environment, but the experimental and computational data have become dispersed. The cell has been used in a diverse range of synthesis and processing applications which require controlled flow and a known reaction environment. In a previous review, the cell construction and reaction environment were summarised, followed by an illustration of its use for a range of applications that include organic and inorganic electrosynthesis, metal ion removal, energy storage, environmental remediation (e.g., metal recycling or anodic destruction of organics) and drinking water treatment. This complementary review considers the characteristics of the FM01-LC electrolyser as an example of a well-engineered flow cell facilitating cell scale-up and provides a rigorous analysis of its reaction environment. Particular aspects include the influence of electrolyte velocity on mass transport rates, flow dispersion and current distribution

  7. Reaction Diffusion Voronoi Diagrams: From Sensors Data to Computing

    Directory of Open Access Journals (Sweden)

    Alejandro Vázquez-Otero

    2015-05-01

    In this paper, a new method to solve computational problems using reaction diffusion (RD) systems is presented. The novelty relies on the use of a model configuration that tailors its spatiotemporal dynamics to develop Voronoi diagrams (VD) as a part of the system's natural evolution. The proposed framework is deployed in the solution of related robotic problems, where the generalized VD are used to identify topological places in a grid map of the environment that is created from sensor measurements. The ability of the RD-based computation to integrate external information, such as a grid map representing the environment in the model's computational grid, permits a direct integration of sensor data into the model dynamics. The experimental results indicate that this method exhibits significantly less sensitivity to noisy data than the standard algorithms for determining VD in a grid. In addition, previous drawbacks of computational algorithms based on RD models, such as the generation of volatile solutions by means of excitable waves, are now overcome by final stable states.
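    As a loose illustration of the idea (not the paper's RD model), Voronoi regions on a grid map can be grown by simultaneous wavefronts expanding from seed points, a discrete stand-in for the travelling waves an RD system develops; all names below are hypothetical:

```python
# Toy Voronoi labelling of a grid map by multi-source BFS: wavefronts grow
# from all seeds at once, a discrete analogue of RD wave propagation.
from collections import deque

def grid_voronoi(width, height, seeds, blocked=frozenset()):
    """Label each free cell with the index of the nearest seed (4-neighbour BFS)."""
    label = {}
    q = deque()
    for i, s in enumerate(seeds):
        label[s] = i
        q.append(s)
    while q:
        x, y = q.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < width and 0 <= ny < height \
               and (nx, ny) not in blocked and (nx, ny) not in label:
                label[(nx, ny)] = label[(x, y)]  # wavefront annexes the free cell
                q.append((nx, ny))
    return label  # cells where two labels meet approximate the Voronoi ridge

labels = grid_voronoi(8, 8, [(0, 0), (7, 7)])
print(labels[(1, 1)], labels[(6, 6)])  # each corner belongs to its nearer seed
```

The `blocked` set plays the role of obstacles in the sensor-built grid map; wavefronts simply flow around them.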

  8. Plasmon-driven sequential chemical reactions in an aqueous environment.

    Science.gov (United States)

    Zhang, Xin; Wang, Peijie; Zhang, Zhenglong; Fang, Yurui; Sun, Mengtao

    2014-06-24

    Plasmon-driven sequential chemical reactions were successfully realized in an aqueous environment. In an electrochemical environment, sequential chemical reactions were driven by an applied potential and laser irradiation. Furthermore, the rate of the chemical reaction was controlled via pH, which provides indirect evidence that the hot electrons generated from plasmon decay play an important role in plasmon-driven chemical reactions. In acidic conditions, the hot electrons were captured by the abundant H(+) in the aqueous environment, which prevented the chemical reaction. The developed plasmon-driven chemical reactions in an aqueous environment will significantly expand the applications of plasmon chemistry and may provide a promising avenue for green chemistry using plasmon catalysis in aqueous environments under irradiation by sunlight.

  9. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  10. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  11. Elucidating reaction mechanisms on quantum computers.

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M; Wecker, Dave; Troyer, Matthias

    2017-07-18

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  12. Identifying Reaction Pathways and their Environments

    DEFF Research Database (Denmark)

    Maronsson, Jon Bergmann

    Finding the mechanisms and estimating the rates of chemical reactions is an essential part of modern research on atomic scale systems. In this thesis, the application of well established methods for reaction rates and paths to systems important for hydrogen storage is considered before developing … extensions to further identify the reaction environment for a more accurate rate. Complex borohydrides are materials of high hydrogen storage capacity and high thermodynamic stability (too high for hydrogen storage). In an effort to gain insight into the structural transitions of two such materials, Ca(BH4 … -interstitial defects. In good agreement with the experiments, C3-type rotations activate at lower temperature than C2-type rotations. In order to investigate the environment of reaction pathways, a method for finding the ridge between first order saddle points on a multidimensional surface was developed

  13. Accurate atom-mapping computation for biochemical reactions.

    Science.gov (United States)

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to take into account the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on bond propensity to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
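    The MWED metric itself is a MILP; as a hedged toy stand-in only, the following brute-force sketch enumerates element-preserving bijections and scores each by the number of bonds broken plus bonds formed. The function name and the tiny HCN/HNC isomerisation example are illustrative, not from the paper:

```python
# Brute-force toy atom mapping: try every element-preserving bijection of
# reactant atoms onto product atoms and keep the one with the fewest
# broken + formed bonds (a crude unweighted cousin of the MWED objective).
from itertools import permutations

def best_mapping(r_elems, r_bonds, p_elems, p_bonds):
    """Return (mapping, cost): reactant atom i maps to product atom mapping[i]."""
    n = len(r_elems)
    best = (None, float("inf"))
    for perm in permutations(range(n)):
        if any(r_elems[i] != p_elems[perm[i]] for i in range(n)):
            continue  # atoms must map onto atoms of the same element
        mapped = {frozenset((perm[a], perm[b])) for a, b in r_bonds}
        prod = {frozenset(b) for b in p_bonds}
        cost = len(mapped ^ prod)  # bonds broken + bonds formed
        if cost < best[1]:
            best = (perm, cost)
    return best

# toy HCN -> HNC isomerisation: atoms 0=H, 1=C, 2=N
r_elems, r_bonds = ["H", "C", "N"], [(0, 1), (1, 2)]  # H bonded to C
p_elems, p_bonds = ["H", "C", "N"], [(0, 2), (1, 2)]  # H bonded to N
mapping, cost = best_mapping(r_elems, r_bonds, p_elems, p_bonds)
print(mapping, cost)  # identity mapping; one bond broken, one formed
```

Factorial enumeration obviously does not scale; that is precisely why the paper formulates the problem as a MILP.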

  14. Physiological environment induce quick response - slow exhaustion reactions

    Directory of Open Access Journals (Sweden)

    Noriko Hiroi

    2011-09-01

    In vivo environments are highly crowded and inhomogeneous, which may affect reaction processes in cells. In this study we examined the effects of intracellular crowding and inhomogeneity on the behavior of in vivo reactions by calculating the spectral dimension (ds), which can be translated into the reaction rate function. We compared estimates of anomaly parameters obtained from Fluorescence Correlation Spectroscopy (FCS) data with fractal dimensions derived from Transmission Electron Microscopy (TEM) image analysis. FCS analysis indicated that the anomalous property was linked to physiological structure. Subsequent TEM analysis provided an in vivo illustration; soluble molecules likely percolate between intracellular clusters, which are constructed in a self-organizing manner. We estimated the cytoplasmic spectral dimension ds to be 1.39 ± 0.084. This result suggests that in vivo reactions initially run faster than the same reactions in a homogeneous space; this conclusion is consistent with the anomalous character indicated by FCS analysis. We further showed that these results were compatible with our Monte Carlo simulation, in which the anomalous behavior of mobile molecules correlates with the intracellular environment, described as a percolation cluster, as demonstrated by the TEM analysis. We confirmed by simulation that the above-mentioned in vivo-like properties differ from those of homogeneously concentrated environments. Additionally, the simulation results indicated that the crowding level of an environment might affect the diffusion rate of reactants. Such spatial information enables us to construct realistic models for in vivo diffusion and reaction systems.
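    In the spirit of the Monte Carlo simulation mentioned in this record (not the authors' code), a minimal sketch of how crowding slows effective diffusion on a lattice might look like this; every parameter and name below is an assumption for illustration:

```python
# Toy Monte Carlo: random walkers on a lattice with randomly blocked sites
# accumulate a smaller mean-square displacement than in obstacle-free space.
import random

def mean_square_displacement(crowding, steps=200, walkers=300, seed=1):
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(walkers):
        # fresh random obstacle field per walker at the given crowding fraction
        blocked = {(rng.randrange(-50, 51), rng.randrange(-50, 51))
                   for _ in range(int(crowding * 101 * 101))}
        x = y = 0
        for _ in range(steps):
            dx, dy = rng.choice(moves)
            if (x + dx, y + dy) not in blocked:  # crowding rejects some moves
                x, y = x + dx, y + dy
        total += x * x + y * y
    return total / walkers

free = mean_square_displacement(crowding=0.0)
crowded = mean_square_displacement(crowding=0.3)
print(free > crowded)  # obstacles lower the effective diffusion rate
```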

  15. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    Zhang Bin; Zhu Jizhou

    2006-01-01

    In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Due to sodium's high chemical reactivity, sodium would react with concrete violently, releasing large amounts of hydrogen gas and heat and threatening the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It can give the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat and so on. Concrete was considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced into the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and boundary depth was transformed into that of the penetration rate and boundary depth. The complex chemical kinetics equations were simplified under some hypotheses. All the techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are described in detail. Good agreement on the overall transient behavior was obtained in the analysis of a series of sodium-concrete reaction experiments. The comparison between the analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions, and it could be used for nuclear safety assessment. (authors)

  16. Glider-based computing in reaction-diffusion hexagonal cellular automata

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Wuensche, Andrew; De Lacy Costello, Benjamin

    2006-01-01

    A three-state hexagonal cellular automaton, discovered in [Wuensche A. Glider dynamics in 3-value hexagonal cellular automata: the beehive rule. Int J Unconvention Comput, in press], presents a conceptual discrete model of a reaction-diffusion system with inhibitor and activator reagents. The automaton model of reaction-diffusion exhibits mobile localized patterns (gliders) in its space-time dynamics. We show how to implement the basic computational operations with these mobile localizations, and thus demonstrate collision-based logical universality of the hexagonal reaction-diffusion cellular automaton
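    The beehive rule itself is defined in the cited paper; as a hedged sketch of the underlying machinery only, here is a 3-state cellular automaton on a hexagonal lattice in axial coordinates with a placeholder totalistic rule (not Wuensche's rule):

```python
# 3-state cellular automaton on a hexagonal lattice (axial coordinates).
# The six axial offsets below are the standard hex neighbourhood; the update
# rule is a placeholder, NOT the beehive rule from the cited paper.
HEX_NEIGHBOURS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def step(cells, rule):
    """cells: {(q, r): state in {0, 1, 2}}; rule(state, counts) -> new state."""
    nxt = {}
    for (q, r), s in cells.items():
        counts = [0, 0, 0]  # how many neighbours sit in each of the 3 states
        for dq, dr in HEX_NEIGHBOURS:
            counts[cells.get((q + dq, r + dr), 0)] += 1
        nxt[(q, r)] = rule(s, tuple(counts))
    return nxt

# placeholder rule: a quiescent cell with exactly two state-1 neighbours activates
def demo_rule(state, counts):
    return 1 if state == 0 and counts[1] == 2 else 0

grid = {(q, r): 0 for q in range(-3, 4) for r in range(-3, 4)}
grid[(0, 0)] = grid[(1, 0)] = 1          # a two-cell seed pattern
nxt = step(grid, demo_rule)              # only the two common neighbours fire
```

Gliders and collision-based logic gates arise once a carefully chosen rule (such as the beehive rule) replaces `demo_rule`.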

  17. Computational prediction of chemical reactions: current status and outlook.

    Science.gov (United States)

    Engkvist, Ola; Norrby, Per-Ola; Selmi, Nidhal; Lam, Yu-Hong; Peng, Zhengwei; Sherer, Edward C; Amberg, Willi; Erhard, Thomas; Smyth, Lynette A

    2018-06-01

    Over the past few decades, various computational methods have become increasingly important for discovering and developing novel drugs. Computational prediction of chemical reactions is a key part of an efficient drug discovery process. In this review, we discuss important parts of this field, with a focus on utilizing reaction data to build predictive models, the existing programs for synthesis prediction, and usage of quantum mechanics and molecular mechanics (QM/MM) to explore chemical reactions. We also outline potential future developments with an emphasis on pre-competitive collaboration opportunities. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and can provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated, covering the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  19. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Waker, A.J.; Prestwich, W.V.

    1998-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10⁹ dm³ mol⁻¹ s⁻¹. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (orig.)
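    The scavenging numbers in this abstract can be sanity-checked directly: with rate constant k = 3e9 dm³ mol⁻¹ s⁻¹, exponential survival exp(-k[S]t) gives a mean hydroxyl lifetime of 1/(k[S]), which at 0.5 M is of the same order as the 1e-9 s cut-off of the time-limited method. The diffusion coefficient below is an assumed textbook-style value, not taken from the abstract:

```python
# Back-of-envelope check of the exponential-survival scavenging model:
# mean .OH lifetime 1/(k[S]) and the corresponding 3D diffusion length.
import math

k = 3e9       # rate constant, dm^3 mol^-1 s^-1 (from the abstract)
D = 2.8e-9    # m^2/s, ASSUMED hydroxyl diffusion coefficient (not from the abstract)

for conc in (0.5, 0.15):                   # scavenger molarity, mol/dm^3
    lifetime = 1.0 / (k * conc)            # mean survival time under exp(-k[S]t)
    length = math.sqrt(6 * D * lifetime)   # 3D diffusion length, sqrt(6 D t)
    print(f"[S]={conc} M: lifetime={lifetime:.2e} s, length={length * 1e9:.2f} nm")
```

At 0.5 M the lifetime is about 0.67 ns, consistent with the abstract's remark that a 1 ns chemistry cut-off mimics the 0.5 M scavenger case.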

  20. The cellular environment in computer simulations of radiation-induced damage to DNA

    International Nuclear Information System (INIS)

    Moiseenko, V.V.; Hamm, R.N.; Waker, A.J.; Prestwich, W.V.

    1988-01-01

    Radiation-induced DNA single- and double-strand breaks were modeled for 660 keV photon radiation and scavenger capacity mimicking the cellular environment. Atomistic representation of DNA in B form with a first hydration shell was utilized to model direct and indirect damage. Monte Carlo generated electron tracks were used to model energy deposition in matter and to derive initial spatial distributions of species which appear in the medium following radiolysis. Diffusion of species was followed with time, and their reactions with DNA and each other were modeled in an encounter-controlled manner. Three methods to account for hydroxyl radical diffusion in a cellular environment were tested: assumed exponential survival, time-limited modeling and modeling of reactions between hydroxyl radicals and scavengers in an encounter-controlled manner. Although the method based on modeling scavenging in an encounter-controlled manner is more precise, it requires substantially more computer resources than either the exponential or time-limiting method. Scavenger concentrations of 0.5 and 0.15 M were considered using exponential and encounter-controlled methods with reaction rate set at 3 × 10⁹ dm³ mol⁻¹ s⁻¹. Diffusion length and strand break yields, predicted by these two methods for the same scavenger molarity, were different by 20%-30%. The method based on limiting time of chemistry follow-up to 10⁻⁹ s leads to DNA damage and radical diffusion estimates similar to 0.5 M scavenger concentration in the other two methods. The difference observed in predictions made by the methods considered could be tolerated in computer simulations of DNA damage. (author)

  1. Computed Potential Energy Surfaces and Minimum Energy Pathways for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. For some dynamics methods, global potential energy surfaces are required. In this case, it is necessary to obtain the energy at a complete sampling of all the energetically accessible arrangements of the nuclei, and then a fitting function must be obtained to interpolate between the computed points. In other cases, characterization of the stationary points and the reaction pathway connecting them is sufficient. These properties may be readily obtained using analytical derivative methods. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications, including global potential energy surfaces for H + O2, H + N2 and O(3P) + H2, and reaction pathways for complex reactions, including reactions leading to NO and soot formation in hydrocarbon combustion.
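    The "fitting function" step described above can be illustrated with a minimal sketch: given a few computed single-point energies along a bond-stretch coordinate, a parabola through the three lowest points locates an interpolated minimum. The data values are fabricated for illustration only:

```python
# Locate the minimum of a potential energy scan by fitting a parabola
# through three computed points (Lagrange-style closed form).
def parabola_minimum(pts):
    """pts: three (x, E) tuples; return x at the vertex of the fitted parabola."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    return -b / (2 * a)  # vertex of a*x^2 + b*x + c

# fabricated ab initio scan points (bond length in angstrom, energy in hartree)
scan = [(0.9, -1.160), (1.0, -1.174), (1.1, -1.170)]
r_eq = parabola_minimum(scan)
print(round(r_eq, 3))  # interpolated equilibrium bond length
```

Production fits use global functional forms or splines over many points; the three-point parabola is just the simplest instance of "interpolate between the computed points".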

  2. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment automatically navigates back to the reach location and the object is copied into the reach location.
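    A plain-Python sketch of the interaction this record describes; all class, method and location names are hypothetical:

```python
# Sketch of the reach-and-get interaction: invoke "reach" at one location,
# navigate away, invoke "get" on an object, and the environment returns to
# the reach location with the object copied in.
class Environment:
    def __init__(self):
        self.locations = {"desktop": [], "folder": ["report.txt"]}
        self.current = "desktop"
        self._reach_from = None

    def reach(self):
        self._reach_from = self.current      # remember where reach was invoked

    def navigate(self, location):
        self.current = location

    def get(self, obj):
        self.locations[self._reach_from].append(obj)  # copy object back
        self.current = self._reach_from      # auto-navigate to reach location

env = Environment()
env.reach()                 # reach invoked from the desktop
env.navigate("folder")      # user browses elsewhere
env.get("report.txt")       # get the object...
print(env.current, env.locations["desktop"])  # ...back at the desktop with a copy
```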

  3. Reaction-Diffusion Automata Phenomenology, Localisations, Computation

    CERN Document Server

    Adamatzky, Andrew

    2013-01-01

    Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, the media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. These media are at the heart of most natural processes, including the morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm of studying reaction-diffusion and excitable media using locally-connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential user-friendly tool for modelling natural systems and designing future and emergent computing arch...

  4. Computer-assisted mechanistic evaluation of organic reactions

    Energy Technology Data Exchange (ETDEWEB)

    Gushurst, A.J.

    1988-01-01

    CAMEO, an interactive computer program which predicts the products of organic reactions given starting materials and conditions, has been refined and extended in the area of base-catalyzed and nucleophilic processes. The present capabilities of the program are outlined, including a brief discussion of the major segments of CAMEO: graphics, perception, and reaction evaluation. The implementation of general algorithms for predicting the acidities of a vast number of organic compounds to within 2 pKa units in dimethylsulfoxide and water is then described, followed by a presentation of the reactivity rules used by the program to evaluate nucleophilic reactions. Finally, a treatment of sulfur and phosphorus ylides, iminophosphoranes, and P=X-activated anions is given, illuminating the various competitions available for these reagents, such as between proton transfer and addition, between 1,2- and 1,4-addition, and among the Peterson, Wittig, and Horner-Emmons olefination reactions.

  5. Computed Potential Energy Surfaces and Minimum Energy Pathway for Chemical Reactions

    Science.gov (United States)

    Walch, Stephen P.; Langhoff, S. R. (Technical Monitor)

    1994-01-01

    Computed potential energy surfaces are often required for computation of such observables as rate constants as a function of temperature, product branching ratios, and other detailed properties. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method with the Dunning correlation consistent basis sets to obtain accurate energetics, gives useful results for a number of chemically important systems. Applications to complex reactions leading to NO and soot formation in hydrocarbon combustion are discussed.

  6. Understanding organometallic reaction mechanisms and catalysis: computational and experimental tools

    CERN Document Server

    Ananikov, Valentin P

    2014-01-01

    The goal of this book is to explore and highlight the new horizons in the study of reaction mechanisms that are opened by the joint application of experimental studies and theoretical calculations. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  7. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network lets processors access memory without conflicts and exchange data with one another over an enhanced MESH network. The ABC95 instruction set includes control, scalar, and vector instructions; the network instructions are introduced in particular. A programming environment for ABC95 assembly language is designed, and a VC++-based programming environment is presented that includes functions to load ABC95 programs and data, store results, run programs, and so on. In particular, the data type for conflict-free access is defined. The results show that these technologies allow programmers to develop ABC95 array computer applications effectively.

  8. Security Management Model in Cloud Computing Environment

    OpenAIRE

    Ahmadpanah, Seyed Hossein

    2016-01-01

    In the cloud computing environment, the growing number of cloud virtual machines (VMs) confronts security and management with a giant challenge. To address security issues in the cloud computing virtualization environment, this paper presents an efficient and dynamic VM security management model based on state migration and scheduling, studying the virtual machine security architecture and, based on AHP (Analytic Hierarchy Process), virtual machine de...

  9. Computing, Environment and Life Sciences | Argonne National Laboratory

    Science.gov (United States)


  10. Computational Approach to Electron Charge Transfer Reactions

    DEFF Research Database (Denmark)

    Jónsson, Elvar Örn

    -molecular mechanics scheme, and tools to analyse statistical data and generate relative free energies and free energy surfaces. The methodology is applied to several charge transfer species and reactions in chemical environments - chemical in the sense that solvent, counter ions and substrate surfaces are taken...... in to account - which directly influence the reactants and resulting reaction through both physical and chemical interactions. All methods are though general and can be applied to different types of chemistry. First, the basis of the various theoretical tools is presented and applied to several test systems...... and asymmetric charge transfer reactions between several first-row transition metals in water. The results are compared to experiments and rationalised with classical analytic expressions. Shortcomings of the methods are accounted for with clear steps towards improved accuracy. Later the analysis is extended...

  11. Acorn: A grid computing system for constraint based modeling and visualization of the genome scale metabolic reaction networks via a web interface

    Directory of Open Access Journals (Sweden)

    Bushell Michael E

    2011-05-01

    Full Text Available Abstract Background Constraint-based approaches facilitate the prediction of cellular metabolic capabilities, based, in turn, on predictions of the repertoire of enzymes encoded in the genome. Recently, genome annotations have been used to reconstruct genome scale metabolic reaction networks for numerous species, including Homo sapiens, which allow simulations that provide valuable insights into topics, including predictions of gene essentiality of pathogens, interpretation of genetic polymorphism in metabolic disease syndromes and suggestions for novel approaches to microbial metabolic engineering. These constraint-based simulations are being integrated with the functional genomics portals, an activity that requires efficient implementation of the constraint-based simulations in the web-based environment. Results Here, we present Acorn, an open source (GNU GPL) grid computing system for constraint-based simulations of genome scale metabolic reaction networks within an interactive web environment. The grid-based architecture allows efficient execution of computationally intensive, iterative protocols such as Flux Variability Analysis, which can be readily scaled up as the numbers of models (and users) increase. The web interface uses AJAX, which facilitates efficient model browsing and other search functions, and intuitive implementation of appropriate simulation conditions. Research groups can install Acorn locally and create user accounts. Users can also import models in the familiar SBML format and link reaction formulas to major functional genomics portals of choice. Selected models and simulation results can be shared between different users and made publicly available. Users can construct pathway map layouts and import them into the server using a desktop editor integrated within the system. Pathway maps are then used to visualise numerical results within the web environment.
To illustrate these features we have deployed Acorn and created a
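
Flux Variability Analysis, which Acorn parallelises, is in general a pair of linear programs per reaction; on a toy linear pathway where steady state forces every flux to be equal, the idea reduces to a few lines (reaction names and bounds are illustrative, not Acorn's API):

```python
# Toy "flux variability" on a linear pathway A -> B -> biomass: at
# steady state the stoichiometry forces all fluxes to be equal, so
# each flux can only range over the intersection of all bounds.
bounds = {
    "glc_uptake": (0.0, 10.0),   # substrate import limit
    "glycolysis": (0.0, 25.0),
    "biomass":    (2.0, 100.0),  # minimum growth demand
}

def fva_linear_chain(bounds):
    """Min/max of the shared flux through a linear chain of reactions."""
    lo = max(l for l, _ in bounds.values())
    hi = min(u for _, u in bounds.values())
    if lo > hi:
        raise ValueError("infeasible model")
    return {name: (lo, hi) for name in bounds}

ranges = fva_linear_chain(bounds)
# Every reaction is constrained to the interval [2.0, 10.0].
```

Real genome-scale networks have branches and loops, so each reaction's range requires solving two linear programs; the grid architecture exists precisely because thousands of such LPs must run per model.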

  12. Intelligent computing for sustainable energy and environment

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang [Queen' s Univ. Belfast (United Kingdom). School of Electronics, Electrical Engineering and Computer Science; Li, Shaoyuan; Li, Dewei [Shanghai Jiao Tong Univ., Shanghai (China). Dept. of Automation; Niu, Qun (eds.) [Shanghai Univ. (China). School of Mechatronic Engineering and Automation

    2013-07-01

    Fast track conference proceedings. State of the art research. Up to date results. This book constitutes the refereed proceedings of the Second International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2012, held in Shanghai, China, in September 2012. The 60 full papers presented were carefully reviewed and selected from numerous submissions and present theories and methodologies as well as the emerging applications of intelligent computing in sustainable energy and environment.

  13. HeNCE: A Heterogeneous Network Computing Environment

    Directory of Open Access Journals (Sweden)

    Adam Beguelin

    1994-01-01

    Full Text Available Network computing seeks to utilize the aggregate resources of many networked computers to solve a single problem. In so doing it is often possible to obtain supercomputer performance from an inexpensive local area network. The drawback is that network computing is complicated and error prone when done by hand, especially if the computers have different operating systems and data formats and are thus heterogeneous. The heterogeneous network computing environment (HeNCE) is an integrated graphical environment for creating and running parallel programs over a heterogeneous collection of computers. It is built on a lower level package called parallel virtual machine (PVM). The HeNCE philosophy of parallel programming is to have the programmer graphically specify the parallelism of a computation and to automate, as much as possible, the tasks of writing, compiling, executing, debugging, and tracing the network computation. Key to HeNCE is a graphical language based on directed graphs that describe the parallelism and data dependencies of an application. Nodes in the graphs represent conventional Fortran or C subroutines and the arcs represent data and control flow. This article describes the present state of HeNCE, its capabilities, limitations, and areas of future research.
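
The execution model described above, run a node as soon as every node it depends on has finished, can be sketched in a few lines (a toy stand-in, not HeNCE itself; thread tasks replace Fortran/C subroutines and the node names are invented):

```python
from concurrent.futures import ThreadPoolExecutor

# A HeNCE-style computation graph: nodes are subroutines, arcs are
# data dependencies. (Toy tasks stand in for compiled subroutines.)
graph = {            # node -> set of nodes it depends on
    "load":  set(),
    "fft":   {"load"},
    "stats": {"load"},
    "merge": {"fft", "stats"},
}
# Default-argument trick binds each node name into its own task.
tasks = {name: (lambda n=name: n.upper()) for name in graph}

def run_dag(graph, tasks, workers=4):
    """Run independent nodes in parallel, respecting arc dependencies."""
    done, results = set(), {}
    pending = dict(graph)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while pending:
            ready = [n for n, deps in pending.items() if deps <= done]
            if not ready:
                raise ValueError("cycle in graph")
            futures = {n: pool.submit(tasks[n]) for n in ready}
            for n, f in futures.items():
                results[n] = f.result()
                done.add(n)
                del pending[n]
    return results

results = run_dag(graph, tasks)  # "fft" and "stats" may run concurrently
```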

  14. CHPS IN CLOUD COMPUTING ENVIRONMENT

    OpenAIRE

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been used to characterize various forms of applications with high processing and storage-space demands. To make the cloud computing environment more eco-friendly, our research project aims at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  15. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
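
The classic marker-based idea behind such snapshot protocols can be illustrated with two simulated nodes and FIFO channels (a toy Chandy-Lamport-style sketch, not the paper's protocol; names and state are invented):

```python
from collections import deque

MARKER = "MARKER"

class Node:
    """One process in a two-node marker-based snapshot."""
    def __init__(self):
        self.state = 0                 # toy local state: messages seen
        self.snapshot = None           # recorded local state
        self.channel_record = []       # in-flight messages captured
        self.recording_channel = False

    def handle(self, msg, out_channel):
        if msg == MARKER:
            if self.snapshot is None:       # first marker: record local
                self.snapshot = self.state  # state and echo the marker
                self.recording_channel = False
                out_channel.append(MARKER)
            else:                           # marker closes the channel
                self.recording_channel = False
        else:
            self.state += 1
            if self.recording_channel:
                self.channel_record.append(msg)

# FIFO channels between initiator `a` and peer `b`.
ab, ba = deque(), deque()
a, b = Node(), Node()

ba.extend(["m1", "m2"])   # two application messages in flight b -> a

# `a` initiates: records its own state, sends a marker, and starts
# recording the b->a channel until the marker comes back.
a.snapshot = a.state
a.recording_channel = True
ab.append(MARKER)

while ab or ba:           # deliver everything in FIFO order
    if ab:
        b.handle(ab.popleft(), ba)
    if ba:
        a.handle(ba.popleft(), ab)
# The snapshot captures m1 and m2 as in-flight channel state, giving a
# consistent global state even though no node ever paused the system.
```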

  16. Computer network environment planning and analysis

    Science.gov (United States)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  17. Towards reaction-diffusion computing devices based on minority-carrier transport in semiconductors

    International Nuclear Information System (INIS)

    Asai, Tetsuya; Adamatzky, Andrew; Amemiya, Yoshihito

    2004-01-01

    Reaction-diffusion (RD) chemical systems are known to realize sensible computation when both data and results of the computation are encoded in concentration profiles of chemical species; the computation is implemented via spreading and interaction of either diffusive or phase waves. Thin-layer chemical systems can therefore be thought of as massively-parallel locally-connected computing devices, where a micro-volume of the medium is analogous to an elementary processor. Practical applications of RD chemical systems are limited, however, by the very low speed of traveling waves, which makes real-time computation impractical. To overcome these speed limitations while preserving the unique features of RD computers we propose a semiconductor RD computing device where minority carriers diffuse as chemical species and reaction elements are represented by p-n-p-n diodes. We offer blue-prints of the RD semiconductor devices, and study in computer simulation the propagation phenomena of the density wave of minority carriers. We then demonstrate what computational problems can be solved in RD semiconductor devices and evaluate the space-time complexity of computation in the devices.
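
The computing primitive here, a travelling excitation wave, can be mimicked with a one-dimensional excitable cellular automaton (a toy sketch of wave propagation in an excitable medium, not the semiconductor model):

```python
# Toy excitable medium: each cell is resting, excited, or refractory.
# A resting cell fires when a neighbour is excited; the refractory
# phase stops the wave from propagating backwards.
REST, EXCITED, REFRACTORY = 0, 1, 2

def step(cells):
    nxt = []
    for i, c in enumerate(cells):
        if c == EXCITED:
            nxt.append(REFRACTORY)
        elif c == REFRACTORY:
            nxt.append(REST)
        else:  # resting cell fires if a neighbour is excited
            left = cells[i - 1] if i > 0 else REST
            right = cells[i + 1] if i < len(cells) - 1 else REST
            nxt.append(EXCITED if EXCITED in (left, right) else REST)
    return nxt

cells = [REST] * 9
cells[0] = EXCITED        # stimulate the left edge
for _ in range(8):
    cells = step(cells)
# The wave front has travelled one cell per step to the right edge.
```

The device proposed in the record replaces these discrete cells with micro-volumes of semiconductor, trading chemical wave speed for minority-carrier diffusion speed.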

  18. Printing in Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Karapantelakis, Athanasios; Delvic, Alisa; Zarifi Eslami, Mohammed; Khamit, Saltanat

    Document printing has long been considered an indispensable part of the workspace. While this process is considered trivial and simple for environments where resources are ample (e.g. desktop computers connected to printers within a corporate network), it becomes complicated when applied in a mobile

  19. Investigation of Coal-biomass Catalytic Gasification using Experiments, Reaction Kinetics and Computational Fluid Dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, Francine [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Agblevor, Foster [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Klein, Michael [Univ. of Delaware, Newark, DE (United States); Sheikhi, Reza [Northeastern Univ., Boston, MA (United States)

    2015-12-31

    A collaborative effort involving experiments, kinetic modeling, and computational fluid dynamics (CFD) was used to understand co-gasification of coal-biomass mixtures. The overall goal of the work was to determine the key reactive properties for coal-biomass mixed fuels. Sub-bituminous coal was mixed with biomass feedstocks to determine the fluidization and gasification characteristics of hybrid poplar wood, switchgrass and corn stover. It was found that corn stover and poplar wood were the best feedstocks to use with coal. The novel approach of this project was the use of a red mud catalyst to improve gasification and lower gasification temperatures. An important result was the reduction of agglomeration of the biomass using the catalyst. An outcome of this work was the characterization of the chemical kinetics and reaction mechanisms of the co-gasification fuels, and the development of a set of models that can be integrated into other modeling environments. The multiphase flow code, MFIX, was used to simulate and predict the hydrodynamics and co-gasification, and results were validated with the experiments. The reaction kinetics modeling was used to develop a smaller set of reactions for tractable CFD calculations that represented the experiments. Finally, an efficient tool was developed, MCHARS, and coupled with MFIX to efficiently simulate the complex reaction kinetics.

  20. Development of tight-binding, chemical-reaction-dynamics simulator for combinatorial computational chemistry

    International Nuclear Information System (INIS)

    Kubo, Momoji; Ando, Minako; Sakahara, Satoshi; Jung, Changho; Seki, Kotaro; Kusagaya, Tomonori; Endou, Akira; Takami, Seiichi; Imamura, Akira; Miyamoto, Akira

    2004-01-01

    Recently, we have proposed a new concept called 'combinatorial computational chemistry' to realize a theoretical, high-throughput screening of catalysts and materials. We have already applied our combinatorial, computational-chemistry approach, mainly based on static first-principles calculations, to various catalyst and materials systems, and its applicability to catalyst and materials design was strongly confirmed. In order to realize more effective and efficient combinatorial, computational-chemistry screening, a high-speed, chemical-reaction-dynamics simulator based on the quantum-chemical, molecular-dynamics method is essential. However, to the best of our knowledge, there is no chemical-reaction-dynamics simulator fast enough to perform such high-throughput screening. In the present study, we have succeeded in developing a chemical-reaction-dynamics simulator based on our original, tight-binding, quantum-chemical, molecular-dynamics method, which is more than 5000 times faster than the regular first-principles, molecular-dynamics method. Moreover, its applicability and effectiveness in the atomistic clarification of methanol-synthesis dynamics at reaction temperature were demonstrated.
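
Whatever supplies the forces, tight-binding quantum chemistry in this work, a molecular-dynamics simulator advances the atoms with a time integrator such as velocity Verlet. A generic one-particle sketch (a harmonic force stands in for the quantum-chemical forces; none of this is the authors' code):

```python
def velocity_verlet(x, v, force, dt, mass=1.0, steps=1000):
    """Advance one particle with the velocity-Verlet integrator.
    `force(x)` is whatever force model the simulator provides."""
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt   # position update
        f_new = force(x)                           # force at new position
        v += 0.5 * (f + f_new) / mass * dt         # velocity update
        f = f_new
    return x, v

# Harmonic oscillator (omega = 1): after one period (t ~ 2*pi) the
# particle should return close to its starting point.
x, v = velocity_verlet(1.0, 0.0, lambda x: -x, dt=0.01, steps=628)
```

In a real simulator the expensive part is `force(x)`; the 5000x speedup claimed above comes from replacing first-principles forces with tight-binding ones, not from the integrator.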

  1. DEFACTO: A Design Environment for Adaptive Computing Technology

    National Research Council Canada - National Science Library

    Hall, Mary

    2003-01-01

    This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...

  2. Astrophysical Nuclear Reaction Rates in the Dense Metallic Environments

    Science.gov (United States)

    Kilic, Ali Ihsan

    2017-09-01

    Nuclear reaction rates can be enhanced by many orders of magnitude in dense and relatively cold astrophysical plasmas such as in white dwarfs, brown dwarfs, and giant planets. Similar conditions are also present in supernova explosions where the ignition conditions are vital for cosmological models. White dwarfs are compact objects that have both extremely high interior densities and very strong local magnetic fields. For the first time, a new formula has been developed to explain cross section and reaction rate quantities for light elements that includes not only the nuclear component but also the material dependence, magnetic field, and crystal structure dependency in dense metallic environments. I will present the impact of the developed formula on the cross section and reaction rates for light elements. This could have possible technological applications in energy production using nuclear fusion reactions.

  3. The DIII-D Computing Environment: Characteristics and Recent Changes

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1999-01-01

    The DIII-D tokamak national fusion research facility along with its predecessor Doublet III has been operating for over 21 years. The DIII-D computing environment consists of real-time systems controlling the tokamak, heating systems, and diagnostics, and systems acquiring experimental data from instrumentation; major data analysis server nodes performing short term and long term data access and data analysis; and systems providing mechanisms for remote collaboration and the dissemination of information over the world wide web. Computer systems for the facility have undergone incredible changes over the course of time as the computer industry has changed dramatically. Yet there are certain valuable characteristics of the DIII-D computing environment that have been developed over time and have been maintained to this day. Some of these characteristics include: continuous computer infrastructure improvements, distributed data and data access, computing platform integration, and remote collaborations. These characteristics are being carried forward as well as new characteristics resulting from recent changes which have included: a dedicated storage system and a hierarchical storage management system for raw shot data, various further infrastructure improvements including deployment of Fast Ethernet, the introduction of MDSplus, LSF and common IDL based tools, and improvements to remote collaboration capabilities. This paper will describe this computing environment, important characteristics that over the years have contributed to the success of DIII-D computing systems, and recent changes to computer systems

  4. Micro-computer cards for hard industrial environment

    Energy Technology Data Exchange (ETDEWEB)

    Breton, J M

    1984-03-15

    Approximately 60% of present or future distributed systems have, or will have, operational units installed in hard environments. In these applications, which include canalization and industrial motor control, robotics and process control, systems must be easily applied in environments not made for electronic use. The development of card systems in this hard industrial environment, which is found in petrochemical industry and mines is described. National semiconductor CIM card system CMOS technology allows the real time micro computer application to be efficient and functional in hard industrial environments.

  5. Catalytic Upgrading of Biomass-Derived Compounds via C-C Coupling Reactions. Computational and Experimental Studies of Acetaldehyde and Furan Reactions in HZSM-5

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Cong [Argonne National Lab. (ANL), Argonne, IL (United States); Evans, Tabitha J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cheng, Lei [Argonne National Lab. (ANL), Argonne, IL (United States); Nimlos, Mark R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mukarakate, Calvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Robichaud, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Assary, Rajeev S. [Argonne National Lab. (ANL), Argonne, IL (United States); Curtiss, Larry A. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-10-02

    Catalytic C–C coupling and deoxygenation reactions are essential for upgrading biomass-derived oxygenates to fuel-range hydrocarbons. Detailed understanding of the mechanistic and energetic aspects of these reactions is crucial to enabling and improving the catalytic upgrading of small oxygenates to useful chemicals and fuels. Using periodic density functional theory (DFT) calculations, we have investigated the reactions of furan and acetaldehyde in an HZSM-5 zeolite catalyst, a representative system associated with the catalytic upgrading of pyrolysis vapors. Comprehensive energy profiles were computed for self-reactions (i.e., acetaldehyde coupling and furan coupling) and cross-reactions (i.e., acetaldehyde + furan) of this representative mixture. Major products proposed from the computations are further confirmed using temperature-controlled mass spectrometry measurements. Moreover, the computational results show that furan interacts with acetaldehyde in HZSM-5 via an alkylation mechanism, which is more favorable than the self-reactions, indicating that mixing furans with aldehydes could be a promising approach to maximize effective C–C coupling and dehydration while reducing catalyst deactivation (e.g., coke formation) from aldehyde condensation.

  6. Scheduling multimedia services in cloud computing environment

    Science.gov (United States)

    Liu, Yunchang; Li, Chunlin; Luo, Youlong; Shao, Yanling; Zhang, Jing

    2018-02-01

    Currently, security is a critical factor for multimedia services running in the cloud computing environment. As an effective mechanism, trust can improve the security level and mitigate attacks within cloud computing environments. Unfortunately, existing scheduling strategies for multimedia services in the cloud computing environment do not integrate a trust mechanism when making scheduling decisions. In this paper, we propose a scheduling scheme for multimedia services in multiple clouds. First, a novel scheduling architecture is presented. Then, we build a trust model, including both subjective trust and objective trust, to evaluate the trust degree of multimedia service providers. By employing Bayesian theory, the subjective trust degree between multimedia service providers and users is obtained. According to the attributes of QoS, the objective trust degree of multimedia service providers is calculated. Finally, a scheduling algorithm integrating the trust of entities is proposed by considering the deadline, cost and trust requirements of multimedia services. The scheduling algorithm heuristically hunts for reasonable resource allocations that satisfy the trust requirements and meet the deadlines of the multimedia services. Detailed simulation experiments demonstrate the effectiveness and feasibility of the proposed trust scheduling scheme.
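
The subjective-trust component can be sketched with the common Beta-reputation reading of Bayesian trust (our own minimal reading, not the paper's exact model; provider data, threshold, and field names are invented):

```python
# With s successful and f failed past interactions, a Beta(s+1, f+1)
# posterior gives the expected trust (s+1)/(s+f+2).
def subjective_trust(successes, failures):
    return (successes + 1) / (successes + failures + 2)

def schedule(providers, deadline_ok, min_trust=0.7):
    """Pick the cheapest provider that meets deadline and trust demands."""
    eligible = [
        p for p in providers
        if deadline_ok(p) and subjective_trust(p["ok"], p["fail"]) >= min_trust
    ]
    return min(eligible, key=lambda p: p["cost"]) if eligible else None

providers = [
    {"name": "cloud-A", "ok": 9,  "fail": 0,  "cost": 5.0},
    {"name": "cloud-B", "ok": 40, "fail": 30, "cost": 2.0},  # cheap, low trust
]
best = schedule(providers, deadline_ok=lambda p: True)
# cloud-B is cheaper but its trust ~0.57 falls below the threshold,
# so the scheduler selects cloud-A.
```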

  7. Embedding Moodle into Ubiquitous Computing Environments

    NARCIS (Netherlands)

    Glahn, Christian; Specht, Marcus

    2010-01-01

    Glahn, C., & Specht, M. (2010). Embedding Moodle into Ubiquitous Computing Environments. In M. Montebello, et al. (Eds.), 9th World Conference on Mobile and Contextual Learning (MLearn2010) (pp. 100-107). October, 19-22, 2010, Valletta, Malta.

  8. Computers and the Environment: Minimizing the Carbon Footprint

    Science.gov (United States)

    Kaestner, Rich

    2009-01-01

    Computers can be good and bad for the environment; one can maximize the good and minimize the bad. When dealing with environmental issues, it's difficult to ignore the computing infrastructure. With an operations carbon footprint equal to the airline industry's, computer energy use is only part of the problem; everyone is also dealing with the use…

  9. The sociability of computer-supported collaborative learning environments

    NARCIS (Netherlands)

    Kreijns, C.J.; Kirschner, P.A.; Jochems, W.M.G.

    2002-01-01

    There is much positive research on computer-supported collaborative learning (CSCL) environments in asynchronous distributed learning groups (DLGs). There is also research that shows that contemporary CSCL environments do not completely fulfil expectations on supporting interactive group learning,

  10. Controlling Chemical Reactions in Confined Environments: Water Dissociation in MOF-74

    Directory of Open Access Journals (Sweden)

    Erika M. A. Fuentes-Fernandez

    2018-02-01

    Full Text Available The confined porous environment of metal organic frameworks (MOFs is an attractive system for studying reaction mechanisms. Compared to flat oxide surfaces, MOFs have the key advantage that they exhibit a well-defined structure and present significantly fewer challenges in experimental characterization. As an example of an important reaction, we study here the dissociation of water—which plays a critical role in biology, chemistry, and materials science—in MOFs and show how the knowledge of the structure in this confined environment allows for an unprecedented level of understanding and control. In particular, combining in-situ infrared spectroscopy and first-principles calculations, we show that the water dissociation reaction can be selectively controlled inside Zn-MOF-74 by alcohol, through both chemical and physical interactions. Methanol is observed to speed up water dissociation by 25% to 100%, depending on the alcohol partial pressure. On the other hand, co-adsorption of isopropanol reduces the speed of the water reaction, due mostly to steric interactions. In addition, we also investigate the stability of the product state after the water dissociation has occurred and find that the presence of additional water significantly stabilizes the dissociated state. Our results show that precise control of reactions within nano-porous materials is possible, opening the way for advances in fields ranging from catalysis to electrochemistry and sensors.

  11. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Vigil, Benny Manuel [Los Alamos National Laboratory; Ballance, Robert [SNL; Haskell, Karen [SNL

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  12. On some limitations of reaction-diffusion chemical computers in relation to Voronoi diagram and its inversion

    International Nuclear Information System (INIS)

    Adamatzky, Andrew; Lacy Costello, Benjamin de

    2003-01-01

    A reaction-diffusion chemical computer in this context is a planar uniform chemical reactor, where data and results of a computation are represented by concentration profiles of reactants and the computation itself is implemented via the spreading and interaction of diffusive and phase waves. This class of chemical computers is efficient at solving problems with a 'natural' parallelism where data sets are decomposable onto a large number of geographically neighboring domains which are then processed in parallel. Typical problems of this type include image processing, geometrical transformations and optimisation. When chemical based devices are used to solve such problems, questions arise regarding their reproducibility, their efficiency and the accuracy of their computations. In addition to these questions, what are the limitations of reaction-diffusion chemical processors--what types of problems cannot currently be solved, and are unlikely ever to be solved? To answer these questions we study how a Voronoi diagram is constructed and how it is inverted in a planar chemical processor. We demonstrate that a Voronoi diagram is computed only partially in the chemical processor. We also prove that given a specific Voronoi diagram it is impossible to reconstruct the planar set (from which the diagram was computed) in the reaction-diffusion chemical processor. In the Letter we open the first ever line of enquiry into the computational inability of reaction-diffusion chemical computers
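
The discrete analogue of the wave-based Voronoi computation is easy to sketch: a cell is labelled only when one seed's wave arrives strictly first, so equidistant cells stay unlabelled and trace the bisectors, which is exactly the "partial" diagram the chemical medium produces (a toy grid model, not the chemical processor):

```python
def voronoi_grid(width, height, seeds):
    """Label each cell with its unique nearest seed (Manhattan metric,
    mimicking wave arrival time); equidistant cells stay None - these
    form the bisectors that the medium leaves uncoloured."""
    label = {}
    for x in range(width):
        for y in range(height):
            d = sorted((abs(x - sx) + abs(y - sy), i)
                       for i, (sx, sy) in enumerate(seeds))
            unique = len(d) == 1 or d[0][0] < d[1][0]
            label[(x, y)] = d[0][1] if unique else None
    return label

lab = voronoi_grid(5, 1, [(0, 0), (4, 0)])
# Cells near x=0 belong to seed 0, cells near x=4 to seed 1, and the
# midpoint (2, 0) lies on the bisector and stays unlabelled.
```

Inverting the diagram, recovering the seed set from the bisectors alone, is the operation the Letter proves impossible in the chemical processor; the toy model makes the information loss visible, since many seed pairs share the same bisector.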

  13. A PC/workstation cluster computing environment for reservoir engineering simulation applications

    International Nuclear Information System (INIS)

    Hermes, C.E.; Koo, J.

    1995-01-01

    Like the rest of the petroleum industry, Texaco has been transferring its applications and databases from mainframes to PC's and workstations. This transition has been very positive because it provides an environment for integrating applications, increases end-user productivity, and in general reduces overall computing costs. On the down side, the transition typically results in a dramatic increase in workstation purchases and raises concerns regarding the cost and effective management of computing resources in this new environment. The workstation transition also places the user in a Unix computing environment which, to say the least, can be quite frustrating to learn and to use. This paper describes the approach, philosophy, architecture, and current status of the new reservoir engineering/simulation computing environment developed at Texaco's E and P Technology Dept. (EPTD) in Houston. The environment is representative of those under development at several other large oil companies and is based on a cluster of IBM and Silicon Graphics Intl. (SGI) workstations connected by a fiber-optics communications network and engineering PC's connected to local area networks, or Ethernets. Because computing resources and software licenses are shared among a group of users, the new environment enables the company to get more out of its investments in workstation hardware and software

  14. An Introduction to Computer Forensics: Gathering Evidence in a Computing Environment

    Directory of Open Access Journals (Sweden)

    Henry B. Wolfe

    2001-01-01

Full Text Available Business has become increasingly dependent on the Internet and computing to operate. It has become apparent that there are issues of evidence gathering in a computing environment which, by their nature, are technical and different from other forms of evidence gathering, and which must be addressed. This paper offers an introduction to some of the technical issues surrounding this new and specialized field of Computer Forensics. It attempts to identify and describe sources of evidence that can be found on disk data storage devices in the course of an investigation. It also considers sources of copies of email, which can be used in evidence, as well as case building.

  15. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications with remote ship scenarios and automation of ship operations.

  16. Computational comparison of quantum-mechanical models for multistep direct reactions

    International Nuclear Information System (INIS)

    Koning, A.J.; Akkermans, J.M.

    1993-01-01

We have carried out a computational comparison of all existing quantum-mechanical models for multistep direct (MSD) reactions. The various MSD models, including the so-called Feshbach-Kerman-Koonin, Tamura-Udagawa-Lenske and Nishioka-Yoshida-Weidenmueller models, have been implemented in a single computer system. All model calculations thus use the same set of parameters and the same numerical techniques; only one adjustable parameter is employed. The computational results have been compared with experimental energy spectra and angular distributions for several nuclear reactions, namely, 90Zr(p,p') at 80 MeV, 209Bi(p,p') at 62 MeV, and 93Nb(n,n') at 25.7 MeV. In addition, the results have been compared with the Kalbach systematics and with semiclassical exciton model calculations. All quantum MSD models provide a good fit to the experimental data. In addition, they reproduce the systematics very well and are clearly better than semiclassical model calculations. We furthermore show that the calculated predictions do not differ very strongly between the various quantum MSD models, leading to the conclusion that the simplest MSD model (the Feshbach-Kerman-Koonin model) is adequate for the analysis of experimental data

  17. The Computer Revolution in Science: Steps towards the realization of computer-supported discovery environments

    NARCIS (Netherlands)

    de Jong, Hidde; Rip, Arie

    1997-01-01

    The tools that scientists use in their search processes together form so-called discovery environments. The promise of artificial intelligence and other branches of computer science is to radically transform conventional discovery environments by equipping scientists with a range of powerful

  18. Nitrogen compounds behavior under irradiation environment

    International Nuclear Information System (INIS)

    Ichikawa, Nagayoshi; Takagi, Junichi; Yotsuyanagi, Tadasu

    1991-01-01

Laboratory experiments were performed to evaluate the behavior of nitrogen compounds in the liquid phase under irradiation environments. Nitrogen compounds take the chemical form of ammonium ions under reducing conditions produced by gamma irradiation, whereas ammonium ions are rather stable even under oxidizing conditions. Key reactions were identified and their reaction rate constants and activation energies were estimated through computer-code simulation. A reaction scheme for nitrogen compounds, including protonation reactions, is proposed. (author)
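Rate constants and activation energies of the kind estimated here are conventionally related through the Arrhenius equation, k = A·exp(−Ea/RT). A minimal sketch of that relation (the numerical values below are placeholders, not the estimates from this study):

```python
import math

def arrhenius_k(a_factor, ea_kj, temp_k):
    """Arrhenius law k = A * exp(-Ea / RT), relating a rate constant to its
    activation energy Ea (kJ/mol) at absolute temperature temp_k (K).
    a_factor and ea_kj are illustrative inputs, not values from the paper."""
    R = 8.314e-3  # gas constant in kJ/(mol*K)
    return a_factor * math.exp(-ea_kj / (R * temp_k))
```

Fitting ln k against 1/T for rate constants measured at several temperatures yields Ea from the slope −Ea/R, which is the usual route from simulated or measured rates to activation energies.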

  19. Computational organic chemistry: bridging theory and experiment in establishing the mechanisms of chemical reactions.

    Science.gov (United States)

    Cheng, Gui-Juan; Zhang, Xinhao; Chung, Lung Wa; Xu, Liping; Wu, Yun-Dong

    2015-02-11

    Understanding the mechanisms of chemical reactions, especially catalysis, has been an important and active area of computational organic chemistry, and close collaborations between experimentalists and theorists represent a growing trend. This Perspective provides examples of such productive collaborations. The understanding of various reaction mechanisms and the insight gained from these studies are emphasized. The applications of various experimental techniques in elucidation of reaction details as well as the development of various computational techniques to meet the demand of emerging synthetic methods, e.g., C-H activation, organocatalysis, and single electron transfer, are presented along with some conventional developments of mechanistic aspects. Examples of applications are selected to demonstrate the advantages and limitations of these techniques. Some challenges in the mechanistic studies and predictions of reactions are also analyzed.

  20. Measuring vigilance decrement using computer vision assisted eye tracking in dynamic naturalistic environments.

    Science.gov (United States)

    Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

Eye tracking offers a practical solution for monitoring cognitive performance in real-world tasks. However, eye tracking in dynamic environments is difficult due to the high spatial and temporal variation of stimuli, and needs further and thorough investigation. In this paper, we study the possibility of developing a novel computer-vision-assisted eye tracking analysis using fixations. Eye movement data were obtained from a long-duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called 'fixation score' was defined to capture the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximal when the subjects focus on the target AOI and diminishes when they gaze at the non-target AOIs. A statistically significant negative correlation was found between the fixation score and reaction time data (r = -0.2253). With vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in an increase in reaction time.
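The abstract does not give the exact formula for the fixation score, but one plausible formalization is the fraction of AOI fixations that land on the target AOI. A hypothetical sketch under that assumption (function name, AOI representation and scoring rule are all illustrative, not the paper's):

```python
def fixation_score(fixations, target_aoi, other_aois):
    """Hypothetical formalization of a 'fixation score': the fraction of
    AOI fixations landing on the target AOI. AOIs are axis-aligned boxes
    (xmin, ymin, xmax, ymax); fixations are (x, y) gaze positions. The
    score is 1.0 when all AOI fixations are on the target and falls toward
    0 as gaze shifts to non-target AOIs."""
    def inside(point, box):
        x, y = point
        xmin, ymin, xmax, ymax = box
        return xmin <= x <= xmax and ymin <= y <= ymax

    on_target = sum(1 for f in fixations if inside(f, target_aoi))
    on_other = sum(1 for f in fixations if any(inside(f, a) for a in other_aois))
    total = on_target + on_other
    return on_target / total if total else 0.0
```

Computed over a sliding time window, a measure of this shape would drop as attention drifts to non-target AOIs, matching the vigilance-decrement behaviour described above.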

  1. High performance computing network for cloud environment using simulators

    OpenAIRE

    Singh, N. Ajith; Hemalatha, M.

    2012-01-01

Cloud computing is the next generation of computing. Adopting cloud computing is like signing up for a new form of website: the GUI which controls the cloud lets you directly control the hardware resources and your application. The difficult part of cloud computing is deploying it in a real environment. It is difficult to know the exact cost and the requirements until the service is bought, and whether it will support the existing applications available on traditional...

  2. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    Science.gov (United States)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  3. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  4. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. 
CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
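Among the tools listed, dose-volume histogram calculation is the most self-contained. A minimal sketch of a cumulative DVH, the quantity CERR's DVH tools compute (written in Python for illustration; CERR itself is MATLAB-based, and this is not its implementation):

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.5):
    """Cumulative dose-volume histogram: for each dose level D, the
    fraction of the structure's volume receiving at least D.
    dose: dose grid in Gy; mask: boolean array marking structure voxels."""
    voxels = dose[mask]
    edges = np.arange(0.0, voxels.max() + bin_width, bin_width)
    # fraction of structure voxels with dose >= each dose level
    frac = np.array([(voxels >= d).mean() for d in edges])
    return edges, frac
```

Quantities such as V20 (fraction of volume receiving at least 20 Gy) are then read off the curve directly, which is how DVH-based dose-volume-outcome modeling typically starts.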

  5. Distributed Computations Environment Protection Using Artificial Immune Systems

    Directory of Open Access Journals (Sweden)

    A. V. Moiseev

    2011-12-01

Full Text Available In this article the authors describe the possibility of applying artificial immune systems to protect distributed computing environments from certain types of malicious impacts.

  6. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    Science.gov (United States)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

    'ENVIRONMENT' is a computational platform that has been developed in the last few years with the aim to simulate stochastically the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc) coexist with aqueous domains. These conditions are of special relevance to understand the origins of cellular, self-reproducing compartments, in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim to bring together theoretical and experimental research on protocell and minimal artificial cell systems.
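The stochastic kinetics approach referred to here is typically implemented with Gillespie's stochastic simulation algorithm (SSA). A minimal, single-compartment (well-stirred) sketch of the SSA, illustrative only and not the ENVIRONMENT platform itself:

```python
import random

def gillespie(propensities, stoich, x0, t_max, seed=0):
    """Gillespie SSA for a well-stirred reaction network.
    propensities: one function state -> rate per reaction
    stoich: one state-change vector per reaction
    x0: initial molecule counts; returns the (time, state) trajectory."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    trajectory = [(t, tuple(x))]
    while t < t_max:
        a = [f(x) for f in propensities]
        a0 = sum(a)
        if a0 == 0:                   # no reaction can fire any more
            break
        t += rng.expovariate(a0)      # waiting time to the next event
        r = rng.uniform(0.0, a0)      # pick which reaction fires
        j, acc = 0, a[0]
        while acc < r:
            j += 1
            acc += a[j]
        x = [xi + d for xi, d in zip(x, stoich[j])]
        trajectory.append((t, tuple(x)))
    return trajectory

# Irreversible decay A -> (nothing) with propensity k*[A], k = 1.0:
traj = gillespie([lambda x: 1.0 * x[0]], [(-1,)], [100], t_max=50.0)
```

Extending this scheme to heterogeneous conditions, as the platform does, amounts to running coupled SSA instances per compartment (aqueous domain, micelle, vesicle) with exchange reactions between them.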

  7. THE VALUE OF CLOUD COMPUTING IN THE BUSINESS ENVIRONMENT

    OpenAIRE

    Mircea GEORGESCU; Marian MATEI

    2013-01-01

    Without any doubt, cloud computing has become one of the most significant trends in any enterprise, not only for IT businesses. Besides the fact that the cloud can offer access to low cost, considerably flexible computing resources, cloud computing also provides the capacity to create a new relationship between business entities and corporate IT departments. The value added to the business environment is given by the balanced use of resources, offered by cloud computing. The cloud mentality i...

  8. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  9. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups’ written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups’ scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation

  10. Analysis of reaction cross-section production in neutron induced fission reactions on uranium isotope using computer code COMPLET.

    Science.gov (United States)

    Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso

    2018-04-22

This study provides current evidence about cross-section production processes in the theoretical and experimental results of neutron-induced reactions on a uranium isotope over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In such fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help satisfy worldwide energy needs without polluting processes, as compared to other sources. The main objective of this work is to convey related knowledge of the neutron-induced fission reactions on 235U by describing, analyzing and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with the experimental data obtained from EXFOR. The cross-section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from the EXFOR Data Bank, IAEA. Computer code COMPLET has been used for the analysis with the same set of input parameters, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data taken from EXFOR in the literature, and good agreement was found between the experimental and theoretical data. This comparison was analyzed and interpreted with tabulated and graphical descriptions, and the results are briefly discussed within the text of this research work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Airborne Cloud Computing Environment (ACCE)

    Science.gov (United States)

    Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz

    2011-01-01

Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions, both by improving the development performance of the data system and by improving the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in Earth observation.

  12. Design requirements for ubiquitous computing environments for healthcare professionals.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2004-01-01

    Ubiquitous computing environments can support clinical administrative routines in new ways. The aim of such computing approaches is to enhance routine physical work, thus it is important to identify specific design requirements. We studied healthcare professionals in an emergency room and developed the computer-augmented environment NOSTOS to support teamwork in that setting. NOSTOS uses digital pens and paper-based media as the primary input interface for data capture and as a means of controlling the system. NOSTOS also includes a digital desk, walk-up displays, and sensor technology that allow the system to track documents and activities in the workplace. We propose a set of requirements and discuss the value of tangible user interfaces for healthcare personnel. Our results suggest that the key requirements are flexibility in terms of system usage and seamless integration between digital and physical components. We also discuss how ubiquitous computing approaches like NOSTOS can be beneficial in the medical workplace.

  13. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which deals with administration of the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating on the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are matched by a two-level NFS-based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of the centralized storage system, testing and validating hardware solutions for future use within the ATLAS TDAQ environment including new multi-core blade servers, developing GUI tools for user authentication and role management, testing and validating 64-bit OS, and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  14. A computational glance at organometallic cyclizations and coupling reactions

    OpenAIRE

    Fiser, Béla

    2016-01-01

210 p. Organometallic chemistry is one of the main research topics in chemical science. Nowadays, organometallic reactions are the subject of intensive theoretical investigations. However, in many cases, only joint experimental and theoretical efforts could reveal the answers we are looking for. The fruits of such experimental and theoretical co-operations will be presented here. In this work, we are going to deal with homogeneous organometallic catalysis using computational chemical tools....

  15. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language

  16. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  17. Printing in heterogeneous computer environment at DESY

    International Nuclear Information System (INIS)

    Jakubowski, Z.

    1996-01-01

The number of registered hosts at DESY reaches 3500, while the number of print queues approaches 150. The spectrum of the computing environment in use is very wide: from Macs and PCs, through SUN, DEC and SGI machines, to the IBM mainframe. In 1994 we used 18 tons of paper. We present a solution for providing print services in such an environment for more than 3500 registered users. The availability of the print service is a serious issue. Centralized printing has a lot of advantages for software administration but creates a single point of failure. We solved this problem partially without using expensive software and hardware. The talk provides information about the DESY central print spooler concept. None of the systems available on the market provides a ready-to-use, reliable solution for all the platforms used at DESY. We discuss concepts for the installation, administration and monitoring of a large number of printers. We found a solution for printing both on central computing facilities and in support of stand-alone workstations. (author)

  18. Computational Investigation of the Competition between the Concerted Diels-Alder Reaction and Formation of Diradicals in Reactions of Acrylonitrile with Non-Polar Dienes

    Science.gov (United States)

    James, Natalie C.; Um, Joann M.; Padias, Anne B.; Hall, H. K.; Houk, K. N.

    2013-01-01

    The energetics of the Diels-Alder cycloaddition reactions of several 1,3-dienes with acrylonitrile, and the energetics of formation of diradicals, were investigated with density functional theory (B3LYP and M06-2X) and compared to experimental data. For the reaction of 2,3-dimethyl-1,3-butadiene with acrylonitrile, the concerted reaction is favored over the diradical pathway by 2.5 kcal/mol using B3LYP/6-31G(d); experimentally this reaction gives both cycloadduct and copolymer. The concerted cycloaddition of cyclopentadiene with acrylonitrile is preferred computationally over the stepwise pathway by 5.9 kcal/mol; experimentally, only the Diels-Alder adduct is formed. For the reactions of (E)-1,3-pentadiene and acrylonitrile, both cycloaddition and copolymerization were observed experimentally; these trends were mimicked by the computational results, which showed only a 1.2 kcal/mol preference for the concerted pathway. For the reactions of (Z)-1,3-pentadiene and acrylonitrile, the stepwise pathway is preferred by 3.9 kcal/mol, in agreement with previous experimental findings that only polymerization occurs. M06-2X is known to give more accurate activation and reaction energetics but the energies of diradicals are too high. PMID:23758325
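Barrier differences of the size reported above translate directly into kinetic preferences via transition-state theory, k1/k2 = exp(ΔΔG‡/RT), assuming equal prefactors for the competing pathways. A small sketch of that conversion:

```python
import math

def rate_ratio(ddg_kcal, temp_k=298.15):
    """k_fast / k_slow = exp(ddG / RT) for two competing pathways whose
    activation free energies differ by ddg_kcal (kcal/mol), from
    transition-state theory with equal prefactors."""
    R = 1.987204e-3  # gas constant in kcal/(mol*K)
    return math.exp(ddg_kcal / (R * temp_k))
```

At room temperature the 2.5 kcal/mol preference for the concerted pathway corresponds to roughly a 68:1 rate advantage, while the 1.2 kcal/mol case for (E)-1,3-pentadiene gives only about 8:1, consistent with both cycloadduct and copolymer being observed in the latter reaction.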

  19. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    Science.gov (United States)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004, and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  20. SOCON: a computer model for analyzing the behavior of sodium-concrete reactions

    International Nuclear Information System (INIS)

    Nguyen, D.G.; Muhlestein, L.D.

    1985-03-01

    Guided by experimental evidence available to date, ranging from basic laboratory studies to large-scale tests, a mechanistic computer model (the SOCON model) has been developed to analyze the behavior of SOdium-CONcrete reactions. The model accounts for the thermal, chemical and mechanical phenomena which interact to determine the consequences of the reactions. Reaction-limiting mechanisms could be any process which reduces water release and sodium transport to fresh concrete: the buildup of the inert reaction product layer would increase the resistance to sodium transport; water dry-out would decrease the bubble-agitation transport mechanism. However, stress-induced failure of concrete, such as spalling, crushing and cracking, and a massive release of gaseous products (hydrogen, water vapor and CO2) would increase the transport of sodium to the reaction zone. The results of SOCON calculations are in excellent agreement with measurements obtained from large-scale sodium-limestone concrete reaction tests of durations up to 100 hours conducted at the Hanford Engineering Development Laboratory. 8 refs., 7 figs

  1. Computer code PRECIP-II for the calculation of Zr-steam reaction

    International Nuclear Information System (INIS)

    Suzuki, Motoye; Kawasaki, Satoru; Furuta, Teruo

    1978-06-01

    The computer code PRECIP-II, developed as a modification of S. Malang's SIMTRAN-I, calculates the Zr-steam reaction under LOCA conditions. The following were improved: 1. the treatment of boundary conditions at the alpha/beta phase interface during temperature decrease; 2. the method of time-mesh control; 3. the number of input-controllable parameters, and the output format. These improvements made physically reasonable calculations possible for an increased number of temperature history patterns, including the cladding temperature excursion assumed during a LOCA. Calculations were made along various transient temperature histories, with the parameters modified so as to fit the numerical results for weight gain, oxide thickness and alpha-phase thickness in isothermal reactions to the experimental data. The computed results were then compared with the corresponding experimental values, which revealed that most of the differences lie within +-10%. The effect of slow cooling on the ductility change of Zircaloy-4 was investigated with some of the oxidized specimens by a ring compression test; the effect is only slight. (auth.)
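    Isothermal Zr-steam oxidation of the kind fitted above is commonly described by a parabolic rate law with an Arrhenius rate constant. The sketch below shows that generic form; the constants are placeholders for illustration, not PRECIP-II's fitted values:

```python
import math

# Generic parabolic oxidation kinetics (illustrative constants only):
# weight gain w satisfies w^2 = kp * t, with kp = A * exp(-Q/(R*T)).
def kp(temp_k: float, a: float = 3.3e7, q_over_r: float = 20000.0) -> float:
    """Parabolic rate constant (arbitrary units), Arrhenius form."""
    return a * math.exp(-q_over_r / temp_k)

def weight_gain(temp_k: float, t_s: float) -> float:
    """Weight gain after t_s seconds at temp_k, from w = sqrt(kp * t)."""
    return math.sqrt(kp(temp_k) * t_s)

# Characteristic parabolic behavior: doubling the exposure time increases
# the weight gain by a factor of sqrt(2).
```

A code like PRECIP-II integrates such kinetics along arbitrary temperature histories rather than assuming a single isotherm, which is why the boundary-condition treatment during cooldown matters.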

  2. VECTR: Virtual Environment Computational Training Resource

    Science.gov (United States)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  3. Determining the reaction in kinematic pairs of certain mechanisms using a digital computer

    Energy Technology Data Exchange (ETDEWEB)

    Chifchieva, V N

    1980-01-01

    In Dorr classifiers, walking excavators, conveyors, sieves and other mechanisms, one finds a triad with a sliding pair. An algorithm is proposed for determining the reactions in the kinematic connections of a triad with one, two or three sliding pairs. The algorithm is suitable for use on digital computers. It is based on the transfer function method, and has several advantages over V. Zinovyev's technique for determining reactions in kinematic pairs. A concrete example is given of calculating the reactions in the connections of the crank-and-lever mechanism of a walking excavator.
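    The computation underneath any such algorithm is the solution of static equilibrium equations for each link. A toy sketch of that idea (plain statics, not Chifchieva's transfer-function method): reactions at a pin A and a slider support B of a single link under a vertical load, from sum(Fx) = 0, sum(Fy) = 0 and the moment balance about A:

```python
# Toy statics example: a link of given length supported by a pin at A and
# a slider/roller at B, loaded vertically at load_pos measured from A.
def link_reactions(length: float, load: float, load_pos: float):
    """Return the reactions (Ax, Ay, By) from the equilibrium equations."""
    by = load * load_pos / length   # moment balance about A: By*L = F*c
    ay = load - by                  # vertical force balance: Ay + By = F
    ax = 0.0                        # no horizontal load in this example
    return ax, ay, by
```

For a 2 m link loaded with 100 N at midspan, the load splits evenly between the supports. A triad with sliding pairs yields a larger coupled system of such equations, which is what makes a digital-computer algorithm worthwhile.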

  4. A heterogeneous computing environment to solve the 768-bit RSA challenge

    OpenAIRE

    Kleinjung, Thorsten; Bos, Joppe Willem; Lenstra, Arjen K.; Osvik, Dag Arne; Aoki, Kazumaro; Contini, Scott; Franke, Jens; Thomé, Emmanuel; Jermini, Pascal; Thiémard, Michela; Leyland, Paul; Montgomery, Peter L.; Timofeev, Andrey; Stockinger, Heinz

    2010-01-01

    In December 2009 the 768-bit, 232-digit number RSA-768 was factored using the number field sieve. Overall, the computational challenge would take more than 1700 years on a single, standard core. In the article we present the heterogeneous computing approach, involving different compute clusters and Grid computing environments, used to solve this problem.
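    The "more than 1700 years on a single core" figure explains why a heterogeneous pool of clusters and grids was assembled. A trivial back-of-the-envelope sketch (illustrative arithmetic only, assuming perfect parallel efficiency, which real sieving jobs do not achieve):

```python
# Back-of-the-envelope: distributing a fixed amount of work (core-years)
# over a pool of cores, under the idealized assumption of perfect scaling.
def wall_clock_years(core_years: float, cores: int) -> float:
    """Ideal wall-clock duration in years for the given pool size."""
    return core_years / cores

# 1700 core-years over, say, 2000 cores would still take roughly 10 months
# even with perfect scaling, so pooling many environments was essential.
```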

  5. A Secure Authenticate Framework for Cloud Computing Environment

    OpenAIRE

    Nitin Nagar; Pradeep k. Jatav

    2014-01-01

    Cloud computing is an important option for companies building and deploying their infrastructure and applications. Data storage service in cloud computing is easy compared to other data storage services. At the same time, security in the cloud environment is a challenging task. Security issues range from missing system configuration and a lack of proper updates to unwise user actions with remote data storage. These can expose a user's private data and information to unwanted access. i...

  6. Bridging context management systems for different types of pervasive computing environments

    NARCIS (Netherlands)

    Hesselman, C.E.W.; Benz, Hartmut; Benz, H.P.; Pawar, P.; Liu, F.; Wegdam, M.; Wibbels, Martin; Broens, T.H.F.; Brok, Jacco

    2008-01-01

    A context management system is a distributed system that enables applications to obtain context information about (mobile) users and forms a key component of any pervasive computing environment. Context management systems are however very environment-specific (e.g., specific for home environments)

  7. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, it might help to initiate their contact to the modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  8. A Comparative Study of Load Balancing Algorithms in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Cloud computing is a new trend emerging in the IT environment, with huge requirements for infrastructure and resources. Load balancing is an important aspect of the cloud computing environment. An efficient load balancing scheme ensures efficient resource utilization by provisioning resources to cloud users on demand in a pay-as-you-go manner. Load balancing may even support prioritizing users by applying appropriate scheduling criteria. This paper presents various load balancing schemes in differ...

  9. Equilibrium chemical reaction of supersonic hydrogen-air jets (the ALMA computer program)

    Science.gov (United States)

    Elghobashi, S.

    1977-01-01

    The ALMA (axi-symmetrical lateral momentum analyzer) program is concerned with the computation of two-dimensional coaxial jets with large lateral pressure gradients. The jets may be free or confined, laminar or turbulent, reacting or non-reacting. The reaction chemistry is treated as being at equilibrium.

  10. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Abstract Today cloud computing has become a key technology for the online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and a high job submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between incoming requests and the various resources in the cloud environment, to satisfy the requirements of users and to balance the workload on resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load balance algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner, in order to achieve better performance parameters such as response time, processing time and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results demonstrate that the proposed algorithm dramatically improves response time and data processing time and makes better use of resources compared with the Active Monitor and VM-assign algorithms.
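    As a rough sketch of the idea behind such dynamic load balancing (the abstract does not give DWAM's exact weighting formula, so the load/capacity ratio below is an assumption), each incoming request can be routed to the virtual machine with the lowest weighted load:

```python
import heapq

# Sketch of dynamic weighted load balancing: route each request, on the
# fly, to the VM with the smallest load/capacity ratio. The weighting
# scheme is an illustrative assumption, not DWAM's published formula.
def assign_requests(vm_capacities, request_costs):
    """Return a list mapping each request index to a VM index."""
    # min-heap of (weighted load, vm index, absolute load)
    heap = [(0.0, i, 0.0) for i in range(len(vm_capacities))]
    heapq.heapify(heap)
    placement = []
    for cost in request_costs:
        _, vm, load = heapq.heappop(heap)
        load += cost
        placement.append(vm)
        heapq.heappush(heap, (load / vm_capacities[vm], vm, load))
    return placement
```

With capacities [1.0, 2.0] and six equal requests, the larger VM receives twice as many requests, which is the utilization-proportional behavior a weighted balancer aims for.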

  11. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  12. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    Science.gov (United States)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which occurs frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To do this, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphic editor expresses contexts as execution conditions of a new service through a context model based on ontology. The service platform deals with the service scenario according to contexts. With the suggested service model, a user in urban computing environments can quickly and easily create a u-service or new service using smart devices.

  13. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Full Text Available Recently, the work environments of organizations have been transitioning into smart work environments by applying cloud computing technology to the existing work environment. The smart work environment is characterized by the ability to access information assets inside the company from outside it through cloud computing technology, to share information without restriction on location by using mobile terminals, and to provide a work environment where work can be conducted effectively in various locations and mobile environments. Thus, in the cloud computing-based smart work environment, security risks are changing: the risk of leakage of an organization's information assets through mobile terminals, which have a high risk of loss and theft, increases, as does the risk of hacking of wireless networks in mobile environments. Given these changes in security risk, the reactive digital forensic method, which investigates digital evidence after the occurrence of a security incident, appears to have limits, which has led to a rise in the necessity of proactive digital forensic approaches wherein security incidents can be addressed preemptively. Accordingly, in this research, we design a digital forensic readiness model at the level of preemptive prevention by considering changes in the cloud computing-based smart work environment. First, we investigate previous research related to the cloud computing-based smart work environment and digital forensic readiness, and analyze a total of 50 components of digital forensic readiness. In addition, through analysis of the corresponding preceding research, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function.
Then, we design a draft of the digital forensic readiness model in the cloud

  14. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Full Text Available Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  15. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed, to improve the feasibility of ISs under hybrid cloud computing environments.

  16. A FUNCTIONAL MODEL OF COMPUTER-ORIENTED LEARNING ENVIRONMENT OF A POST-DEGREE PEDAGOGICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Kateryna R. Kolos

    2014-06-01

    Full Text Available The study substantiates the need for a systematic study of the functioning of the computer-oriented learning environment of post-degree pedagogical education; the definition of "functional model of a computer-oriented learning environment of post-degree pedagogical education" is given; and a functional model of a computer-oriented learning environment of post-degree pedagogical education is built in accordance with the functions of business, information and communication technology, academic and administrative staff, and the peculiarities of training courses for teachers.

  17. Urbancontext: A Management Model For Pervasive Environments In User-Oriented Urban Computing

    Directory of Open Access Journals (Sweden)

    Claudia L. Zuniga-Canon

    2014-01-01

    Full Text Available Nowadays, urban computing has gained a lot of interest for guiding the evolution of cities into intelligent environments. These environments are appropriate for individuals' interactions and changing behaviors. These changes require new approaches that allow the understanding of how urban computing systems should be modeled. In this work we present UrbanContext, a new model for the design of urban computing platforms that applies the theory of roles to manage the individual's context in urban environments. The theory of roles helps to understand the individual's behavior within a social environment, allowing urban computing systems to be modeled that are able to adapt to individuals' states and needs. UrbanContext collects data in urban atmospheres and classifies individuals' behaviors according to their changes of role, to optimize social interaction and offer secure services. Likewise, UrbanContext serves as a generic model to provide interoperability, and to facilitate the design, implementation and expansion of urban computing systems.

  18. Fostering computational thinking skills with a tangible blocks programming environment

    OpenAIRE

    Turchi, T; Malizia, A

    2016-01-01

    Computational Thinking has recently returned into the limelight as an essential skill to have for both the general public and disciplines outside Computer Science. It encapsulates those thinking skills integral to solving complex problems using a computer, thus widely applicable in our technological society. Several public initiatives such as the Hour of Code successfully introduced it to millions of people of different ages and backgrounds, mostly using Blocks Programming Environments like S...

  19. Applications of the pipeline environment for visual informatics and genomics computations

    Directory of Open Access Journals (Sweden)

    Genco Alex

    2011-07-01

    Full Text Available Abstract Background Contemporary informatics and genomics research require efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures; the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface; and the integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable the integration of bioinformatics resources, which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. 
The

  20. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  1. Collaborative virtual reality environments for computational science and design

    International Nuclear Information System (INIS)

    Papka, M. E.

    1998-01-01

    The authors are developing a networked, multi-user, virtual-reality-based collaborative environment coupled to one or more petaFLOPs computers, enabling the interactive simulation of 10^9-atom systems. The purpose of this work is to explore the requirements for this coupling. Through the design, development, and testing of such systems, they hope to gain knowledge that allows computational scientists to discover and analyze their results more quickly and in a more intuitive manner

  2. A computational framework for the automated construction of glycosylation reaction networks.

    Science.gov (United States)

    Liu, Gang; Neelamegham, Sriram

    2014-01-01

    Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; ii) automated N-linked glycosylation pathway construction; and iii) the handling and analysis of glycomics based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme biochemistry. 
All
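    A toy sketch of the rule-based network construction described above, with glycans reduced to linear strings and two invented enzymes standing in for real glycosyltransferases (the actual framework stores far richer specificity data, linkage information and branched structures):

```python
# Toy rule-based glycosylation network builder. Enzyme names, the string
# encoding of glycans, and the suffix-based "specificity" are illustrative
# assumptions, not the framework's actual classes.
class Enzyme:
    def __init__(self, name, requires_suffix, adds):
        self.name = name                        # e.g. a glycosyltransferase
        self.requires_suffix = requires_suffix  # crude substrate specificity
        self.adds = adds                        # monosaccharide appended

    def apply(self, glycan):
        """Return the product glycan, or None if the substrate doesn't match."""
        if glycan.endswith(self.requires_suffix):
            return glycan + self.adds
        return None

def build_network(seeds, enzymes, max_size=4):
    """Breadth-first expansion from seed glycans; returns the reachable
    species and the (substrate, enzyme, product) reaction edges."""
    species, edges, frontier = set(seeds), [], list(seeds)
    while frontier:
        g = frontier.pop()
        for e in enzymes:
            p = e.apply(g)
            if p and len(p) <= max_size:
                edges.append((g, e.name, p))
                if p not in species:
                    species.add(p)
                    frontier.append(p)
    return species, edges
```

Starting from a single seed "N" with a hypothetical "GalT" (adds G after N) and "SiaT" (adds S after G), the builder infers the chain N -> NG -> NGS automatically, which is the single-reaction-inference and full-network-reconstruction idea in miniature.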

  3. A computational framework for the automated construction of glycosylation reaction networks.

    Directory of Open Access Journals (Sweden)

    Gang Liu

    Full Text Available Glycosylation is among the most common and complex post-translational modifications identified to date. It proceeds through the catalytic action of multiple enzyme families that include the glycosyltransferases that add monosaccharides to growing glycans, and glycosidases which remove sugar residues to trim glycans. The expression level and specificity of these enzymes, in part, regulate the glycan distribution or glycome of specific cell/tissue systems. Currently, there is no systematic method to describe the enzymes and cellular reaction networks that catalyze glycosylation. To address this limitation, we present a streamlined machine-readable definition for the glycosylating enzymes and additional methodologies to construct and analyze glycosylation reaction networks. In this computational framework, the enzyme class is systematically designed to store detailed specificity data such as enzymatic functional group, linkage and substrate specificity. The new classes and their associated functions enable both single-reaction inference and automated full network reconstruction, when given a list of reactants and/or products along with the enzymes present in the system. In addition, graph theory is used to support functions that map the connectivity between two or more species in a network, and that generate subset models to identify rate-limiting steps regulating glycan biosynthesis. Finally, this framework allows the synthesis of biochemical reaction networks using mass spectrometry (MS) data. The features described above are illustrated using three case studies that examine: (i) O-linked glycan biosynthesis during the construction of functional selectin-ligands; (ii) automated N-linked glycosylation pathway construction; and (iii) the handling and analysis of glycomics-based MS data. Overall, the new computational framework enables automated glycosylation network model construction and analysis by integrating knowledge of glycan structure and enzyme

  4. Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments.

    NARCIS (Netherlands)

    Dewiyanti, Silvia; Brand-Gruwel, Saskia; Jochems, Wim; Broers, Nick

    2008-01-01

    Dewiyanti, S., Brand-Gruwel, S., Jochems, W., & Broers, N. (2007). Students experiences with collaborative learning in asynchronous computer-supported collaborative learning environments. Computers in Human Behavior, 23, 496-514.

  5. A Scheme for Verification on Data Integrity in Mobile Multicloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Laicheng Cao

    2016-01-01

    Full Text Available In order to verify data integrity in a mobile multicloud computing environment, a MMCDIV (mobile multicloud data integrity verification) scheme is proposed. First, the computability and nondegeneracy of verification are obtained by adopting the BLS (Boneh-Lynn-Shacham) short signature scheme. Second, communication overhead is reduced based on HVR (Homomorphic Verifiable Response) with random masking and sMHT (sequence-enforced Merkle hash tree) construction. Finally, considering the resource constraints of mobile devices, data integrity is verified by lightweight computing and low data transmission. The scheme addresses the limitation that mobile device communication and computing power are limited; it supports dynamic data operations in a mobile multicloud environment, and data integrity can be verified without directly using the source file blocks. Experimental results also demonstrate that this scheme can achieve a lower cost of computing and communications.
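    For reference, the generic Merkle hash tree construction that sMHT builds on can be sketched as follows (the sequence-enforcing extensions of sMHT itself are omitted):

```python
import hashlib

# Generic Merkle hash tree: a single root hash commits to all data blocks,
# so any modification to any block changes the root.
def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Root hash over a list of data blocks, duplicating the last node
    whenever a level has an odd number of entries."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]
```

A verifier holding only the root can check any single block against a logarithmic-size path of sibling hashes, which is what makes the approach attractive for resource-constrained mobile devices.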

  6. Effects of alpha and gamma radiation on glass reaction in an unsaturated environment

    International Nuclear Information System (INIS)

    Wronkiewicz, D.J.; Young, J.E.; Bates, J.K.

    1990-01-01

    Radiation may affect the long-term performance of glass in an unsaturated repository site by interacting with air, water vapor, or liquid water. The present study examines (1) the effects of alpha or gamma irradiation in a water vapor environment, and (2) the influence of radiolytic products on glass reaction. Results indicate that nitric and organic acids form in an irradiated water vapor environment and are dissolved in thin films of condensed water. Glass samples exposed to these conditions react faster and have a different assemblage of secondary phases than glasses exposed to nonirradiated water vapor environments. 23 refs., 4 figs., 2 tabs

  7. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities.

  8. Kinetics of the high-temperature combustion reactions of dibutylether using composite computational methods

    KAUST Repository

    Rachidi, Mariam El

    2015-01-01

    This paper investigates the high-temperature combustion kinetics of n-dibutyl ether (n-DBE), including unimolecular decomposition, H-abstraction by H, H-migration, and C–C/C–O β-scission reactions of the DBE radicals. The energetics of H-abstraction by OH radicals is also studied. All rates are determined computationally using the CBS-QB3 and G4 composite methods in conjunction with conventional transition state theory. The B3LYP/6-311++G(2df,2pd) method is used to optimize the geometries and calculate the frequencies of all reactive species and transition states for use in ChemRate. Some of the rates calculated in this study vary markedly from those obtained for similar reactions of alcohols or alkanes, particularly those pertaining to unimolecular decomposition and β-scission at the α–β C–C bond. These variations show that analogies to alkanes and alcohols are, in some cases, inappropriate means of estimating the reaction rates of ethers. This emphasizes the need to establish valid rates through computation or experimentation. Such studies are especially important given that ethers exhibit promising biofuel and fuel additive characteristics. © 2014.

  9. Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments

    OpenAIRE

    Gillette, Stefan E.

    2012-01-01

    The phenomenon of “cloud computing” has become ubiquitous among users of the Internet and many commercial applications. Yet, the U.S. Navy has conducted limited research in this nascent technology. This thesis explores the application and integration of cloud computing both at the shipboard level and in a multi-ship environment. A virtual desktop infrastructure, mirroring a shipboard environment, was built and analyzed in the Cloud Lab at the Naval Postgraduate School, which offers a potentia...

  10. Xcache in the ATLAS Distributed Computing Environment

    CERN Document Server

    Hanushevsky, Andrew; The ATLAS collaboration

    2018-01-01

    Built upon the Xrootd Proxy Cache (Xcache), we developed additional features to adapt Xcache to the ATLAS distributed computing and data environment, especially its data management system RUCIO, to help improve the cache hit rate, as well as features that make Xcache easy to use, similar to the way the Squid cache is used by the HTTP protocol. We are optimizing Xcache for HPC environments and adapting it as a data-delivery component of the HL-LHC Data Lakes design. We packaged the software in CVMFS and in Docker and Singularity containers in order to standardize deployment and reduce the cost of resolving issues at remote sites. We are also integrating it into RUCIO as a volatile storage system, and into various ATLAS workflows such as user analysis,

  11. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    Full Text Available Abstract. The design of computer-based learning environments has undergone a paradigm shift; moving students away from instruction that was considered to promote technical rationality grounded in objectivism, to the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  12. A visualization environment for supercomputing-based applications in computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.

    1993-06-01

    In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations which we have made regarding the implementation of such visualization environments.

  13. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    Science.gov (United States)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, by using open chain glucose (O-Glu)/closed chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  14. Protect Heterogeneous Environment Distributed Computing from Malicious Code Assignment

    Directory of Open Access Journals (Sweden)

    V. S. Gorbatov

    2011-09-01

    Full Text Available The paper describes the practical implementation of a system that protects distributed computing in a heterogeneous environment from malicious code assignment. The choice of technologies, the design of the data structures, and a performance evaluation of the implemented security system are presented.

  15. Individual Differences in Behavioural Reaction to a Changing Environment in Mice and Rats

    NARCIS (Netherlands)

    Benus, R.F.; Koolhaas, J.M.; Oortmerssen, G.A. van

    1987-01-01

    Aggressive and non-aggressive male mice differ in their reaction to a changing social environment. In order to investigate whether this differentiation also holds for non-social situations, male mice are trained in a standard maze task, whereafter a change (extramaze and intramaze, respectively) is

  16. Tacit knowledge in action: basic notions of knowledge sharing in computer supported work environments

    OpenAIRE

    Mackenzie Owen, John

    2001-01-01

    An important characteristic of most computer supported work environments is the distribution of work over individuals or teams in different locations. This leads to what we nowadays call `virtual' environments. In these environments communication between actors is to a large degree mediated, i.e. established through communications media (telephone, fax, computer networks) rather than face-to-face. Unfortunately, mediated communication limits the effectiveness of knowledge exchange in virt...

  17. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  18. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  19. A synthetic computational environment: To control the spread of respiratory infections in a virtual university

    Science.gov (United States)

    Ge, Yuanzheng; Chen, Bin; Liu, Liang; Qiu, Xiaogang; Song, Hongbin; Wang, Yong

    2018-02-01

    An individual-based computational environment provides an effective solution to study complex social events by reconstructing scenarios. Challenges remain in reconstructing the virtual scenarios and reproducing the complex evolution. In this paper, we propose a framework to reconstruct a synthetic computational environment, reproduce the epidemic outbreak, and evaluate management interventions in a virtual university. The reconstructed computational environment includes four fundamental components: the synthetic population, behavior algorithms, multiple social networks, and the geographic campus environment. In the virtual university, influenza H1N1 transmission experiments are conducted, and gradually enhanced interventions are evaluated and compared quantitatively. The experiment results indicate that the reconstructed virtual environment provides a solution to reproduce complex emergencies and evaluate policies to be executed in the real world.
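
A compartmental caricature of the outbreak-plus-intervention experiments can be written in a few lines. The deterministic SIR model below stands in for the agent-based campus simulation, and the contact-rate reduction and its start day are hypothetical values, not the paper's interventions:

```python
# Minimal daily-step SIR sketch; beta/gamma and the day-20 halving of the
# contact rate are illustrative assumptions.
def sir(beta, gamma, s0, i0, days, intervention_day=None, reduction=0.5):
    s, i, r = s0, i0, 0.0
    peak = i
    for day in range(days):
        b = beta * (reduction if intervention_day is not None and day >= intervention_day else 1.0)
        new_inf = b * s * i / (s + i + r)   # new infections this day
        new_rec = gamma * i                 # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

baseline = sir(0.4, 0.2, 9990.0, 10.0, 120)
mitigated = sir(0.4, 0.2, 9990.0, 10.0, 120, intervention_day=20)
print(round(baseline), round(mitigated))
```

Comparing peak prevalence with and without the intervention mirrors, at the coarsest level, the quantitative comparison of gradually enhanced interventions described above.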

  20. Operational computer graphics in the flight dynamics environment

    Science.gov (United States)

    Jeletic, James F.

    1989-01-01

    Over the past five years, the Flight Dynamics Division of the National Aeronautics and Space Administration's (NASA's) Goddard Space Flight Center has incorporated computer graphics technology into its operational environment. In an attempt to increase the effectiveness and productivity of the Division, computer graphics software systems have been developed that display spacecraft tracking and telemetry data in 2-d and 3-d graphic formats that are more comprehensible than the alphanumeric tables of the past. These systems vary in functionality from real-time mission monitoring systems, to mission planning utilities, to system development tools. Here, the capabilities and architecture of these systems are discussed.

  1. Radiolytic oxidation of propane: computer modeling of the reaction scheme

    International Nuclear Information System (INIS)

    Gupta, A.K.; Hanrahan, R.J.

    1991-01-01

    The oxidation of gaseous propane under gamma radiolysis was studied at 100 torr pressure and 25 °C, at oxygen pressures from 1 to 15 torr. Major oxygen-containing products and their G-values with 10% added oxygen are as follows: acetone, 0.98; i-propyl alcohol, 0.86; propionaldehyde, 0.43; n-propyl alcohol, 0.11; acrolein, 0.14; and allyl alcohol, 0.038. The formation of major oxygen-containing products was explained on the basis that the alkyl radicals combine with molecular oxygen to give peroxyl radicals; the peroxyl radicals react with one another to give alkoxyl radicals, which in turn react with one another to form carbonyl compounds and alcohols. The reaction scheme for the formation of major products was examined using computer modeling based on a mechanism involving 28 reactions. Yields could be brought into agreement with the data within experimental error in nearly all cases. (author)
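
The kind of mechanism modeling described above can be sketched by integrating a toy three-step peroxyl/alkoxyl scheme with explicit Euler steps. The rate constants and concentrations below are arbitrary illustrative values, not the fitted 28-reaction mechanism:

```python
# Hypothetical rate constants and initial concentrations (arbitrary units):
k1 = 1.0   # R + O2   -> RO2          (peroxyl formation)
k2 = 0.5   # RO2 + RO2 -> 2 RO        (alkoxyl formation)
k3 = 0.8   # RO + RO  -> carbonyl + alcohol (counted as one product pair)

c = {"R": 1.0, "O2": 5.0, "RO2": 0.0, "RO": 0.0, "products": 0.0}
dt, steps = 1e-3, 20000

for _ in range(steps):
    r1 = k1 * c["R"] * c["O2"]
    r2 = k2 * c["RO2"] ** 2
    r3 = k3 * c["RO"] ** 2
    c["R"]   -= r1 * dt
    c["O2"]  -= r1 * dt
    c["RO2"] += (r1 - 2 * r2) * dt
    c["RO"]  += (2 * r2 - 2 * r3) * dt
    c["products"] += r3 * dt          # pairs of (carbonyl + alcohol)

print({k: round(v, 3) for k, v in c.items()})
```

The carbon skeleton is conserved (R + RO2 + RO + 2·products stays constant), which is the kind of mass-balance check one would apply to the full 28-reaction model.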

  2. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  3. A computer program incorporating Pitzer's equations for calculation of geochemical reactions in brines

    Science.gov (United States)

    Plummer, Niel; Parkhurst, D.L.; Fleming, G.W.; Dunkle, S.A.

    1988-01-01

    The program named PHRQPITZ is a computer code capable of making geochemical calculations in brines and other electrolyte solutions to high concentrations using the Pitzer virial-coefficient approach for activity-coefficient corrections. Reaction-modeling capabilities include calculation of (1) aqueous speciation and mineral-saturation index, (2) mineral solubility, (3) mixing and titration of aqueous solutions, (4) irreversible reactions and mineral-water mass transfer, and (5) reaction path. The computed results for each aqueous solution include the osmotic coefficient, water activity, mineral saturation indices, mean activity coefficients, total activity coefficients, and scale-dependent values of pH, individual-ion activities, and individual-ion activity coefficients. A database of Pitzer interaction parameters is provided at 25 °C for the system Na-K-Mg-Ca-H-Cl-SO4-OH-HCO3-CO3-CO2-H2O, and extended to include largely untested literature data for Fe(II), Mn(II), Sr, Ba, Li, and Br with provision for calculations at temperatures other than 25 °C. An extensive literature review of published Pitzer interaction parameters for many inorganic salts is given. Also described is an interactive input code for PHRQPITZ called PITZINPT. (USGS)
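
One of the outputs listed above, the mineral saturation index, is simple to illustrate: SI = log10(IAP/Ksp), with SI > 0 indicating oversaturation. The sketch below uses hypothetical ion activities for gypsum and omits the Pitzer activity-coefficient machinery that PHRQPITZ actually provides:

```python
import math

def saturation_index(ion_activities, ksp):
    """SI = log10(IAP / Ksp) for a dissolution reaction, where
    ion_activities maps species -> (activity, stoichiometric coefficient)."""
    log_iap = sum(nu * math.log10(a) for a, nu in ion_activities.values())
    return log_iap - math.log10(ksp)

# Hypothetical activities for gypsum, CaSO4.2H2O (Ksp ~ 10**-4.58 at 25 C):
gypsum = {
    "Ca+2":  (10**-2.3, 1),
    "SO4-2": (10**-2.2, 1),
    "H2O":   (0.98, 2),
}
si = saturation_index(gypsum, 10**-4.58)
print(round(si, 2))
```

In the real code the activities themselves come from the Pitzer virial-coefficient corrections; here they are simply given.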

  4. A computational approach to extinction events in chemical reaction networks with discrete state spaces.

    Science.gov (United States)

    Johnston, Matthew D

    2017-12-01

    Recent work of Johnston et al. has produced sufficient conditions on the structure of a chemical reaction network which guarantee that the corresponding discrete state space system exhibits an extinction event. The conditions consist of a series of systems of equalities and inequalities on the edges of a modified reaction network called a domination-expanded reaction network. In this paper, we present a computational implementation of these conditions written in Python and apply the program on examples drawn from the biochemical literature. We also run the program on 458 models from the European Bioinformatics Institute's BioModels Database and report our results. Copyright © 2017 Elsevier Inc. All rights reserved.
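
The structural conditions of the paper are analytical, but the phenomenon they certify is easy to observe by direct stochastic simulation of a discrete-state network. The toy birth-death system below (with made-up rate constants, not the domination-expanded analysis) almost always reaches the extinction state A = 0:

```python
import random

random.seed(1)

# Toy network on discrete counts: A -> 2A (rate k1*A), A -> 0 (rate k2*A).
# With k2 > k1 the process is subcritical and extinction occurs with
# probability one; the simulation below just observes that directly.
def simulate(a0, k1, k2, t_max):
    a, t = a0, 0.0
    while t < t_max and a > 0:
        rates = [k1 * a, k2 * a]
        total = sum(rates)
        t += random.expovariate(total)       # time to next event
        if random.random() * total < rates[0]:
            a += 1                           # birth
        else:
            a -= 1                           # death
    return a

extinctions = sum(simulate(5, 1.0, 1.2, 1000.0) == 0 for _ in range(200))
print(extinctions, "of 200 runs went extinct")
```

Note the contrast with a deterministic ODE model of the same system, whose concentration decays to zero only asymptotically; the discrete state space is what makes a genuine extinction event possible.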

  5. Computation of 3D form factors in complex environments

    International Nuclear Information System (INIS)

    Coulon, N.

    1989-01-01

    The calculation of radiant interchange among opaque surfaces in a complex environment poses the general problem of determining the visible and hidden parts of the environment. In many thermal engineering applications, surfaces are separated by radiatively non-participating media and may be idealized as diffuse emitters and reflectors. Consequently, the net radiant energy fluxes are intimately related to purely geometrical quantities called form factors that take into account hidden parts: the problem is reduced to form factor evaluation. This paper presents the method developed for the computation of 3D form factors in the finite-element module of the system TRIO, which is a general computer code for thermal and fluid flow analysis. The method is derived from an algorithm devised for synthetic image generation. A comparison is performed with the standard contour integration method, also implemented and suited to convex geometries. Several illustrative examples of finite-element thermal calculations in radiating enclosures are given.
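
A common alternative to contour integration, close in spirit to the image-synthesis algorithms mentioned above, is Monte Carlo ray casting. The sketch below estimates the form factor from a differential element to a parallel, coaxial disk, a configuration whose analytic answer r²/(r² + h²) is known, so the estimate can be checked:

```python
import math, random

random.seed(42)

def form_factor_mc(r, h, n=200_000):
    """Monte Carlo form factor from a differential element to a parallel,
    coaxial disk of radius r at distance h, using cosine-weighted rays."""
    hits = 0
    for _ in range(n):
        sin2 = random.random()            # cosine-weighted: sin^2(theta) ~ U(0,1)
        sin_t = math.sqrt(sin2)
        cos_t = math.sqrt(1.0 - sin2)
        # The ray hits the z = h plane at radial distance h * tan(theta)
        if h * sin_t / cos_t <= r:
            hits += 1
    return hits / n

r, h = 1.0, 2.0
analytic = r * r / (r * r + h * h)
est = form_factor_mc(r, h)
print(est, analytic)
```

Ray casting generalizes directly to occluded (hidden-part) geometries, which is exactly where closed-form and contour-integral methods break down.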

  6. Self-propagating exothermic reaction analysis in Ti/Al reactive films using experiments and computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Seema, E-mail: seema.sen@tu-ilmenau.de [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany); Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Lake, Markus; Kroppen, Norman; Farber, Peter; Wilden, Johannes [Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Schaaf, Peter [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany)

    2017-02-28

    Highlights: • Development of nanoscale Ti/Al multilayer films with 1:1, 1:2 and 1:3 molar ratios. • Characterization of exothermic reaction propagation by experiments and simulation. • The reaction velocity depends on the ignition potentials and molar ratios of the films. • Only 1Ti/3Al films exhibit the unsteady reaction propagation with ripple formation. • CFD simulation shows the time dependent atom mixing and temperature flow during exothermic reaction. - Abstract: This study describes the self-propagating exothermic reaction in Ti/Al reactive multilayer foils by using experiments and computational fluid dynamics simulation. Ti/Al foils with molar ratios of 1Ti/1Al, 1Ti/2Al and 1Ti/3Al were fabricated by the magnetron sputtering method. Microstructural characteristics of the unreacted and reacted foils were analyzed using electron and atomic force microscopy. After electrical ignition, the influence of the ignition potential on reaction propagation was investigated experimentally. The reaction front propagates with a velocity between a minimum of 0.68 ± 0.4 m/s and a maximum of 2.57 ± 0.6 m/s, depending on the input ignition potential and the chemical composition. The 1Ti/3Al reactive foil exhibits both steady-state and unsteady wavelike reaction propagation. Moreover, the numerical computational fluid dynamics (CFD) simulation shows the time-dependent temperature flow and atomic mixing in a nanoscale reaction zone. The CFD simulation also indicates the potential of simulating the exothermic reaction in the nanoscale Ti/Al foil.

  7. Weighted Local Active Pixel Pattern (WLAPP) for Face Recognition in Parallel Computation Environment

    Directory of Open Access Journals (Sweden)

    Gundavarapu Mallikarjuna Rao

    2013-10-01

    Full Text Available Abstract - The availability of multi-core technology has resulted in a totally new computational era. Researchers are keen to explore the potential of state-of-the-art machines for breaking the barrier imposed by serial computation. Face recognition is a challenging application in any computational environment. The main difficulty of traditional face recognition algorithms is their lack of scalability. In this paper, Weighted Local Active Pixel Pattern (WLAPP), a new scalable face recognition algorithm suitable for parallel environments, is proposed. Local Active Pixel Pattern (LAPP) is found to be simple and computationally inexpensive compared to Local Binary Patterns (LBP). WLAPP is developed based on the concept of LAPP. The experimentation is performed on the FG-Net Aging Database with deliberately introduced 20% distortion, and the results are encouraging. Keywords — Active pixels, Face Recognition, Local Binary Pattern (LBP), Local Active Pixel Pattern (LAPP), Pattern computing, parallel workers, template, weight computation.
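
For reference, the Local Binary Pattern that LAPP is compared against is computed per pixel as follows. This is the textbook 3x3 LBP, not the WLAPP operator itself:

```python
def lbp_value(patch):
    """Basic 3x3 Local Binary Pattern: threshold the 8 neighbours against
    the centre pixel and pack the bits clockwise from the top-left."""
    c = patch[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (i, j) in enumerate(order):
        if patch[i][j] >= c:
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
print(lbp_value(patch))   # 241
```

Because each pixel's code depends only on its own 3x3 neighbourhood, operators of this family parallelize naturally over image tiles, which is the scalability property the paper exploits.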

  8. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    Science.gov (United States)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronization and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience community can deploy and manage applications by using base virtual machine images or customized virtual machines, analyze big datasets by using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance based on the communications between ECITE and participant projects, and then the scientists or IT technicians in those projects launch one or multiple virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their codes, documents or data without caring about the heterogeneity in structure and operations among different cloud platforms.

  9. When three traits make a line: evolution of phenotypic plasticity and genetic assimilation through linear reaction norms in stochastic environments.

    Science.gov (United States)

    Ergon, T; Ergon, R

    2017-03-01

    Genetic assimilation emerges from selection on phenotypic plasticity. Yet, commonly used quantitative genetics models of linear reaction norms considering intercept and slope as traits do not mimic the full process of genetic assimilation. We argue that intercept-slope reaction norm models are insufficient representations of genetic effects on linear reaction norms and that considering reaction norm intercept as a trait is unfortunate because the definition of this trait relates to a specific environmental value (zero) and confounds genetic effects on reaction norm elevation with genetic effects on environmental perception. Instead, we suggest a model with three traits representing genetic effects that, respectively, (i) are independent of the environment, (ii) alter the sensitivity of the phenotype to the environment and (iii) determine how the organism perceives the environment. The model predicts that, given sufficient additive genetic variation in environmental perception, the environmental value at which reaction norms tend to cross will respond rapidly to selection after an abrupt environmental change, and eventually becomes equal to the new mean environment. This readjustment of the zone of canalization becomes completed without changes in genetic correlations, genetic drift or imposing any fitness costs of maintaining plasticity. The asymptotic evolutionary outcome of this three-trait linear reaction norm generally entails a lower degree of phenotypic plasticity than the two-trait model, and maximum expected fitness does not occur at the mean trait values in the population. © 2016 The Authors. Journal of Evolutionary Biology published by John Wiley & Sons Ltd on behalf of European Society for Evolutionary Biology.
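
The three-trait model can be made concrete with a few lines of arithmetic: writing the phenotype as y = a + b(ε − c), all genotypes that share the perceived reference environment c express y = a at ε = c, whatever their slopes b, which is the crossing point (zone of canalization) discussed above. The numbers below are arbitrary:

```python
# Three-trait linear reaction norm y = a + b * (eps - c): a is the
# environment-independent effect, b the plasticity slope, c the perceived
# reference environment. All values here are arbitrary illustrations.
a, c = 10.0, 2.0
slopes = [0.5, 1.0, -0.7]     # hypothetical genotypic values of b

def phenotype(b, eps):
    return a + b * (eps - c)

# Genotypes sharing c all cross at eps = c, expressing y = a there
# regardless of slope.
at_c = {b: phenotype(b, c) for b in slopes}
print(at_c)
```

Under the model's prediction, selection after an abrupt environmental change shifts the population mean of c toward the new mean environment, moving this crossing point without any change in the slopes themselves.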

  10. Computed potential energy surfaces for chemical reactions

    Science.gov (United States)

    Walch, Stephen P.

    1988-01-01

    The minimum energy path for the addition of a hydrogen atom to N2 is characterized in CASSCF/CCI calculations using the (4s3p2d1f/3s2p1d) basis set, with additional single point calculations at the stationary points of the potential energy surface using the (5s4p3d2f/4s3p2d) basis set. These calculations represent the most extensive set of ab initio calculations completed to date, yielding a zero point corrected barrier for HN2 dissociation of approx. 8.5 kcal/mol. The lifetime of the HN2 species is estimated from the calculated geometries and energetics using both conventional Transition State Theory and a method which utilizes an Eckart barrier to compute one-dimensional quantum mechanical tunneling effects. It is concluded that the lifetime of the HN2 species is very short, greatly limiting its role in both termolecular recombination reactions and combustion processes.
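
The conventional TST estimate mentioned above has a compact closed form, k = κ (kB T / h) (Q‡/Q) exp(−E0 / kB T). The sketch below uses the simpler Wigner tunneling correction rather than the paper's Eckart barrier, a unit partition-function ratio, and a hypothetical 1500i cm⁻¹ barrier frequency; only the 8.5 kcal/mol barrier comes from the abstract:

```python
import math

KB = 1.380649e-23      # Boltzmann constant, J/K
H  = 6.62607015e-34    # Planck constant, J*s
NA = 6.02214076e23     # Avogadro's number, 1/mol
C_CM = 2.99792458e10   # speed of light, cm/s

def tst_rate(T, barrier_kcal_mol, q_ratio, imag_freq_cm=None):
    """Conventional TST: k = kappa * (kB*T/h) * (Q_ts/Q_react) * exp(-E0/kB*T).
    kappa is the Wigner tunneling correction, used here instead of the
    Eckart treatment for brevity."""
    e0 = barrier_kcal_mol * 4184.0 / NA          # barrier, J per molecule
    kappa = 1.0
    if imag_freq_cm is not None:
        u = H * imag_freq_cm * C_CM / (KB * T)   # h*nu / (kB*T) for barrier mode
        kappa = 1.0 + u * u / 24.0
    return kappa * (KB * T / H) * q_ratio * math.exp(-e0 / (KB * T))

# HN2 dissociation with the abstract's 8.5 kcal/mol barrier; the
# partition-function ratio and barrier frequency are placeholders.
k = tst_rate(300.0, 8.5, 1.0, 1500.0)
print(f"{k:.3e} s^-1")
```

The inverse of such a rate gives the order-of-magnitude lifetime estimate; a sub-microsecond lifetime is what underlies the paper's conclusion that HN2 is too short-lived to matter in recombination and combustion.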

  11. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Full Text Available Cloud computing is emerging as a high performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system. The key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that their scheduling strategies are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which is gaining a lot of attention recently because workflows have emerged as a paradigm to represent complex computing problems. We propose a novel algorithm extending the nature-inspired Intelligent Water Drops (IWD) algorithm to optimize the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within a workflow simulation toolkit and tested in different simulated cloud environments with different cost models. We compared the proposed IWD-based algorithm with other well-known scheduling algorithms, including MIN-MIN, MAX-MIN, Round Robin, FCFS, MCT, PSO and C-PSO; the proposed algorithm presented noticeable enhancements in performance and cost in most situations.
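
MIN-MIN, one of the baselines the IWD-based scheduler is compared against, is easy to state: repeatedly pick the task whose earliest completion time over all VMs is smallest and assign it there. A minimal sketch with hypothetical task lengths and VM speeds (independent tasks, no workflow dependencies):

```python
# Generic MIN-MIN baseline sketch; task lengths and VM speeds are made up.
def min_min(task_lengths, vm_speeds):
    ready = {vm: 0.0 for vm in range(len(vm_speeds))}   # VM ready times
    schedule = {}
    remaining = dict(enumerate(task_lengths))
    while remaining:
        best = None                                     # (finish, task, vm)
        for t, length in remaining.items():
            for vm, speed in enumerate(vm_speeds):
                finish = ready[vm] + length / speed
                if best is None or finish < best[0]:
                    best = (finish, t, vm)
        finish, t, vm = best
        ready[vm] = finish
        schedule[t] = vm
        del remaining[t]
    return schedule, max(ready.values())                # assignment, makespan

schedule, makespan = min_min([4, 2, 8, 6], [1.0, 2.0])
print(schedule, makespan)
```

A workflow-aware scheduler such as the IWD extension must additionally respect task precedence edges and weigh monetary cost, which is what makes the cloud version of the problem NP-hard in practice.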

  12. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high-resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  13. Distributed multiscale computing with MUSCLE 2, the Multiscale Coupling Library and Environment

    NARCIS (Netherlands)

    Borgdorff, J.; Mamonski, M.; Bosak, B.; Kurowski, K.; Ben Belgacem, M.; Chopard, B.; Groen, D.; Coveney, P.V.; Hoekstra, A.G.

    2014-01-01

    We present the Multiscale Coupling Library and Environment: MUSCLE 2. This multiscale component-based execution environment has a simple to use Java, C++, C, Python and Fortran API, compatible with MPI, OpenMP and threading codes. We demonstrate its local and distributed computing capabilities and

  14. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    NARCIS (Netherlands)

    Molenaar, I.; Roda, Claudia; van Boxtel, Carla A.M.; Sleegers, P.J.C.

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N = 56) are supported with computer-generated scaffolds and students in the control condition (N =

  15. Efficient Computation of Transition State Resonances and Reaction Rates from a Quantum Normal Form

    NARCIS (Netherlands)

    Schubert, Roman; Waalkens, Holger; Wiggins, Stephen

    2006-01-01

    A quantum version of a recent formulation of transition state theory in phase space is presented. The theory developed provides an algorithm to compute quantum reaction rates and the associated Gamow-Siegert resonances with very high accuracy. The algorithm is especially efficient for

  16. The Effect of the Equatorial Environment on Oxo-Group Silylation of the Uranyl Dication: A Computational Study

    International Nuclear Information System (INIS)

    Yahia, A.; Maron, L.; Yahia, A.; Arnold, P.L.; Love, J.B.

    2010-01-01

    A theoretical investigation of the reductive oxo-group silylation reaction of the uranyl dication held in a Pacman macrocyclic environment has been carried out. The effect of the modeling of the Pacman ligand on the reaction profiles is found to be important, with the dipotassiation of a single oxo group identified as a key component in promoting the reaction between the Si-X and uranium-oxo bonds. This reductive silylation reaction is also proposed to occur in an aqueous environment but was found not to operate on bare ions; in this latter case, substitution of a ligand in the equatorial plane was the most likely reaction. These results demonstrate the importance of the presence but not the identity of the equatorial ligands upon the silylation of the uranyl U-O bond. (authors)

  17. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI)...the differences in interaction when compared with traditional human computer interfaces. This paper provides analysis via usability study methods

  18. A prospective survey of delayed adverse reactions to iohexol in urography and computed tomography

    International Nuclear Information System (INIS)

    Munechika, Hirotsugu; Hiramatsu, Yoshihiro; Kudo, Sho; Sugimura, Kazuro; Hamada, Chikuma; Yamaguchi, Koichi; Katayama, Hitoshi

    2003-01-01

    We investigated 7505 inpatients who underwent intravenous urography or contrast-enhanced computed tomography to assess risk factors for delayed adverse drug reactions to iohexol, a non-ionic iodinated contrast medium. Focusing on delayed adverse reactions, all adverse events were prospectively investigated for 7 days after injection of iohexol. To explore the relevant risk factors, the relationship between occurrence of adverse reactions to iohexol and 17 different variables was evaluated by logistic regression analysis. To assess the influence of seasonal factors, adverse reactions were separately evaluated during two periods: February to April (the pollinosis period in Japan) and July to September (the non-pollinosis period). The prevalence of delayed adverse events and delayed adverse reactions was 3.5 and 2.8%, respectively, whereas the prevalence of adverse events and adverse reactions was 5.7 and 5.0%, respectively. Multivariate analysis showed that six parameters had a significant influence on delayed adverse reactions to iohexol, including (a) a history of allergy, (b) season, (c) radiographic procedure, (d) age, (e) concomitant surgery or other invasive procedures, and (f) concomitant medication. The prevalence of delayed reactions was lower than in previous large-scale studies. Significant risk factors included a history of allergy and performance of radiography during the pollinosis period, suggesting that allergy was involved in delayed adverse reactions. The type of radiographic procedure also had an influence. (orig.)

  19. Computing multi-species chemical equilibrium with an algorithm based on the reaction extents

    DEFF Research Database (Denmark)

    Paz-Garcia, Juan Manuel; Johannesson, Björn; Ottosen, Lisbeth M.

    2013-01-01

    A mathematical model for the solution of a set of chemical equilibrium equations in a multi-species and multiphase chemical system is described. The computer-aided solution of the model is achieved by means of a Newton-Raphson method enhanced with a line-search scheme, which deals with the non-negative constraints. The residual function, representing the distance to equilibrium, is defined from the chemical potential (or Gibbs energy) of the chemical system. Local minima are potentially avoided by prioritizing the aqueous reactions with respect to the heterogeneous reactions. The formation and release of gas bubbles is taken into account in the model, limiting the concentration of volatile aqueous species to a maximum value given by the gas solubility constant. The reaction extents are used as state variables for the numerical method. As a result, the accepted solution satisfies the charge...
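The abstract's core scheme (Newton-Raphson on the reaction extent, with a line search that keeps all mole numbers positive and a residual defined from chemical potentials) can be illustrated on a toy single-reaction system A ⇌ B. All species, standard potentials and constants below are invented for illustration; they are not taken from the paper:

```python
import math

R, T = 8.314, 298.15           # gas constant (J/(mol K)) and temperature (K)
mu0_A, mu0_B = 0.0, -2000.0    # hypothetical standard chemical potentials (J/mol)
n0 = 1.0                       # initial moles of A; B starts at zero

def residual(xi):
    """Reaction affinity of A <-> B: the 'distance to equilibrium', zero at the solution."""
    nA, nB = n0 - xi, xi
    return (mu0_B + R * T * math.log(nB / n0)) - (mu0_A + R * T * math.log(nA / n0))

def jacobian(xi):
    """Derivative of the residual with respect to the reaction extent."""
    return R * T * (1.0 / xi + 1.0 / (n0 - xi))

xi = 0.5                       # initial guess for the reaction extent
for _ in range(50):
    f = residual(xi)
    if abs(f) < 1e-10:
        break
    step = -f / jacobian(xi)   # Newton-Raphson step
    lam = 1.0                  # line search: shrink the step until mole numbers stay positive
    while not (0.0 < xi + lam * step < n0):
        lam *= 0.5
    xi += lam * step
```

At convergence the extent satisfies the analytic equilibrium condition xi/(n0 - xi) = exp(-(mu0_B - mu0_A)/RT), which gives a direct check on the iteration.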

  20. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies.

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar

    2017-09-12

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE were carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels with methods based on direct measurement of the ground reactions showing highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show highest practicality of the three classes of methods reviewed. 
Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the

  2. Occurrence, dynamics and reactions of organic pollutants in the indoor environment

    Energy Technology Data Exchange (ETDEWEB)

    Salthammer, Tunga [Material Analysis and Indoor Chemistry, Fraunhofer Wilhelm-Klauditz Institut (WKI), Braunschweig (Germany); Bahadir, Muefit [Institut fuer Oekologische Chemie und Abfallanalytik, Technische Universitaet Braunschweig, Braunschweig (Germany)

    2009-06-15

    The indoor environment is a multidisciplinary scientific field involving chemistry, physics, biology, health sciences, architecture, building sciences and civil engineering. The need for reliable assessment of human exposure to indoor pollutants is attracting increasing attention. This, however, requires a detailed understanding of the relevant compounds, their sources, physical and chemical properties, dynamics, reactions, their distribution among the gas phase, airborne particles and settled dust as well as the availability of modern measurement techniques. Building products, furnishings and other indoor materials often emit volatile and semi-volatile organic compounds. With respect to a healthy indoor environment, only low emitting products, which do not influence indoor air quality in a negative way, should be used in a building. Therefore, materials and products for indoor use need to be evaluated for their chemical emissions. This is routinely done in test chambers and cells. Many studies have shown that the types of sources in occupational and residential indoor environments, the spectrum of emitting compounds and the duration of emission cover a wide range. The demand for standardized test methods under laboratory conditions has resulted in several guidelines for determination of emission rates. Furthermore, it has now been recognized that both primary and secondary emissions may affect indoor air quality. The problem may become more dominant when components of different materials can react with each other or when catalytic materials are applied. Such products derived from indoor related reactions may have a negative impact on indoor air quality due to their low odor threshold, health related properties or the formation of ultrafine particles. 
Several factors can influence the emission characteristics and numerous investigations have shown that indoor chemistry is of particular importance for the indoor related characterization of building product emissions

  3. Computational Laboratory Astrophysics to Enable Transport Modeling of Protons and Hydrogen in Stellar Winds, the ISM, and other Astrophysical Environments

    Science.gov (United States)

    Schultz, David

    As recognized prominently by the APRA program, interpretation of NASA astrophysical mission observations requires significant products of laboratory astrophysics, for example, spectral lines and transition probabilities, and electron-, proton-, or heavy-particle collision data. The availability of these data underpins robust and validated models of astrophysical emissions and absorptions; energy, momentum, and particle transport; dynamics; and reactions. Therefore, measured or computationally derived, analyzed, and readily available laboratory astrophysics data significantly enhance the scientific return on NASA missions such as HST, Spitzer, and JWST. In the present work a comprehensive set of data will be developed for the ubiquitous proton-hydrogen and hydrogen-hydrogen collisions in astrophysical environments including ISM shocks, supernova remnants and bubbles, HI clouds, young stellar objects, and winds within stellar spheres, covering the necessary wide range of energy- and charge-changing channels, collision energies, and most relevant scattering parameters. In addition, building on preliminary work, a transport and reaction simulation will be developed incorporating the elastic and inelastic collision data collected and produced. The work will build upon significant previous efforts of the principal investigators and collaborators, will result in a comprehensive data set required for modeling these environments and interpreting NASA astrophysical mission observations, and will benefit from feedback from collaborators who are active users of the work proposed.

  4. Preserving access to ALEPH computing environment via virtual machines

    International Nuclear Information System (INIS)

    Coscetti, Simone; Boccali, Tommaso; Arezzini, Silvia; Maggi, Marcello

    2014-01-01

    The ALEPH Collaboration [1] took data at the LEP (CERN) electron-positron collider in the period 1989-2000, producing more than 300 scientific papers. While most of the Collaboration's activities stopped in recent years, the data collected still have physics potential: new theoretical models are emerging that call for checks against data at the Z and WW production energies. An attempt to revive and preserve the ALEPH Computing Environment is presented; the aim is not only the preservation of the data files (usually called bit preservation), but of the full environment a physicist would need to perform brand-new analyses. Technically, a Virtual Machine approach has been chosen, using the VirtualBox platform. Concerning simulated events, the full chain from event generators to physics plots is possible, and reprocessing of data events is also functioning. Interactive tools like the DALI event display can be used on both data and simulated events. The Virtual Machine approach is suited both for interactive usage and for massive computing using Cloud-like approaches.

  5. A computational environment for creating and testing reduced chemical kinetic mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Montgomery, C.J.; Swensen, D.A.; Harding, T.V.; Cremer, M.A.; Bockelie, M.J. [Reaction Engineering International, Salt Lake City, UT (USA)

    2002-02-01

    This paper describes software called the Computer-Assisted Reduced Mechanism Problem-Solving Environment (CARM-PSE), which gives the engineer the ability to rapidly set up, run and examine large numbers of problems comparing detailed and reduced (approximate) chemistry. CARM-PSE integrates the automatic chemical mechanism reduction code CARM and codes that simulate perfectly stirred reactors and plug-flow reactors into a user-friendly computational environment. CARM-PSE gives the combustion engineer the ability to easily test chemical approximations over many hundreds of combinations of inputs in a multidimensional parameter space. The demonstration problems compare detailed and reduced chemical kinetic calculations for methane-air combustion, including nitrogen oxide formation, in a stirred reactor and for selective non-catalytic reduction of NOx in coal-combustion flue gas.
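The kind of multidimensional parameter sweep such an environment automates can be sketched generically: evaluate a detailed and a reduced model over a grid of inputs and record the worst disagreement. The two rate models and the error metric below are hypothetical stand-ins, not CARM's chemistry:

```python
import itertools
import math

def detailed_rate(T, phi):
    """Stand-in for a detailed-chemistry rate: Arrhenius-like dependence on temperature."""
    return phi * math.exp(-15000.0 / T)

def reduced_rate(T, phi):
    """Stand-in for a reduced mechanism: the detailed rate with a small fitting error."""
    return phi * math.exp(-15000.0 / T) * (1.0 + 0.02 * (phi - 1.0))

temps = range(1200, 2001, 100)        # temperature grid (K)
phis = [0.6, 0.8, 1.0, 1.2, 1.4]      # equivalence-ratio grid

# Sweep the full parameter space and record the worst relative error of the reduced model:
worst = max(abs(reduced_rate(T, p) / detailed_rate(T, p) - 1.0)
            for T, p in itertools.product(temps, phis))
```

With these stand-in models the worst relative error is 2% of the largest equivalence-ratio offset, i.e. 0.008; a real comparison would replace the two functions with calls into detailed and reduced reactor simulations.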

  6. Offspring reaction norms shaped by parental environment: interaction between within- and trans-generational plasticity of inducible defenses.

    Science.gov (United States)

    Luquet, Emilien; Tariel, Juliette

    2016-10-12

    Within-generational plasticity (WGP) and transgenerational plasticity (TGP) are mechanisms allowing rapid adaptive responses to fluctuating environments without genetic change. These forms of plasticity have often been viewed as independent processes. Recent evidence suggests that WGP is altered by the environmental conditions experienced by previous generations (i.e., TGP). In the context of inducible defenses, one of the most studied cases of plasticity, the WGP × TGP interaction has been poorly investigated. We provide evidence that TGP can alter the reaction norms of inducible defenses in a freshwater snail. The WGP × TGP interaction patterns are trait-specific and lead to a decreased slope of the reaction norms for behaviour and shell thickness. Offspring of induced parents showed greater predator-avoidance behaviour and a thicker shell than snails from non-induced parents in the no-predator-cue environment, while they reached similar defenses in the predator-cue environment. The WGP × TGP interaction further led to a switch from plastic towards constitutive expression of defenses for shell dimensions (a flat reaction norm). Alteration of WGP by TGP may shape adaptive responses to environmental change and is therefore of substantial importance for understanding the evolution of plasticity.

  7. Mathematical Language Development and Talk Types in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Symons, Duncan; Pierce, Robyn

    2015-01-01

    In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…

  8. ReaDDy--a software for particle-based reaction-diffusion dynamics in crowded cellular environments.

    Directory of Open Access Journals (Sweden)

    Johannes Schöneberg

    We introduce the software package ReaDDy for simulation of detailed spatiotemporal mechanisms of dynamical processes in the cell, based on reaction-diffusion dynamics with particle resolution. In contrast to other particle-based reaction kinetics programs, ReaDDy supports particle interaction potentials. This permits effects such as space exclusion, molecular crowding and aggregation to be modeled. The biomolecules simulated can be represented as a sphere, or as a more complex geometry such as a domain structure or polymer chain. ReaDDy bridges the gap between small-scale but highly detailed molecular dynamics or Brownian dynamics simulations and large-scale but little-detailed reaction kinetics simulations. ReaDDy has a modular design that enables the exchange of the computing core by efficient platform-specific implementations or dynamical models that are different from Brownian dynamics.
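The underlying dynamics ReaDDy-style tools build on, overdamped (Brownian) particle motion with pairwise interaction potentials that produce space exclusion, can be sketched in a few lines. This is not ReaDDy's API; the soft-repulsion potential and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, dt, kBT = 20, 1.0, 1e-3, 1.0   # particles, diffusion coeff., time step, thermal energy
radius, k_rep = 0.5, 100.0           # particle radius and soft-repulsion stiffness (assumed)
pos = rng.uniform(0.0, 10.0, size=(N, 2))

def forces(pos):
    """Pairwise harmonic repulsion when particles overlap (crowding / space exclusion)."""
    F = np.zeros_like(pos)
    for i in range(N):
        d = pos - pos[i]                       # vectors from particle i to all others
        r = np.linalg.norm(d, axis=1)
        for j in np.where((r < 2 * radius) & (r > 0))[0]:
            F[j] += k_rep * (2 * radius - r[j]) * d[j] / r[j]   # push j away from i
    return F

# Euler-Maruyama integration of overdamped dynamics: deterministic drift + diffusion.
for _ in range(100):
    pos += (D / kBT) * forces(pos) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(pos.shape)
```

A full reaction-diffusion simulation would add reaction events (creation, removal, binding of particles) on top of this propagation step.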

  9. Ubiquitous computing in shared-care environments.

    Science.gov (United States)

    Koch, S

    2006-07-01

    In light of future challenges, such as growing numbers of elderly, increase in chronic diseases, insufficient health care budgets and problems with staff recruitment for the health-care sector, information and communication technology (ICT) becomes a possible means to meet these challenges. Organizational changes such as the decentralization of the health-care system lead to a shift from in-hospital to both advanced and basic home health care. Advanced medical technologies provide solutions for distant home care in the form of specialist consultations and home monitoring. Furthermore, the shift towards home health care will increase mobile work and the establishment of shared care teams which require ICT-based solutions that support ubiquitous information access and cooperative work. Clinical documentation and decision support systems are the main ICT-based solutions of interest in the context of ubiquitous computing for shared care environments. This paper therefore describes the prerequisites for clinical documentation and decision support at the point of care, the impact of mobility on the documentation process, and how the introduction of ICT-based solutions will influence organizations and people. Furthermore, the role of dentistry in shared-care environments is discussed and illustrated in the form of a future scenario.

  10. The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system

    Science.gov (United States)

    Zerkin, V. V.; Pritychenko, B.

    2018-04-01

    The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for data set uploads, renormalization, covariance matrices, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and the Russian Federation.

  11. Learning styles: individualizing computer-based learning environments

    Directory of Open Access Journals (Sweden)

    Tim Musson

    1995-12-01

    While the need to adapt teaching to the needs of a student is generally acknowledged (see Corno and Snow, 1986, for a wide review of the literature), little is known about the impact of individual learner differences on the quality of learning attained within computer-based learning environments (CBLEs). What evidence there is appears to support the notion that individual differences have implications for the degree of success or failure experienced by students (Ford and Ford, 1992) and by trainee end-users of software packages (Bostrom et al., 1990). The problem is to identify the way in which specific individual characteristics of a student interact with particular features of a CBLE, and how the interaction affects the quality of the resultant learning. Teaching in a CBLE is likely to require a subset of teaching strategies different from that subset appropriate to more traditional environments, and the use of a machine may elicit different behaviours from those normally arising in a classroom context.

  12. Growth and Destruction of PAH Molecules in Reactions with Carbon Atoms

    Energy Technology Data Exchange (ETDEWEB)

    Krasnokutski, Serge A.; Huisken, Friedrich; Jäger, Cornelia; Henning, Thomas [Laboratory Astrophysics Group of the Max Planck Institute for Astronomy at the Friedrich Schiller University Jena, Helmholtzweg 3, D-07743 Jena (Germany)

    2017-02-10

    A very high abundance of atomic carbon in the interstellar medium (ISM), and the high reactivity of these species toward different hydrocarbon molecules including benzene, raise questions regarding the stability of polycyclic aromatic hydrocarbon (PAH) molecules in space. To test the efficiency of destruction of PAH molecules via reactions with atomic carbon, we performed a set of laboratory and computational studies of the reactions of naphthalene, anthracene, and coronene molecules with carbon atoms in the ground state. The reactions were investigated in liquid helium droplets at T = 0.37 K and by quantum chemical computations. Our studies suggest that all small and all large catacondensed PAHs react barrierlessly with atomic carbon, and therefore should be efficiently destroyed by such reactions in a broad temperature range. At the same time, large compact pericondensed PAHs should be more inert toward such a reaction. In addition, taking into account their higher photostability, much higher abundances of pericondensed PAHs should be expected in various astrophysical environments. The barrierless reactions between carbon atoms and small PAHs also suggest that, in the ISM, these reactions could lead to the bottom-up formation of PAH molecules.

  13. Dynamic Scaffolding of Socially Regulated Learning in a Computer-Based Learning Environment

    Science.gov (United States)

    Molenaar, Inge; Roda, Claudia; van Boxtel, Carla; Sleegers, Peter

    2012-01-01

    The aim of this study is to test the effects of dynamically scaffolding social regulation of middle school students working in a computer-based learning environment. Dyads in the scaffolding condition (N=56) are supported with computer-generated scaffolds and students in the control condition (N=54) do not receive scaffolds. The scaffolds are…

  14. Computational and Experimental Study of Thermodynamics of the Reaction of Titania and Water at High Temperatures.

    Science.gov (United States)

    Nguyen, Q N; Bauschlicher, C W; Myers, D L; Jacobson, N S; Opila, E J

    2017-12-14

    Gaseous titanium hydroxide and oxyhydroxide species were studied with quantum chemical methods. The results are used in conjunction with an experimental transpiration study of titanium dioxide (TiO2) in water-vapor-containing environments at elevated temperatures to provide a thermodynamic description of the Ti(OH)4(g) and TiO(OH)2(g) species. The geometry and harmonic vibrational frequencies of these species were computed using the coupled-cluster singles and doubles method with a perturbative correction for connected triple substitutions [CCSD(T)]. For the OH bending and rotation, the B3LYP density functional theory was used to compute corrections to the harmonic approximations. These results were combined to determine the enthalpy of formation. Experimentally, the transpiration method was used with water contents from 0 to 76 mol % in oxygen or argon carrier gases for 20-250 h exposure times at 1473-1673 K. Results indicate that oxygen is not a key contributor to volatilization, and the primary reaction for volatilization in this temperature range is TiO2(s) + H2O(g) = TiO(OH)2(g). Data were analyzed with both the second and third law methods using the thermal functions derived from the theoretical calculations. The third law enthalpy of formation at 298.15 K for TiO(OH)2(g) was -838.9 ± 6.5 kJ/mol, which compares favorably to the theoretical value of -838.7 ± 25 kJ/mol. We recommend the experimentally derived third law enthalpy of formation at 298.15 K for TiO(OH)2(g), the computed entropy of 320.67 J/(mol·K), and the computed heat capacity [149.192 + (-0.02539)T + (8.28697 × 10⁻⁶)T² + (-15614.05)/T + (-5.2182 × 10⁻¹¹)/T²] J/(mol·K), where T is the temperature in K.
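The recommended heat-capacity polynomial can be evaluated directly; a quick sketch (the function name is ours, the coefficients are those quoted in the abstract):

```python
def cp_TiOOH2(T):
    """Heat capacity of TiO(OH)2(g) in J/(mol K), from the fitted polynomial in the abstract."""
    return (149.192 - 0.02539 * T + 8.28697e-6 * T**2
            - 15614.05 / T - 5.2182e-11 / T**2)

# Evaluate at the midpoint of the 1473-1673 K experimental window:
cp_mid = cp_TiOOH2(1573.0)   # roughly 120 J/(mol K)
```

Evaluating at the ends of the experimental range gives a sense of how weakly the heat capacity varies over the window where the transpiration data were taken.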

  15. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. A cloud computing environment involves high-cost infrastructure on the one hand and requires large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner, so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selecti...

  16. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Science.gov (United States)

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective models to predict ADRs. In the current study, we compare and analyze the performance of existing computational methods for predicting ADRs, implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to linear form; based on this finding we propose a new algorithm, the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
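A similarity-weighted profile score built on the Jaccard coefficient, of the general kind the abstract describes, can be sketched as follows. The toy drug-ADR data and the exact weighting are illustrative assumptions, not the paper's formula:

```python
def jaccard(a, b):
    """Jaccard coefficient between two sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical known drug -> ADR association profiles:
known = {
    "drugA": {"nausea", "rash"},
    "drugB": {"nausea", "headache"},
    "drugC": {"rash", "dizziness"},
}

def predict_score(query_profile, adr):
    """Weighted-profile score: Jaccard-weighted vote of known drugs that report the ADR."""
    weights = [(jaccard(query_profile, profile), adr in profile)
               for profile in known.values()]
    total = sum(w for w, _ in weights)
    return sum(w for w, has_adr in weights if has_adr) / total if total else 0.0

# Score a query drug with a partial profile against a candidate ADR:
score = predict_score({"nausea"}, "headache")
```

Here the query drug resembles drugA and drugB equally (Jaccard 0.5 each) and only drugB reports headache, so the score is 0.5; a real model would use the full drug-ADR association matrix in place of the toy dictionary.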

  17. The Needs of Virtual Machines Implementation in Private Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Edy Kristianto

    2015-12-01

    The Internet of Things (IoT) has become a focus of the development of information and communication technology. Cloud computing has a very important role in supporting the IoT, because cloud computing provides services in the form of infrastructure (IaaS), platform (PaaS), and software (SaaS) for its users. One of the fundamental services is Infrastructure as a Service (IaaS). This study analyzed the requirements, based on the NIST framework, for realizing Infrastructure as a Service in the form of virtual machines built in a cloud computing environment.

  18. Computer simulation of the steam-graphite reaction under isothermal and steady-state conditions

    International Nuclear Information System (INIS)

    Joy, D.S.; Stem, S.C.

    1975-05-01

    A mathematical model was formulated to describe the isothermal, steady-state diffusion and reaction of steam in a graphite matrix. A generalized Langmuir-Hinshelwood equation is used to represent the steam-graphite reaction rate. The model also includes diffusion in the gas phase adjacent to the graphite matrix. A computer program, written to numerically integrate the resulting differential equations, is described. The coupled nonlinear differential equations in the graphite phase are solved using the IBM Continuous System Modeling Program. Classical finite difference techniques are used for the gas-phase calculations. An iterative procedure is required to couple the two sets of calculations. Several sample problems are presented to demonstrate the utility of the model. (U.S.)
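The model's ingredients, a Langmuir-Hinshelwood rate law coupled to steady-state diffusion and solved by finite differences, can be sketched in one dimension. All rate constants, the geometry, and the Jacobi relaxation scheme below are illustrative, not the report's:

```python
import numpy as np

# Illustrative Langmuir-Hinshelwood rate for local steam concentration p (constants assumed):
k1, k2, k3 = 1.0, 0.5, 0.2
def rate(p):
    return k1 * p / (1.0 + k2 * p + k3 * p**2)

# Steady 1-D diffusion with consumption inside the graphite matrix, D c'' = rate(c),
# fixed concentration at the surface (x = 0), zero flux at the centre (x = L):
D, L, n = 1.0, 1.0, 51
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
c = np.ones(n)                      # initial guess; c[0] = 1 is the surface boundary value
for _ in range(20000):              # Jacobi relaxation of the finite-difference equations
    c_new = c.copy()
    c_new[1:-1] = 0.5 * (c[2:] + c[:-2] - h**2 * rate(c[1:-1]) / D)
    c_new[-1] = c_new[-2]           # zero-flux (Neumann) condition at x = L
    c_new[0] = 1.0
    if np.max(np.abs(c_new - c)) < 1e-10:
        c = c_new
        break
    c = c_new
```

The converged profile decays monotonically from the surface into the matrix, the qualitative behaviour the coupled gas-phase/graphite-phase model in the report resolves in more detail.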

  19. Multi-VO support in IHEP's distributed computing environment

    International Nuclear Information System (INIS)

    Yan, T; Suo, B; Zhao, X H; Zhang, X M; Ma, Z T; Yan, X F; Lin, T; Deng, Z Y; Li, W D; Belov, S; Pelevanyuk, I; Zhemchugov, A; Cai, H

    2015-01-01

    Inspired by the success of BESDIRAC, the DIRAC-based distributed computing environment for the BESIII experiment, several other experiments operated by the Institute of High Energy Physics (IHEP), such as the Circular Electron Positron Collider (CEPC), the Jiangmen Underground Neutrino Observatory (JUNO), the Large High Altitude Air Shower Observatory (LHAASO) and the Hard X-ray Modulation Telescope (HXMT), are willing to use DIRAC to integrate the geographically distributed computing resources made available by their collaborations. To minimize manpower and hardware cost, we extended the BESDIRAC platform to support a multi-VO scenario, instead of setting up a self-contained distributed computing environment for each VO. This makes DIRAC a service for the community of those experiments. To support multiple VOs, the system architecture of BESDIRAC was adjusted for scalability. The VOMS and DIRAC servers were reconfigured to manage users and groups belonging to several VOs. A lightweight storage resource manager, StoRM, is employed as the central SE to integrate local and grid data. A frontend system was designed for users' bulk job splitting, submission and management, with plugins to support new VOs. A monitoring and accounting system is also considered, to ease system administration and the accounting of VO-related resource usage. (paper)

  20. Blockchain-based database to ensure data integrity in cloud computing environments

    OpenAIRE

    Gaetani, Edoardo; Aniello, Leonardo; Baldoni, Roberto; Lombardi, Federico; Margheri, Andrea; Sassone, Vladimiro

    2017-01-01

    Data is nowadays an invaluable resource; indeed, it guides all business decisions in most computer-aided human activities. Threats to data integrity are thus of paramount relevance, as tampering with data may maliciously affect crucial business decisions. This issue is especially acute in cloud computing environments, where data owners cannot control fundamental aspects of their data, such as its physical storage and the control of access to it. Blockchain has recently emerged as a fascinati...

  1. NOSTOS: a paper-based ubiquitous computing healthcare environment to support data capture and collaboration.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Eriksson, Henrik

    2003-01-01

    In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.

  2. Computer investigations on the asymptotic behavior of the rate coefficient for the annihilation reaction A + A → product and the trapping reaction in three dimensions.

    Science.gov (United States)

    Litniewski, Marek; Gorecki, Jerzy

    2011-06-28

    We have performed intensive computer simulations of the irreversible annihilation reaction A + A → C + C and of the trapping reaction A + B → C + B for a variety of three-dimensional fluids composed of identical spherical particles. We have found a significant difference in the asymptotic behavior of the rate coefficients for these reactions. Both rate coefficients converge to the same value as time t goes to infinity, but the convergence rate is different: the O(t^(-1/2)) term for the annihilation reaction is higher than the corresponding term for the trapping reaction. The simulation results suggest that the ratio of these terms is a universal quantity with a value equal to 2 or slightly above. A model for the annihilation reaction based on the superposition approximation predicts the difference in the O(t^(-1/2)) terms, but overestimates the value for the annihilation reaction by about 30%. We have also performed simulations for the dimerization process A + A → E, where E stands for a dimer. Dimerization decreases the reaction rate due to the decrease in the diffusion constant for A. The effect is successfully predicted by a simple model.
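    The O(t^(-1/2)) approach to the long-time rate coefficient can be made concrete with the classical Smoluchowski result for a diffusion-controlled reaction, k(t) = 4*pi*D*R*(1 + R/sqrt(pi*D*t)). This textbook formula is shown only to illustrate the asymptotics; the parameter values are arbitrary.

```python
import math

def k_smoluchowski(t, D=1.0, R=1.0):
    # time-dependent Smoluchowski rate coefficient for a diffusion-limited reaction
    k_inf = 4.0 * math.pi * D * R
    return k_inf * (1.0 + R / math.sqrt(math.pi * D * t))

k_inf = 4.0 * math.pi                       # long-time limit for D = R = 1
for t in (1.0, 100.0, 10000.0):
    excess = k_smoluchowski(t) / k_inf - 1.0
    print(t, excess)                        # excess decays as t**-0.5
```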

  3. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer-based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tool environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  4. Computing the Free Energy along a Reaction Coordinate Using Rigid Body Dynamics.

    Science.gov (United States)

    Tao, Peng; Sodt, Alexander J; Shao, Yihan; König, Gerhard; Brooks, Bernard R

    2014-10-14

    Calculations of the potential of mean force along complex chemical reaction or rare-event pathways are of great interest because of their importance in many areas of chemistry, molecular biology, and materials science. The major difficulty in free energy calculations comes from the great computational cost of adequately sampling the system in high-energy regions, especially close to the reaction transition state. Here, we present a method, called FEG-RBD, in which the free energy gradients are obtained from rigid-body dynamics simulations and then integrated along a reference reaction pathway to calculate free energy profiles. In a given system, a subset of atoms defined by the reaction coordinates (e.g., a solute, or the quantum mechanics (QM) region of a quantum mechanics/molecular mechanics simulation) is selected to form a rigid body during the simulation. The first-order derivatives (gradients) of the free energy with respect to the reaction coordinates are obtained through the integration of constraint forces within the rigid body. Each structure along the reference reaction path is separately subjected to such a rigid-body simulation, and the individual free energy gradients are integrated along the reference pathway to obtain the free energy profile. The test cases provided demonstrate both the strengths and weaknesses of the FEG-RBD method. The most significant benefit of this method comes from the fast convergence rate of the free energy gradient when rigid-body constraints are used instead of restraints. A correction to the free energy due to approximate relaxation of the rigid-body constraint is estimated and discussed. A comparison with umbrella sampling using a simple test case revealed the improved sampling efficiency of FEG-RBD by a factor of 4 on average. The enhanced efficiency makes this method effective for calculating the free energy of complex chemical reactions when the reaction coordinate can be unambiguously defined by a
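    The central step of the method, integrating free energy gradients along a reference pathway, can be sketched as follows. The gradient profile here is a toy analytic function, not FEG-RBD output.

```python
import numpy as np

xi = np.linspace(0.0, 1.0, 11)     # reaction coordinate samples (toy)
grad = 4.0 * xi - 2.0              # dA/dxi for an analytic test profile

# A(xi) = cumulative trapezoid-rule integral of dA/dxi, with A(0) = 0
steps = 0.5 * (grad[1:] + grad[:-1]) * np.diff(xi)
A = np.concatenate(([0.0], np.cumsum(steps)))
print(A)    # minimum of -0.5 at xi = 0.5; returns to 0 at xi = 1
```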

  5. Pulsed fusion space propulsion: Computational Magneto-Hydro Dynamics of a multi-coil parabolic reaction chamber

    Science.gov (United States)

    Romanelli, Gherardo; Mignone, Andrea; Cervone, Angelo

    2017-10-01

    Pulsed fusion propulsion might finally revolutionise manned space exploration by providing affordable and relatively fast access to interplanetary destinations. However, such systems are still in an early development phase, and one of the key areas requiring further investigation is the operation of the magnetic nozzle, the device meant to exploit the fusion energy and generate thrust. One of the latest pulsed fusion magnetic nozzle designs is the so-called multi-coil parabolic reaction chamber: the reaction is ignited at the focus of an open parabolic chamber, enclosed by a series of coaxial superconducting coils that apply a magnetic field. The field, besides confining the reaction and preventing any contact between the hot fusion plasma and the chamber structure, is also meant to reflect the explosion and push the plasma out of the rocket. Reflection is attained thanks to electric currents induced in conductive skin layers that cover each of the coils; the change in the plasma's axial momentum generates thrust in reaction. This working principle has yet to be extensively verified, and computational Magneto-Hydro Dynamics (MHD) is a viable option to achieve that. This work is one of the first detailed ideal-MHD analyses of a multi-coil parabolic reaction chamber of this kind and has been completed employing PLUTO, a freely distributed computational code developed at the Physics Department of the University of Turin. The results are thus a preliminary verification of the chamber's performance. Nonetheless, plasma leakage through the chamber structure has been highlighted, so further investigations are required to validate the chamber design. Implementing a more accurate physical model (e.g. Hall-MHD or relativistic-MHD) is thus mandatory, and PLUTO has the capabilities to achieve that.

  6. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
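    The trace distance used above as the error measure can be computed directly from its definition, T(rho, sigma) = (1/2) Tr|rho - sigma|; a minimal sketch with two textbook 2x2 density matrices:

```python
import numpy as np

def trace_distance(rho, sigma):
    # T(rho, sigma) = (1/2) * sum of singular values of (rho - sigma)
    return 0.5 * np.linalg.svd(rho - sigma, compute_uv=False).sum()

rho = np.array([[1.0, 0.0], [0.0, 0.0]])      # pure state |0><0|
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed state
print(trace_distance(rho, sigma))             # 0.5 for this pair
```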

  7. Service ORiented Computing EnviRonment (SORCER) for Deterministic Global and Stochastic Optimization

    OpenAIRE

    Raghunath, Chaitra

    2015-01-01

    With rapid growth in the complexity of large scale engineering systems, the application of multidisciplinary analysis and design optimization (MDO) in the engineering design process has garnered much attention. MDO addresses the challenge of integrating several different disciplines into the design process. Primary challenges of MDO include computational expense and poor scalability. The introduction of a distributed, collaborative computational environment results in better...

  8. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  9. Computer Graphics Orientation and Training in a Corporate/Production Environment.

    Science.gov (United States)

    McDevitt, Marsha Jean

    This master's thesis provides an overview of a computer graphics production environment and proposes a realistic approach to orientation and on-going training for employees working within a fast-paced production schedule. Problems involved in meeting the training needs of employees are briefly discussed in the first chapter, while the second…

  10. Enhanced Survey and Proposal to secure the data in Cloud Computing Environment

    OpenAIRE

    MR.S.SUBBIAH; DR.S.SELVA MUTHUKUMARAN; DR.T.RAMKUMAR

    2013-01-01

    Cloud computing has the power to eliminate the cost of setting up high-end computing infrastructure. It is a promising paradigm that offers a very flexible architecture, accessible through the internet. In the cloud computing environment, data may reside at any of the data centers. Because of that, a data center may leak the data stored on it, beyond the reach and control of the users. For this kind of misbehaving data center, the service providers should take care of the security and...

  11. Deception Detection in a Computer-Mediated Environment: Gender, Trust, and Training Issues

    National Research Council Canada - National Science Library

    Dziubinski, Monica

    2003-01-01

    .... This research draws on communication and deception literature to develop a conceptual model proposing relationships between deception detection abilities in a computer-mediated environment, gender, trust, and training...

  12. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J. [ORNL]; Endeve, Eirik [ORNL]; Ovchinnikov, Oleg S. [ORNL]; Borreguero Calvo, Jose M. [ORNL]; Park, Byung H. [ORNL]; Archibald, Richard K. [ORNL]; Symons, Christopher T. [ORNL]; Kalinin, Sergei V. [ORNL]; Messer, Bronson [ORNL]; Shankar, Mallikarjun [ORNL]; Jesse, Stephen [ORNL]

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges, best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the material science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  13. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    Directory of Open Access Journals (Sweden)

    Qifan Kuang

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of these models, limitations remain in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to reach general conclusions that can guide the construction of more effective models for predicting ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods for predicting ADRs, by implementing and evaluating additional algorithms that had earlier been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent, and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that their final formulas could all be converted to linear models; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.

  14. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  15. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that
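    One of the case studies named above, non-negative least squares inversion, can be sketched with SciPy's generic solver; this stands in for, and is unrelated to, the heterogeneous implementation discussed in the text.

```python
import numpy as np
from scipy.optimize import nnls

# synthetic forward model: data b generated from known non-negative parameters
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])
x_true = np.array([2.0, 0.5])
b = A @ x_true

x, residual = nnls(A, b)        # solve min ||Ax - b|| subject to x >= 0
print(x, residual)              # recovers x_true with near-zero residual
```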

  16. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  17. Analysis of the potential geochemical reactions in the Enceladus' hydrothermal environment

    Science.gov (United States)

    Ramirez-Cabañas, A. K.; Flandes, A.

    2017-12-01

    Enceladus is the sixth largest moon of Saturn and differs from its other moons because of the cryovolcanic geysers that emanate from its south pole. The instruments of the Cassini spacecraft revealed different compounds in the gases and dust of the geysers, such as salts (sodium chloride, sodium bicarbonate and/or sodium carbonate), as well as traces of silica (Postberg et al., 2008, 2009), which could be the result of a hydrothermal environment (Hsu et al., 2014; Sekine et al., 2014). By means of a thermodynamic analysis, we propose and evaluate potential geochemical reactions that could result from the interaction between the surface of the nucleus and the inner ocean of Enceladus. These reactions may well lead to the origin of the compounds found in the geysers. From this analysis, we propose that at least two minerals must be present in the chondritic nucleus of Enceladus: olivines (fayalite and forsterite) and feldspars (orthoclase and albite). Subsequently, taking as reference the hydrothermal processes that take place on Earth, we propose the different stages of a potential hydrothermal scenario for Enceladus.

  18. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    Science.gov (United States)

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  19. Method and system for rendering and interacting with an adaptable computing environment

    Science.gov (United States)

    Osbourn, Gordon Cecil [Albuquerque, NM; Bouchard, Ann Marie [Albuquerque, NM

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.
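    A minimal sketch of the kind of hierarchy described above: nested nodes bound into layered rendering regions, with a screen element attached to the first layer. All names are hypothetical; this is not the patented s-machine implementation.

```python
class Node:
    """Toy stand-in for an s-machine that can bind children and render."""
    def __init__(self, name):
        self.name = name
        self.children = []

    def bind(self, child):
        # bind a child node into this node's hierarchy and return it
        self.children.append(child)
        return child

    def render(self, depth=0):
        # depth-first rendering of the hierarchy as indented lines
        lines = ["  " * depth + self.name]
        for child in self.children:
            lines.extend(child.render(depth + 1))
        return lines

region = Node("rendering-region")
layer1 = region.bind(Node("layer-1"))
region.bind(Node("layer-2 (overlays part of layer-1)"))
layer1.bind(Node("screen-element: manages its own data"))
print("\n".join(region.render()))
```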

  20. Material interactions with the Low Earth Orbital (LEO) environment: Accurate reaction rate measurements

    Science.gov (United States)

    Visentine, James T.; Leger, Lubert J.

    1987-01-01

    To resolve uncertainties in estimated LEO atomic oxygen fluence and provide reaction product composition data for comparison to data obtained in ground-based simulation laboratories, a flight experiment has been proposed for the space shuttle which utilizes an ion-neutral mass spectrometer to obtain in-situ ambient density measurements and identify reaction products from modeled polymers exposed to the atomic oxygen environment. An overview of this experiment is presented and the methodology of calibrating the flight mass spectrometer in a neutral beam facility prior to its use on the space shuttle is established. The experiment, designated EOIM-3 (Evaluation of Oxygen Interactions with Materials, third series), will provide a reliable materials interaction data base for future spacecraft design and will furnish insight into the basic chemical mechanisms leading to atomic oxygen interactions with surfaces.

  1. Study on User Authority Management for Safe Data Protection in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Su-Hyun Kim

    2015-03-01

    In cloud computing environments, user data are encrypted using numerous distributed servers before being stored. Global Internet service companies, such as Google and Yahoo, recognized the importance of Internet service platforms and conducted their own research and development to create and utilize large cluster-based cloud computing platform technology based on low-priced commercial nodes. As diverse data services become possible in distributed computing environments, high-capacity distributed management is emerging as a major issue. Meanwhile, because of the diverse ways in which high-capacity data are used, security vulnerabilities and privacy invasion by malicious attackers or internal users can occur. As such, when various sensitive data are stored in cloud servers and used from there, data leakage might occur because of external attackers or the poor management of internal users. Data can be managed through encryption to prevent such problems. However, existing simple encryption methods involve problems associated with the management of access to data stored in cloud environments. Therefore, in the present paper, a technique for data access management by user authority, based on Attribute-Based Encryption (ABE) and secret distribution techniques, is proposed.
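    As a concrete example of a secret distribution technique of the kind mentioned above (not the paper's ABE construction), here is a minimal Shamir secret-sharing sketch over a prime field:

```python
import random

P = 2**127 - 1                         # prime modulus (a Mersenne prime)

def split(secret, n, k):
    # random polynomial of degree k-1 with constant term = secret
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 (needs Python 3.8+ for pow(d, -1, P))
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, n=5, k=3)
print(reconstruct(shares[:3]))         # any 3 of the 5 shares suffice
```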

  2. Ab initio computational study of reaction mechanism of peptide bond formation on HF/6-31G(d,p) level

    Science.gov (United States)

    Siahaan, P.; Lalita, M. N. T.; Cahyono, B.; Laksitorini, M. D.; Hildayani, S. Z.

    2017-02-01

    Peptides play an important role in the modulation of various cell functions, so the peptide bond formation reaction is of broad chemical importance. Computational methods offer one way to probe the mechanism of peptide synthesis. The purpose of this research is to determine the reaction mechanism of peptide bond formation in the synthesis of Ac-PV-NH2 and Ac-VP-NH2 from the amino acids proline and valine by an ab initio computational approach. The calculations were carried out at the HF/6-31G(d,p) level of theory for the four mechanisms (paths 1 to 4) proposed in this research. The results show that the rate-determining barriers between reactant and transition state (TS) for paths 1, 2, 3, and 4 are 163.06 kJ/mol, 1868 kJ/mol, 5685 kJ/mol, and 1837 kJ/mol, respectively. The calculation shows that the most favorable route for the synthesis of Ac-PV-NH2 and Ac-VP-NH2 from proline and valine is path 1 (initiated with the removal of H+ from the proline amino acid), which produces Ac-PV-NH2.
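    To put barrier heights like those above in kinetic terms, the Arrhenius-type factor exp(-Ea/RT) converts a barrier difference into a relative rate. The comparison barrier (180 kJ/mol) and temperature used below are assumptions for illustration, not values from the abstract.

```python
import math

R_GAS = 8.314e-3                  # gas constant, kJ/(mol*K)
T = 298.15                        # temperature in K (assumed)

def boltzmann_factor(barrier_kj_mol):
    # relative Arrhenius-type weight exp(-Ea / RT)
    return math.exp(-barrier_kj_mol / (R_GAS * T))

# path 1 barrier from the abstract vs. a hypothetical 180 kJ/mol alternative
ratio = boltzmann_factor(163.06) / boltzmann_factor(180.0)
print(ratio)   # the lower barrier is favored by a factor exp(dE/RT)
```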

  3. The Use of Computer Simulation to Compare Student performance in Traditional versus Distance Learning Environments

    Directory of Open Access Journals (Sweden)

    Retta Guy

    2015-06-01

    Simulations have been shown to be an effective tool in traditional learning environments; however, as distance learning grows in popularity, the need to examine simulation effectiveness in this environment has become paramount. A causal-comparative design was chosen for this study to determine whether students using a computer-based instructional simulation in hybrid and fully online environments learned better than traditional classroom learners. The study spans a period of six years, from fall 2008 through spring 2014. The population studied was 281 undergraduate business students self-enrolled in a 200-level microcomputer application course. The overall results support previous studies in that computer simulations are most effective when used as a supplement to face-to-face lectures and in hybrid environments.

  4. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing. PMID:28467505

  5. Performance comparison of heuristic algorithms for task scheduling in IaaS cloud computing environment.

    Science.gov (United States)

    Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda

    2017-01-01

    Cloud computing infrastructure is suitable for meeting computational needs of large task sizes. Optimal scheduling of tasks in cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
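    One of the heuristics listed above, Min-min, can be sketched directly from its definition: repeatedly assign the task whose best achievable completion time is smallest. The ETC (expected time to compute) matrix below is hypothetical.

```python
def min_min(etc, n_machines):
    """Min-min heuristic. etc[t][m] = execution time of task t on machine m."""
    ready = [0.0] * n_machines            # machine ready times
    unassigned = set(range(len(etc)))
    schedule = {}
    while unassigned:
        # best completion time (and machine) for every unassigned task
        best = {t: min((ready[m] + etc[t][m], m) for m in range(n_machines))
                for t in unassigned}
        # choose the task whose best completion time is the overall minimum
        task = min(best, key=lambda t: best[t][0])
        ctime, machine = best[task]
        schedule[task] = machine
        ready[machine] = ctime
        unassigned.remove(task)
    return schedule, max(ready)           # assignment and makespan

etc = [[3.0, 5.0], [2.0, 4.0], [6.0, 1.0]]   # hypothetical ETC matrix
print(min_min(etc, 2))
```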

  6. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    Science.gov (United States)

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discovery in the form of two empirical laws. The student was interviewed and the interview data were used to map out a possible path of his visual reasoning. Critical…

  7. CH(+) Destruction by Reaction with H: Computing Quantum Rates To Model Different Molecular Regions in the Interstellar Medium.

    Science.gov (United States)

    Bovino, S; Grassi, T; Gianturco, F A

    2015-12-17

    A detailed analysis of an ionic reaction that plays a crucial role in the carbon chemistry of the interstellar medium (ISM) is carried out by computing ab initio reactive cross sections with a quantum method and by further obtaining the corresponding CH(+) destruction rates over a range of temperatures that shows good overall agreement with existing experiments. The differences found between all existing calculations and the very-low-T experiments are discussed and explored via a simple numerical model that links these cross section reductions to collinear approaches where nonadiabatic crossing is expected to dominate. The new rates are further linked to a complex chemical network that models the evolution of the CH(+) abundance in the photodissociation region (PDR) and molecular cloud (MC) environments of the ISM. The abundances of CH(+) are given by numerical solutions of a large set of coupled, first-order kinetics equations that employs our new chemical package krome. The analysis that we carry out reveals that the important region for CH(+) destruction is that above 100 K, hence showing that, at least for this reaction, the differences with the existing laboratory low-T experiments are of essentially no importance within the astrochemical environments discussed here because, at those temperatures, other chemical processes involving the title molecule are taking over. A detailed analysis of the chemical network involving CH(+) also shows that a slight decrease in the initial oxygen abundance might lead to higher CH(+) abundances because the main chemical carbon ion destruction channel is reduced in efficiency. This might provide an alternative chemical route to understand the reason why general astrochemical models fail when the observed CH(+) abundances are matched with the outcomes of their calculations.

  8. Method for Selection of Solvents for Promotion of Organic Reactions

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Jiménez-González, Concepción; Constable, David J.C.

    2005-01-01

    A method to select appropriate green solvents for the promotion of a class of organic reactions has been developed. The method combines knowledge from industrial practice and physical insights with computer-aided property estimation tools for selection/design of solvents. In particular, it employs estimates of thermodynamic properties to generate a knowledge base of reaction, solvent and environment related properties that directly or indirectly influence the rate and/or conversion of a given reaction. Solvents are selected using a rules-based procedure where the estimated reaction-solvent properties… The aim is to produce, for a given reaction, a short list of chemicals that could be considered as potential solvents, to evaluate their performance in the reacting system, and, based on this, to rank them according to a scoring system. Several examples of application are given to illustrate the main features and steps…

  9. Quality control of computational fluid dynamics in indoor environments

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Nielsen, P. V.

    2003-01-01

    Computational fluid dynamics (CFD) is used routinely to predict air movement and distributions of temperature and concentrations in indoor environments. Modelling and numerical errors are inherent in such studies and must be considered when the results are presented. Here, we discuss modelling as… the quality of CFD calculations, as well as guidelines for the minimum information that should accompany all CFD-related publications to enable a scientific judgment of the quality of the study.

  10. Tablet computers and eBooks. Unlocking the potential for personal learning environments?

    NARCIS (Netherlands)

    Kalz, Marco

    2012-01-01

    Kalz, M. (2012, 9 May). Tablet computers and eBooks. Unlocking the potential for personal learning environments? Invited presentation during the annual conference of the European Association for Distance Learning (EADL), Noordwijkerhout, The Netherlands.

  11. Nuclides.net: An integrated environment for computations on radionuclides and their radiation

    International Nuclear Information System (INIS)

    Galy, J.; Magill, J.

    2002-01-01

    Full text: The Nuclides.net computational package is of direct interest in the fields of environment monitoring and nuclear forensics. The 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclide chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The main emphasis in Nuclides.net is on nuclear science applications, such as health physics, radioprotection and radiochemistry, rather than nuclear data for which excellent sources already exist. In contrast to the CD-based Nuclides 2000 predecessor, Nuclides.net applications run over the internet on a web server. The user interface to these applications is via a web browser. Information submitted by the user is sent to the appropriate applications resident on the web server. The results of the calculations are returned to the user, again via the browser. The product is aimed at both students and professionals for reference data on radionuclides and computations based on this data using the latest internet technology. It is particularly suitable for educational purposes in the nuclear industry, health physics and radiation protection, nuclear and radiochemistry, nuclear physics, astrophysics, etc. The Nuclides.net software suite contains the following modules/features: a) A new user interface to view the nuclide charts (with zoom features). Additional nuclide charts are based on spin, parity, binding energy etc. b) There are five main applications: (1) 'Decay Engine' for decay calculations of numbers, masses, activities, dose rates, etc. of parent and daughters. (2) 'Dosimetry and Shielding' module allows the calculation of dose rates from both unshielded and shielded point sources. A choice of 10 shield materials is available. (3) 'Virtual Nuclides' allows the user to do decay and dosimetry and shielding calculations on mixtures of
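
    The kind of computation a 'Decay Engine' module performs can be sketched with the generic two-member Bateman solution for a parent-daughter pair (a textbook formula, not Nuclides.net code; the decay constants used below are invented, not nuclide data):

    ```python
    import math

    # Parent/daughter activities from the two-member Bateman solution.
    # Generic sketch; decay constants are illustrative, not real nuclide data.
    def activities(n0_parent, lam_p, lam_d, t):
        """Return (parent, daughter) activities at time t, given the initial
        number of parent atoms and decay constants lam_p, lam_d (1/s)."""
        n_p = n0_parent * math.exp(-lam_p * t)
        n_d = (n0_parent * lam_p / (lam_d - lam_p)
               * (math.exp(-lam_p * t) - math.exp(-lam_d * t)))
        return lam_p * n_p, lam_d * n_d
    ```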

  12. Design and study of parallel computing environment of Monte Carlo simulation for particle therapy planning using a public cloud-computing infrastructure

    International Nuclear Information System (INIS)

    Yokohama, Noriya

    2013-01-01

    This report describes the design of the architecture and a performance study of a parallel computing environment for Monte Carlo simulation in particle therapy planning, using a high-performance computing (HPC) instance within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with a single-thread architecture, combined with improved stability. A study of methods for optimizing the system operations also indicated lower cost. (author)

  13. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

    Full Text Available Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The results of all experimental performance tests demonstrate the applicability of auto-scaling in terms of cloud utilization and response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.

  14. Synthesis of antimicrobial silver nanoparticles through a photomediated reaction in an aqueous environment.

    Science.gov (United States)

    Banasiuk, Rafał; Frackowiak, Joanna E; Krychowiak, Marta; Matuszewska, Marta; Kawiak, Anna; Ziabka, Magdalena; Lendzion-Bielun, Zofia; Narajczyk, Magdalena; Krolicka, Aleksandra

    2016-01-01

    A fast, economical, and reproducible method for nanoparticle synthesis has been developed in our laboratory. The reaction is performed in an aqueous environment and utilizes light emitted by commercially available 1 W light-emitting diodes (λ = 420 nm) as the catalyst. This method does not require nanoparticle seeds or toxic chemicals. The irradiation process is carried out for a period of up to 10 minutes, significantly reducing the time required for synthesis as well as the environmental impact. By modulating various reaction parameters, silver nanoparticles were obtained which were predominantly either spherical or cubic. The produced nanoparticles demonstrated strong antimicrobial activity toward the examined bacterial strains. Additionally, testing the effect of silver nanoparticles on the human keratinocyte cell line and human peripheral blood mononuclear cells revealed that their cytotoxicity may be limited by modulating the employed concentrations of nanoparticles.

  15. Ultrafast and scalable cone-beam CT reconstruction using MapReduce in a cloud computing environment.

    Science.gov (United States)

    Meng, Bowen; Pratx, Guillem; Xing, Lei

    2011-12-01

    Four-dimensional CT (4DCT) and cone beam CT (CBCT) are widely used in radiation therapy for accurate tumor target definition and localization. However, high-resolution and dynamic image reconstruction is computationally demanding because of the large amount of data processed. Efficient use of these imaging techniques in the clinic requires high-performance computing. The purpose of this work is to develop a novel ultrafast, scalable and reliable image reconstruction technique for 4D CBCT/CT using a parallel computing framework called MapReduce. We show the utility of MapReduce for solving large-scale medical physics problems in a cloud computing environment. In this work, we accelerated the Feldkamp-Davis-Kress (FDK) algorithm by porting it to Hadoop, an open-source MapReduce implementation. Gated phases from a 4DCT scan were reconstructed independently. Following the MapReduce formalism, Map functions were used to filter and backproject subsets of projections, and a Reduce function to aggregate those partial backprojections into the whole volume. MapReduce automatically parallelized the reconstruction process on a large cluster of computer nodes. As a validation, reconstruction of a digital phantom and an acquired CatPhan 600 phantom was performed on a commercial cloud computing environment using the proposed 4D CBCT/CT reconstruction algorithm. Speedup of reconstruction time was found to be roughly linear with the number of nodes employed. For instance, greater than 10 times speedup was achieved using 200 nodes for all cases, compared to the same code executed on a single machine. Without modifying the code, faster reconstruction is readily achievable by allocating more nodes in the cloud computing environment. Root mean square error between the images obtained using MapReduce and a single-threaded reference implementation was on the order of 10^(-7). Our study also proved that cloud computing with MapReduce is fault tolerant: the reconstruction completed
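
    The Map/Reduce split described above can be sketched in a few lines (a toy illustration, not the authors' Hadoop code; the "backprojection" is mocked as a simple scaling so that the aggregation pattern stands out):

    ```python
    from functools import reduce

    def map_backproject(projection_subset):
        # stand-in for filtering + backprojecting one subset of projections
        return [p * 0.5 for p in projection_subset]

    def reduce_volumes(vol_a, vol_b):
        # sum two partial backprojections voxel by voxel
        return [a + b for a, b in zip(vol_a, vol_b)]

    subsets = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # invented projection data
    partials = map(map_backproject, subsets)          # parallelizable step
    volume = reduce(reduce_volumes, partials)         # aggregated volume
    ```

    In the actual framework, each Map task runs on a different node and the Reduce step merges their outputs, which is what makes the speedup roughly linear in node count.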

  16. Coupled enzyme reactions performed in heterogeneous reaction media: experiments and modeling for glucose oxidase and horseradish peroxidase in a PEG/citrate aqueous two-phase system.

    Science.gov (United States)

    Aumiller, William M; Davis, Bradley W; Hashemian, Negar; Maranas, Costas; Armaou, Antonios; Keating, Christine D

    2014-03-06

    The intracellular environment in which biological reactions occur is crowded with macromolecules and subdivided into microenvironments that differ in both physical properties and chemical composition. The work described here combines experimental and computational model systems to help understand the consequences of this heterogeneous reaction media on the outcome of coupled enzyme reactions. Our experimental model system for solution heterogeneity is a biphasic polyethylene glycol (PEG)/sodium citrate aqueous mixture that provides coexisting PEG-rich and citrate-rich phases. Reaction kinetics for the coupled enzyme reaction between glucose oxidase (GOX) and horseradish peroxidase (HRP) were measured in the PEG/citrate aqueous two-phase system (ATPS). Enzyme kinetics differed between the two phases, particularly for the HRP. Both enzymes, as well as the substrates glucose and H2O2, partitioned to the citrate-rich phase; however, the Amplex Red substrate necessary to complete the sequential reaction partitioned strongly to the PEG-rich phase. Reactions in ATPS were quantitatively described by a mathematical model that incorporated measured partitioning and kinetic parameters. The model was then extended to new reaction conditions, i.e., higher enzyme concentration. Both experimental and computational results suggest mass transfer across the interface is vital to maintain the observed rate of product formation, which may be a means of metabolic regulation in vivo. Although outcomes for a specific system will depend on the particulars of the enzyme reactions and the microenvironments, this work demonstrates how coupled enzymatic reactions in complex, heterogeneous media can be understood in terms of a mathematical model.
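
    A minimal version of such a coupled-enzyme model (forward-Euler integration of two sequential Michaelis-Menten steps; all rate constants are invented, and the two-phase partitioning and interfacial mass transfer of the paper are omitted) looks like:

    ```python
    # S --enzyme 1--> I --enzyme 2--> P, each step Michaelis-Menten.
    # Illustrative sketch only: constants are invented, and the partitioning
    # and mass-transfer terms of the published model are omitted.
    def simulate(S0, vmax1, km1, vmax2, km2, dt=0.01, steps=1000):
        S, I, P = S0, 0.0, 0.0
        for _ in range(steps):
            r1 = vmax1 * S / (km1 + S)   # rate of the first enzyme
            r2 = vmax2 * I / (km2 + I)   # rate of the coupled, second enzyme
            S += -r1 * dt
            I += (r1 - r2) * dt
            P += r2 * dt
        return S, I, P
    ```

    Total mass S + I + P is conserved by construction, which is a quick sanity check on any such coupled-kinetics integration.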

  17. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0)

    International Nuclear Information System (INIS)

    Wolery, T.J.; Daveler, S.A.

    1992-01-01

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water. It finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms for actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.

  18. Using the CAVE virtual-reality environment as an aid to 3-D electromagnetic field computation

    International Nuclear Information System (INIS)

    Turner, L.R.; Levine, D.; Huang, M.; Papka, M.

    1995-01-01

    One of the major problems in three-dimensional (3-D) field computation is visualizing the resulting 3-D field distributions. A virtual-reality environment, such as the CAVE (CAVE Automatic Virtual Environment), is helping to overcome this problem, thus making the results of computation more usable for designers and users of magnets and other electromagnetic devices. As a demonstration of the capabilities of the CAVE, the elliptical multipole wiggler (EMW), an insertion device being designed for the Advanced Photon Source (APS) now being commissioned at Argonne National Laboratory (ANL), was made visible, along with its fields and beam orbits. Other uses of the CAVE in preprocessing and postprocessing computation for electromagnetic applications are also discussed.

  19. Characteristics of Israeli School Teachers in Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Noga Magen-Nagar

    2013-01-01

    Full Text Available The purpose of this research is to investigate whether there are differences in the level of computer literacy, the amount of implementation of ICT in teaching and learning-assessment processes and the attitudes of teachers from computerized schools in comparison to teachers in non-computerized schools. In addition, the research investigates the characteristics of Israeli school teachers in a 21st century computer-based learning environment. A quantitative research methodology was used. The research sample included 811 elementary school teachers from the Jewish sector of whom 402 teachers were from the computerized school sample and 409 were teachers from the non-computerized school sample. The research findings show that teachers from the computerized school sample are more familiar with ICT, tend to use ICT more and have a more positive attitude towards ICT than teachers in the non-computerized school sample. The main conclusion which can be drawn from this research is that positive attitudes of teachers towards ICT are not sufficient for the integration of technology to occur. Future emphasis on new teaching skills of collective Technological Pedagogical Content Knowledge is necessary to promote the implementation of optimal pedagogy in innovative environments.

  20. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  1. History Matching in Parallel Computational Environments

    Energy Technology Data Exchange (ETDEWEB)

    Steven Bryant; Sanjay Srinivasan; Alvaro Barrera; Sharad Yadav

    2005-10-01

    A novel methodology for delineating multiple reservoir domains for the purpose of history matching in a distributed computing environment has been proposed. A fully probabilistic approach to perturb permeability within the delineated zones is implemented. The combination of robust schemes for identifying reservoir zones and distributed computing significantly increases the accuracy and efficiency of the probabilistic approach. The information pertaining to the permeability variations in the reservoir that is contained in dynamic data is calibrated in terms of a deformation parameter rD. This information is merged with the prior geologic information in order to generate permeability models consistent with the observed dynamic data as well as the prior geology. The relationship between dynamic response data and reservoir attributes may vary in different regions of the reservoir due to spatial variations in reservoir attributes, well configuration, flow constraints, etc. The probabilistic approach then has to account for multiple rD values in different regions of the reservoir. In order to delineate reservoir domains that can be characterized with different rD parameters, principal component analysis (PCA) of the Hessian matrix has been done. The Hessian matrix summarizes the sensitivity of the objective function at a given step of the history matching to model parameters. It also measures the interaction of the parameters in affecting the objective function. The basic premise of PC analysis is to isolate the most sensitive and least correlated regions. The eigenvectors obtained during the PCA are suitably scaled and appropriate grid block volume cut-offs are defined such that the resultant domains are neither too large (which increases interactions between domains) nor too small (implying ineffective history matching). The delineation of domains requires calculation of the Hessian, which could be computationally costly and also restricts the current approach to

  2. How computational methods and relativistic effects influence the study of chemical reactions involving Ru-NO complexes?

    Science.gov (United States)

    Orenha, Renato Pereira; Santiago, Régis Tadeu; Haiduke, Roberto Luiz Andrade; Galembeck, Sérgio Emanuel

    2017-05-05

    Two treatments of relativistic effects, namely effective core potentials (ECP) and all-electron scalar relativistic effects (DKH2), are used to obtain geometries and chemical reaction energies for a series of ruthenium complexes in B3LYP/def2-TZVP calculations. Specifically, the reaction energies of reduction (A-F), isomerization (G-I), and the negative trans influence of Cl(-) relative to NH3 (J-L) are considered. The ECP and DKH2 approaches provided geometric parameters close to experimental data and the same ordering for energy changes of reactions A-L. From geometries optimized with ECP, the electronic energies are also determined by means of the same ECP and basis set combined with the computational methods MP2, M06, BP86 and its derivatives, such as B2PLYP and LC-wPBE, and CCSD(T) (reference method). For reactions A-I, B2PLYP provides the best agreement with CCSD(T) results. Additionally, B3LYP gave the smallest error for the energies of reactions J-L. © 2017 Wiley Periodicals, Inc.

  3. Research on elastic resource management for multi-queue under cloud computing environment

    Science.gov (United States)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, based on dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources significantly increased when compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
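
    The dual-threshold policy described above amounts to a simple decision rule. A hedged sketch (function name and threshold values are illustrative, not the IHEPCloud system's API):

    ```python
    # Expand when queued jobs per node exceed the upper threshold, shrink when
    # they fall below the lower one, and respect a node quota. Illustrative only.
    def scale_decision(queued_jobs, nodes, upper=10.0, lower=2.0, quota=100):
        load = queued_jobs / max(nodes, 1)
        if load > upper and nodes < quota:
            return "expand"
        if load < lower and nodes > 1:
            return "shrink"
        return "hold"
    ```

    The two thresholds leave a dead band between them, which prevents the pool from oscillating between expansion and shrinkage on small load fluctuations.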

  4. A mixed-methods exploration of an environment for learning computer programming

    Directory of Open Access Journals (Sweden)

    Richard Mather

    2015-08-01

    Full Text Available A mixed-methods approach is evaluated for exploring collaborative behaviour, acceptance and progress surrounding an interactive technology for learning computer programming. A review of literature reveals a compelling case for using mixed-methods approaches when evaluating technology-enhanced-learning environments. Here, ethnographic approaches used for the requirements engineering of computing systems are combined with questionnaire-based feedback and skill tests. These are applied to the ‘Ceebot’ animated 3D learning environment. Video analysis with workplace observation allowed detailed inspection of problem solving and tacit behaviours. Questionnaires and knowledge tests provided broad sample coverage with insights into subject understanding and overall response to the learning environment. Although relatively low scores in programming tests seemingly contradicted the perception that Ceebot had enhanced understanding of programming, this perception was nevertheless found to be correlated with greater test performance. Video analysis corroborated findings that the learning environment and Ceebot animations were engaging and encouraged constructive collaborative behaviours. Ethnographic observations clearly captured Ceebot's value in providing visual cues for problem-solving discussions and for progress through sharing discoveries. Notably, performance in tests was most highly correlated with greater programming practice (p≤0.01). It was apparent that although students had appropriated technology for collaborative working and benefitted from visual and tacit cues provided by Ceebot, they had not necessarily deeply learned the lessons intended. The key value of the ‘mixed-methods’ approach was that ethnographic observations captured the authenticity of learning behaviours, and thereby strengthened confidence in the interpretation of questionnaire and test findings.

  5. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.

  6. Minisatellite Attitude Guidance Using Reaction Wheels

    Directory of Open Access Journals (Sweden)

    Ion STROE

    2015-06-01

    Full Text Available In a previous paper [2], the active torques needed for the minisatellite attitude guidance from one fixed attitude posture to another fixed attitude posture were determined using an inverse dynamics method. But when considering reaction/momentum wheels, instead of this active torques computation, the purpose is to compute the angular velocities of the three reaction wheels which ensure that the minisatellite rotates from the initial to the final attitude. This paper presents this computation of the reaction wheel angular velocities using a similar inverse dynamics method based on inverting Euler's equations of motion for a rigid body with one fixed point, written in the framework of the x-y-z sequence of rotations parameterization. For the particular case A = B ≠ C of an axisymmetric minisatellite, the two computations are compared: the active torques computation versus the computation of the reaction wheel angular velocities ωx, ωy and ωz. An interesting observation comes out of this numerical study: if the three reaction wheels are identical (with Iw the moment of inertia of one reaction wheel with respect to its central axis), then the evolutions in time of the products between Iw and the derivatives of the reaction wheel angular velocities, i.e. Iw·ω̇x, Iw·ω̇y and Iw·ω̇z, remain the same and do not depend on the moment of inertia Iw.
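
    The inverse-dynamics step behind both computations is the inversion of Euler's equations for a rigid body, M = I·ω̇ + ω × (I·ω). A minimal sketch in principal axes with moments (A, B, C) (the generic textbook form, not the paper's code):

    ```python
    # Torque required to realize a prescribed body angular velocity w and
    # angular acceleration w_dot, from Euler's equations in principal axes.
    def required_torque(A, B, C, w, w_dot):
        wx, wy, wz = w
        ax, ay, az = w_dot
        return (A * ax + (C - B) * wy * wz,
                B * ay + (A - C) * wz * wx,
                C * az + (B - A) * wx * wy)
    ```

    For the axisymmetric case A = B, the gyroscopic term about the symmetry axis vanishes, so the z-component reduces to C·ω̇z, which is what makes the two computations in the paper directly comparable.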

  7. Proposed Network Intrusion Detection System ‎Based on Fuzzy c Mean Algorithm in Cloud ‎Computing Environment

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Nowadays cloud computing has become an integral part of the IT industry; it provides a working environment that allows users to share data and resources over the internet. Because cloud computing is a virtual grouping of resources offered over the internet, this leads to various issues related to security and privacy in cloud computing. It is therefore very important to create intrusion detection that can detect outsider and insider intruders of cloud computing with a high detection rate and a low false-positive alarm rate in the cloud environment. This work proposes a network intrusion detection module using the fuzzy c-means algorithm. The KDD99 dataset is used for the experiments. The proposed system is characterized by a high detection rate with a low false-positive alarm rate.
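
    The core of fuzzy c-means is the membership update u_i = 1 / Σ_k (d_i/d_k)^(2/(m−1)), where d_i is the distance of a point to center i and m is the fuzzifier. A one-dimensional sketch of that update (illustrative only, not the paper's implementation or its KDD99 feature handling):

    ```python
    # Fuzzy c-means membership of a point x in each cluster, computed from its
    # distances to the cluster centers; m is the fuzzifier. 1-D illustration.
    def memberships(x, centers, m=2.0):
        d = [abs(x - c) for c in centers]
        if any(di == 0.0 for di in d):             # point coincides with a center
            return [1.0 if di == 0.0 else 0.0 for di in d]
        p = 2.0 / (m - 1.0)
        return [1.0 / sum((d[i] / d[k]) ** p for k in range(len(centers)))
                for i in range(len(centers))]
    ```

    The memberships always sum to one, so each network connection record gets a soft assignment across the "normal" and "attack" clusters rather than a hard label.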

  8. Golden rule kinetics of transfer reactions in condensed phase: The microscopic model of electron transfer reactions in disordered solid matrices

    International Nuclear Information System (INIS)

    Basilevsky, M. V.; Mitina, E. A.; Odinokov, A. V.; Titov, S. V.

    2013-01-01

    The algorithm for a theoretical calculation of transfer reaction rates for light quantum particles (i.e., the electron and H-atom transfers) in non-polar solid matrices is formulated and justified. The mechanism postulated involves a local mode (an either intra- or inter-molecular one) serving as a mediator which accomplishes the energy exchange between the reacting high-frequency quantum mode and the phonon modes belonging to the environment. This approach uses as a background the Fermi golden rule beyond the usually applied spin-boson approximation. The dynamical treatment rests on the one-dimensional version of the standard quantum relaxation equation for the reduced density matrix, which describes the frequency fluctuation spectrum for the local mode under consideration. The temperature dependence of a reaction rate is controlled by the dimensionless parameter ξ0 = ℏω0/(kB·T), where ω0 is the frequency of the local mode and T is the temperature. The realization of the computational scheme is different for the high/intermediate (ξ0 ≲ 1) and low (ξ0 ≫ 1) temperature ranges. For the first (quasi-classical) kinetic regime, the Redfield approximation to the solution of the relaxation equation proved to be sufficient and efficient in practical applications. The study of the essentially quantum-mechanical low-temperature kinetic regime in its asymptotic limit requires the implementation of the exact relaxation equation. The coherent mechanism providing a non-vanishing reaction rate has been revealed when T → 0. An accurate computational methodology for the cross-over kinetic regime needs a further elaboration. The original model of the hopping mechanism for electronic conduction in photosensitive organic materials is considered, based on the above techniques. The electron transfer (ET) in active centers of such systems proceeds via local intra- and intermolecular modes. The active modes, as a rule, operate beyond the kinetic regimes, which are usually postulated in the
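
    The regime-controlling parameter ξ0 = ℏω0/(kB·T) is easy to evaluate; a small helper in SI units (an illustration rather than the authors' code):

    ```python
    # xi0 = hbar * omega0 / (kB * T): xi0 >> 1 marks the quantum
    # low-temperature regime, xi0 <~ 1 the quasi-classical one.
    HBAR = 1.054571817e-34   # reduced Planck constant, J*s
    KB = 1.380649e-23        # Boltzmann constant, J/K

    def xi0(omega0, T):
        """omega0: local-mode angular frequency (rad/s); T: temperature (K)."""
        return HBAR * omega0 / (KB * T)
    ```

    Lowering T at fixed ω0 increases ξ0, which is why the computational scheme must switch from the Redfield treatment to the exact relaxation equation as the temperature drops.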

  9. Golden rule kinetics of transfer reactions in condensed phase: the microscopic model of electron transfer reactions in disordered solid matrices.

    Science.gov (United States)

    Basilevsky, M V; Odinokov, A V; Titov, S V; Mitina, E A

    2013-12-21

The algorithm for a theoretical calculation of transfer reaction rates for light quantum particles (i.e., the electron and H-atom transfers) in non-polar solid matrices is formulated and justified. The mechanism postulated involves a local mode (an either intra- or inter-molecular one) serving as a mediator which accomplishes the energy exchange between the reacting high-frequency quantum mode and the phonon modes belonging to the environment. This approach uses as a background the Fermi golden rule beyond the usually applied spin-boson approximation. The dynamical treatment rests on the one-dimensional version of the standard quantum relaxation equation for the reduced density matrix, which describes the frequency fluctuation spectrum for the local mode under consideration. The temperature dependence of a reaction rate is controlled by the dimensionless parameter ξ0 = ℏω0/kBT where ω0 is the frequency of the local mode and T is the temperature. The realization of the computational scheme is different for the high/intermediate (ξ0 ≤ 1) and low (ξ0 ≫ 1) temperature ranges. For the first (quasi-classical) kinetic regime, the Redfield approximation to the solution of the relaxation equation proved to be sufficient and efficient in practical applications. The study of the essentially quantum-mechanical low-temperature kinetic regime in its asymptotic limit requires the implementation of the exact relaxation equation. The coherent mechanism providing a non-vanishing reaction rate has been revealed when T → 0. An accurate computational methodology for the cross-over kinetic regime needs a further elaboration. The original model of the hopping mechanism for electronic conduction in photosensitive organic materials is considered, based on the above techniques. The electron transfer (ET) in active centers of such systems proceeds via local intra- and intermolecular modes. The active modes, as a rule, operate beyond the kinetic regimes, which are usually postulated in the existing theories of the ET. Our alternative dynamic ET model for local
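The regime criterion above reduces to evaluating a single dimensionless number. As a minimal illustration (the 100 cm⁻¹ mode frequency and the numeric regime thresholds are assumptions for this sketch, not values from the paper), ξ0 = ℏω0/kBT can be computed as:

```python
# Sketch: evaluate the dimensionless parameter xi0 = hbar*omega0 / (kB*T)
# used above to distinguish kinetic regimes. The classification thresholds
# are illustrative assumptions, not taken from the paper.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
KB = 1.380649e-23       # Boltzmann constant, J/K

def xi0(omega0_rad_s: float, temperature_k: float) -> float:
    """Dimensionless local-mode parameter hbar*omega0 / (kB*T)."""
    return HBAR * omega0_rad_s / (KB * temperature_k)

def regime(xi: float) -> str:
    """Rough classification of the kinetic regime from xi0."""
    if xi < 1.0:
        return "high/intermediate temperature (quasi-classical, Redfield)"
    if xi > 10.0:
        return "low temperature (quantum, exact relaxation equation)"
    return "cross-over"

# Example: a 100 cm^-1 local mode at room temperature.
C_CM = 2.99792458e10                            # speed of light, cm/s
omega0 = 2 * 3.141592653589793 * C_CM * 100.0   # 100 cm^-1 in rad/s
print(regime(xi0(omega0, 300.0)))
```

At 300 K the thermal energy corresponds to roughly 208 cm⁻¹, so a 100 cm⁻¹ mode falls in the quasi-classical range.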

  10. General method and thermodynamic tables for computation of equilibrium composition and temperature of chemical reactions

    Science.gov (United States)

    Huff, Vearl N; Gordon, Sanford; Morrell, Virginia E

    1951-01-01

    A rapidly convergent successive approximation process is described that simultaneously determines both composition and temperature resulting from a chemical reaction. This method is suitable for use with any set of reactants over the complete range of mixture ratios as long as the products of reaction are ideal gases. An approximate treatment of limited amounts of liquids and solids is also included. This method is particularly suited to problems having a large number of products of reaction and to problems that require determination of such properties as specific heat or velocity of sound of a dissociating mixture. The method presented is applicable to a wide variety of problems that include (1) combustion at constant pressure or volume; and (2) isentropic expansion to an assigned pressure, temperature, or Mach number. Tables of thermodynamic functions needed with this method are included for 42 substances for convenience in numerical computations.
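To make the successive-approximation idea concrete, the sketch below solves one ideal-gas equilibrium composition by Newton iteration and checks it against the closed form. The single dissociation reaction A2 ⇌ 2A, the value of Kp, and the pressure are illustrative assumptions, not the report's general method or tables:

```python
import math

# Sketch: Newton successive approximation for the degree of dissociation
# alpha of A2 <-> 2A at fixed temperature and total pressure P (ideal gas),
# where Kp = 4*alpha^2*P / (1 - alpha^2). Kp and P are illustrative.
def dissociation_alpha(kp: float, p_total: float, tol: float = 1e-12) -> float:
    """Solve Kp*(1 - alpha^2) = 4*alpha^2*P for alpha in (0, 1)."""
    alpha = 0.5  # initial guess
    for _ in range(50):
        f = 4 * alpha**2 * p_total - kp * (1 - alpha**2)
        df = 8 * alpha * p_total + 2 * kp * alpha
        step = f / df
        alpha -= step          # Newton update
        if abs(step) < tol:    # rapid (quadratic) convergence
            break
    return alpha

# Closed form alpha = sqrt(Kp / (Kp + 4P)) for comparison:
kp, p = 1.0, 1.0
assert abs(dissociation_alpha(kp, p) - math.sqrt(kp / (kp + 4 * p))) < 1e-9
print(round(dissociation_alpha(kp, p), 6))
```

The full method handles many species at once by iterating a linearized system in the same spirit, with temperature as an additional unknown.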

  11. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  12. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would improve its robustness, efficiency, flexibility, and manoeuvrability.

  13. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers’ Touch-Interface User Experiences

    OpenAIRE

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users’ shopping behavior. In this research, I examine the underlying mechanisms between input device environments and shoppers’ decision-making processes. In particular, I investigate the impact of input d...

  14. KeyWare: an open wireless distributed computing environment

    Science.gov (United States)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extensions of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  15. Computer Applications and Virtual Environments (CAVE)

    Science.gov (United States)

    1993-01-01

Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama, began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.

  16. Acoustic radiosity for computation of sound fields in diffuse environments

    Science.gov (United States)

    Muehleisen, Ralph T.; Beamer, C. Walter

    2002-05-01

The use of image and ray tracing methods (and variations thereof) for the computation of sound fields in rooms is relatively well developed. In their regime of validity, both methods work well for prediction in rooms with small amounts of diffraction and mostly specular reflection at the walls. While extensions to the method to include diffuse reflections and diffraction have been made, they are limited at best. In the fields of illumination and computer graphics the ray tracing and image methods are joined by another method called luminous radiative transfer or radiosity. In radiosity, an energy balance between surfaces is computed assuming diffuse reflection at the reflective surfaces. Because the interaction between surfaces is constant, much of the computation required for sound field prediction with multiple or moving source and receiver positions can be reduced. In acoustics the radiosity method has received little attention because of the problems of diffraction and specular reflection. The utility of radiosity in acoustics and an approach to a useful development of the method for acoustics will be presented. The method looks especially useful for sound level prediction in industrial and office environments. [Work supported by NSF.]

  17. MOO: Using a Computer Gaming Environment to Teach about Community Arts

    Science.gov (United States)

    Garber, Elizabeth

    2004-01-01

    In this paper, the author discusses the use of an interactive computer technology, "MOO" (Multi-user domain, Object-Oriented), in her art education classes for preservice teachers. A MOO is a text-based environment wherein interactivity is centered on text exchanges made between users based on problems or other materials created by teachers. The…

  18. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  19. Towards the Automatic Detection of Efficient Computing Assets in a Heterogeneous Cloud Environment

    OpenAIRE

    Iglesias, Jesus Omana; Stokes, Nicola; Ventresque, Anthony; Murphy, Liam, B.E.; Thorburn, James

    2013-01-01

In a heterogeneous cloud environment, the manual grading of computing assets is the first step in the process of configuring IT infrastructures to ensure optimal utilization of resources. Grading the efficiency of computing assets is, however, a difficult, subjective and time consuming manual task. Thus, an automatic efficiency grading algorithm is highly desirable. In this paper, we compare the effectiveness of the different criteria used in the manual gr...

  20. Computational study of chain transfer to monomer reactions in high-temperature polymerization of alkyl acrylates.

    Science.gov (United States)

    Moghadam, Nazanin; Liu, Shi; Srinivasan, Sriraj; Grady, Michael C; Soroush, Masoud; Rappe, Andrew M

    2013-03-28

This article presents a computational study of chain transfer to monomer (CTM) reactions in self-initiated high-temperature homopolymerization of alkyl acrylates (methyl, ethyl, and n-butyl acrylate). Several mechanisms of CTM are studied. The effects of the length of live polymer chains and the type of monoradical that initiated the live polymer chains on the energy barriers and rate constants of the involved reaction steps are investigated theoretically. All calculations are carried out using density functional theory. Three types of hybrid functionals (B3LYP, X3LYP, and M06-2X) and four basis sets (6-31G(d), 6-31G(d,p), 6-311G(d), and 6-311G(d,p)) are applied to predict the molecular geometries of the reactants, products, and transition states, and the energy barriers. Transition state theory is used to estimate rate constants. The results indicate that abstraction of a hydrogen atom (by live polymer chains) from the methyl group in methyl acrylate, the methylene group in ethyl acrylate, and methylene groups in n-butyl acrylate are the most likely mechanisms of CTM. Also, the rate constants of CTM reactions calculated using M06-2X are in good agreement with those estimated from polymer sample measurements using macroscopic mechanistic models. The rate constant values do not change significantly with the length of live polymer chains. Abstraction of a hydrogen atom by a tertiary radical has a higher energy barrier than abstraction by a secondary radical, which agrees with experimental findings. The calculated and experimental NMR spectra of dead polymer chains produced by CTM reactions are comparable. This theoretical/computational study reveals that CTM occurs most likely via hydrogen abstraction by live polymer chains from the methyl group of methyl acrylate and methylene group(s) of ethyl (n-butyl) acrylate.
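Transition state theory turns each computed barrier into a rate constant via the Eyring relation k(T) = (kBT/h)·exp(−ΔG‡/RT). The sketch below applies it with an assumed 100 kJ/mol barrier at a representative high-temperature polymerization condition; the numbers are illustrative, not the paper's computed values:

```python
import math

# Sketch of the transition-state-theory (Eyring) estimate used above to
# turn a computed free-energy barrier into a rate constant. The barrier
# and temperature are illustrative assumptions, not the paper's results.
KB = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34  # Planck constant, J*s
R = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(delta_g_kj_mol: float, temperature_k: float) -> float:
    """First-order TST rate constant (1/s) from a free-energy barrier."""
    return (KB * temperature_k / H) * math.exp(
        -delta_g_kj_mol * 1e3 / (R * temperature_k)
    )

# An assumed 100 kJ/mol barrier at 413 K (140 C) gives a rate near 2 1/s.
print(f"{eyring_rate(100.0, 413.0):.3e}")
```

The strong exponential dependence on the barrier is why the choice of functional (e.g., M06-2X vs. B3LYP) matters so much for the predicted rates.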

  1. SWAAM-LT: The long-term, sodium/water reaction analysis method computer code

    International Nuclear Information System (INIS)

    Shin, Y.W.; Chung, H.H.; Wiedermann, A.H.; Tanabe, H.

    1993-01-01

    The SWAAM-LT Code, developed for analysis of long-term effects of sodium/water reactions, is discussed. The theoretical formulation of the code is described, including the introduction of system matrices for ease of computer programming as a general system code. Also, some typical results of the code predictions for available large scale tests are presented. Test data for the steam generator design with the cover-gas feature and without the cover-gas feature are available and analyzed. The capabilities and limitations of the code are then discussed in light of the comparison between the code prediction and the test data

  2. DCHAIN: A user-friendly computer program for radioactive decay and reaction chain calculations

    International Nuclear Information System (INIS)

    East, L.V.

    1994-05-01

    A computer program for calculating the time-dependent daughter populations in radioactive decay and nuclear reaction chains is described. Chain members can have non-zero initial populations and be produced from the preceding chain member as the result of radioactive decay, a nuclear reaction, or both. As presently implemented, chains can contain up to 15 members. Program input can be supplied interactively or read from ASCII data files. Time units for half-lives, etc. can be specified during data entry. Input values are verified and can be modified if necessary, before used in calculations. Output results can be saved in ASCII files in a format suitable for including in reports or other documents. The calculational method, described in some detail, utilizes a generalized form of the Bateman equations. The program is written in the C language in conformance with current ANSI standards and can be used on multiple hardware platforms
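The generalized Bateman equations mentioned above have a well-known closed form in the simplest case: distinct decay constants and only the first chain member present initially. A minimal Python transcription of that case (the example half-life is illustrative; the program described is written in C and also handles production terms and non-zero initial populations):

```python
import math

# Sketch: classic Bateman solution for the population of the n-th member
# of a decay chain 1 -> 2 -> ... -> n, with distinct decay constants
# and N_1(0) = n0, N_i(0) = 0 for i > 1.
def bateman(n0: float, lambdas: list, t: float) -> float:
    """N_n(t) for the chain defined by the list of decay constants."""
    n = len(lambdas)
    prefactor = n0 * math.prod(lambdas[:-1])
    total = 0.0
    for i in range(n):
        denom = math.prod(
            lambdas[j] - lambdas[i] for j in range(n) if j != i
        )
        total += math.exp(-lambdas[i] * t) / denom
    return prefactor * total

# Single-member check: after one half-life (5 time units), half remains.
lam = [math.log(2) / 5.0]
print(round(bateman(100.0, lam, 5.0), 6))  # 50.0
```

For degenerate (equal or nearly equal) decay constants the denominators vanish, which is one reason real codes use a generalized, numerically stable form.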

  3. Radiolytic oxidation of propane: Computer modeling of the reaction scheme

    Science.gov (United States)

    Gupta, Avinash K.; Hanrahan, Robert J.

    The oxidation of gaseous propane under gamma radiolysis was studied at 100 torr pressure and 25°C, at oxygen pressures from 1 to 15 torr. Major oxygen-containing products and their G-values with 10% added oxygen are as follows: acetone, 0.98; i-propyl alcohol, 0.86; propionaldehyde, 0.43; n-propyl alcohol, 0.11; acrolein, 0.14; and allyl alcohol, 0.038. Minor products include i-butyl alcohol, t-amyl alcohol, n-butyl alcohol, n-amyl alcohol, and i-amyl alcohol. Small yields of i-hexyl alcohol and n-hexyl alcohol were also observed. There was no apparent difference in the G-values at pressures of 50, 100 and 150 torr. When the oxygen concentration was decreased below 5%, the yields of acetone, i-propyl alcohol, and n-propyl alcohol increased, the propionaldehyde yield decreased, and the yields of other products remained constant. The formation of major oxygen-containing products was explained on the basis that the alkyl radicals combine with molecular oxygen to give peroxyl radicals; the peroxyl radicals react with one another to give alkoxyl radicals, which in turn react with one another to form carbonyl compounds and alcohols. The reaction scheme for the formation of major products was examined using computer modeling based on a mechanism involving 28 reactions. Yields could be brought into agreement with the data within experimental error in nearly all cases.
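Kinetic schemes like the 28-reaction mechanism described here are typically integrated as coupled rate equations. The toy two-step sketch below shows the structure and verifies mass balance; the reduced mechanism (R + O2 → RO2, RO2 + RO2 → 2 RO), rate constants, and concentrations are illustrative assumptions, not the paper's model:

```python
# Sketch of the kind of kinetic modeling described above: explicit-Euler
# integration of a toy two-step scheme. Rate constants and initial
# concentrations are illustrative, not the paper's 28-reaction mechanism.
def integrate(k1: float, k2: float, r0: float, o2_0: float,
              dt: float = 1e-4, steps: int = 100_000):
    """Integrate R + O2 -> RO2 and RO2 + RO2 -> 2 RO; return final concs."""
    r, o2, ro2, ro = r0, o2_0, 0.0, 0.0
    for _ in range(steps):
        v1 = k1 * r * o2          # rate of R + O2 -> RO2
        v2 = k2 * ro2 * ro2       # rate of RO2 + RO2 -> 2 RO
        r -= v1 * dt
        o2 -= v1 * dt
        ro2 += (v1 - 2 * v2) * dt
        ro += 2 * v2 * dt
    return r, o2, ro2, ro

r, o2, ro2, ro = integrate(k1=1.0, k2=10.0, r0=1.0, o2_0=2.0)
print(round(r + ro2 + ro, 6))  # R-bearing species are conserved: 1.0
```

Fitting a full mechanism proceeds the same way: integrate the scheme, compare predicted product yields (G-values) with the measured ones, and adjust rate constants within experimental error.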

  4. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Kulesz, James J [ORNL; Abercrombie, Robert K [ORNL; Kruse, Kara L [ORNL

    2015-01-01

A principal tenet of the scientific method is that experiments must be repeatable; repeatability relies on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data come from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation of world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three-layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  5. Evaluating Students' Perceptions and Attitudes toward Computer-Mediated Project-Based Learning Environment: A Case Study

    Science.gov (United States)

    Seet, Ling Ying Britta; Quek, Choon Lang

    2010-01-01

    This research investigated 68 secondary school students' perceptions of their computer-mediated project-based learning environment and their attitudes towards Project Work (PW) using two instruments--Project Work Classroom Learning Environment Questionnaire (PWCLEQ) and Project Work Related Attitudes Instrument (PWRAI). In this project-based…

  6. A scalable computational framework for establishing long-term behavior of stochastic reaction networks.

    Directory of Open Access Journals (Sweden)

    Ankit Gupta

    2014-06-01

    Full Text Available Reaction networks are systems in which the populations of a finite number of species evolve through predefined interactions. Such networks are found as modeling tools in many biological disciplines such as biochemistry, ecology, epidemiology, immunology, systems biology and synthetic biology. It is now well-established that, for small population sizes, stochastic models for biochemical reaction networks are necessary to capture randomness in the interactions. The tools for analyzing such models, however, still lag far behind their deterministic counterparts. In this paper, we bridge this gap by developing a constructive framework for examining the long-term behavior and stability properties of the reaction dynamics in a stochastic setting. In particular, we address the problems of determining ergodicity of the reaction dynamics, which is analogous to having a globally attracting fixed point for deterministic dynamics. We also examine when the statistical moments of the underlying process remain bounded with time and when they converge to their steady state values. The framework we develop relies on a blend of ideas from probability theory, linear algebra and optimization theory. We demonstrate that the stability properties of a wide class of biological networks can be assessed from our sufficient theoretical conditions that can be recast as efficient and scalable linear programs, well-known for their tractability. It is notably shown that the computational complexity is often linear in the number of species. We illustrate the validity, the efficiency and the wide applicability of our results on several reaction networks arising in biochemistry, systems biology, epidemiology and ecology. The biological implications of the results as well as an example of a non-ergodic biological network are also discussed.
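The notions of ergodicity and bounded moments discussed above can be seen in the simplest reaction network, the birth-death process ∅ → X (rate k) and X → ∅ (propensity γ·x), whose stationary distribution is Poisson with mean k/γ. The Gillespie simulation below is an illustrative sketch of that long-term behavior, not the paper's linear-programming framework:

```python
import random

# Sketch: Gillespie (stochastic simulation algorithm) trajectory of the
# ergodic birth-death network 0 -> X (rate k), X -> 0 (propensity gamma*x).
# For an ergodic network the long-run time average approaches k/gamma.
def ssa_mean(k: float, gamma: float, t_end: float, seed: int = 1) -> float:
    """Time-averaged copy number of X over one trajectory of length t_end."""
    rng = random.Random(seed)
    t, x, weighted = 0.0, 0, 0.0
    while True:
        a_total = k + gamma * x        # total propensity (always >= k > 0)
        dt = rng.expovariate(a_total)  # exponential waiting time
        if t + dt >= t_end:
            weighted += x * (t_end - t)
            break
        weighted += x * dt
        t += dt
        # choose birth with probability k / a_total, otherwise death
        x += 1 if rng.random() * a_total < k else -1
    return weighted / t_end

print(ssa_mean(k=10.0, gamma=1.0, t_end=2000.0))  # close to k/gamma = 10
```

Ergodicity is what licenses replacing ensemble averages by such single-trajectory time averages; the paper's contribution is a scalable way to certify it for much larger networks.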

  7. A computational study of the Diels-Alder reactions between 2,3-dibromo-1,3-butadiene and maleic anhydride

    Science.gov (United States)

    Rivero, Uxía; Meuwly, Markus; Willitsch, Stefan

    2017-09-01

    The neutral and cationic Diels-Alder-type reactions between 2,3-dibromo-1,3-butadiene and maleic anhydride have been computationally explored as the first step of a combined experimental and theoretical study. Density functional theory calculations show that the neutral reaction is concerted while the cationic reaction can be either concerted or stepwise. Further isomerizations of the Diels-Alder products have been studied in order to predict possible fragmentation pathways in gas-phase experiments. Rice-Ramsperger-Kassel-Marcus (RRKM) calculations suggest that under single-collision experimental conditions the neutral product may reform the reactants and the cationic product will most likely eliminate CO2.

  8. Ozone initiated reactions and human comfort in indoor environments

    DEFF Research Database (Denmark)

    Tamas, Gyöngyi

    2006-01-01

Chemical reactions between ozone and pollutants commonly found indoors have been suggested to cause adverse health and comfort effects among building occupants. Of special interest are reactions with terpenes and other pollutants containing unsaturated carbon-carbon bonds that are fast enough to occur under normal conditions in various indoor settings. These reactions are known to occur both in the gas phase (homogeneous reactions) and on the surfaces of building materials (heterogeneous reactions), producing a number of compounds that can be orders of magnitude more odorous and irritating than their precursors. The present thesis investigates the effects of ozone-initiated reactions with limonene and with various interior surfaces, including those associated with people, on short-term sensory responses. The evaluations were conducted using a perceived air quality (PAQ) method introduced by Fanger (1988…

  9. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  10. Golden rule kinetics of transfer reactions in condensed phase: The microscopic model of electron transfer reactions in disordered solid matrices

    Science.gov (United States)

    Basilevsky, M. V.; Odinokov, A. V.; Titov, S. V.; Mitina, E. A.

    2013-12-01

The algorithm for a theoretical calculation of transfer reaction rates for light quantum particles (i.e., the electron and H-atom transfers) in non-polar solid matrices is formulated and justified. The mechanism postulated involves a local mode (an either intra- or inter-molecular one) serving as a mediator which accomplishes the energy exchange between the reacting high-frequency quantum mode and the phonon modes belonging to the environment. This approach uses as a background the Fermi golden rule beyond the usually applied spin-boson approximation. The dynamical treatment rests on the one-dimensional version of the standard quantum relaxation equation for the reduced density matrix, which describes the frequency fluctuation spectrum for the local mode under consideration. The temperature dependence of a reaction rate is controlled by the dimensionless parameter ξ0 = ℏω0/kBT where ω0 is the frequency of the local mode and T is the temperature. The realization of the computational scheme is different for the high/intermediate (ξ0 ≤ 1) and low (ξ0 ≫ 1) temperature ranges. For the first (quasi-classical) kinetic regime, the Redfield approximation to the solution of the relaxation equation proved to be sufficient and efficient in practical applications. The study of the essentially quantum-mechanical low-temperature kinetic regime in its asymptotic limit requires the implementation of the exact relaxation equation. The coherent mechanism providing a non-vanishing reaction rate has been revealed when T → 0. An accurate computational methodology for the cross-over kinetic regime needs a further elaboration. The original model of the hopping mechanism for electronic conduction in photosensitive organic materials is considered, based on the above techniques. The electron transfer (ET) in active centers of such systems proceeds via local intra- and intermolecular modes. The active modes, as a rule, operate beyond the kinetic regimes, which are usually postulated in the existing theories of the ET. Our alternative dynamic ET model for local modes immersed in the continuum harmonic medium is formulated for both classical and quantum regimes, and accounts explicitly for the mode/medium interaction. The kinetics of the energy exchange between the local ET subsystem and the surrounding environment essentially determine the total ET rate. The efficient computer code for rate computations is elaborated on. The computations are available for a wide range of system parameters, such as the temperature, external field, local mode frequency, and characteristics of mode/medium interaction. The relation of the

  11. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

Full Text Available Optimizing energy systems is a problem that scientists have studied extensively for many years. The problem can be approached from different perspectives and with different computer programs. This paper characterizes one of the calculation methods used in Europe for modelling and optimizing power systems. The method is based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  12. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, both of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  13. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki

    2009-01-01

A new computational methodology of sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with a phase of sodium as a reactant. One is the surface reaction model in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat will lead to a vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium. This is designated as the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent gas, and multiphase thermal hydraulics simulation method with compressibility (named the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, the maximum gas temperature of approximately 1,300degC is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum temperature of the mass weighted average in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing a theoretical and mechanical modeling of secondary failure propagation of heat transfer tubes due to overheating rupture and wastage. (author)

  14. A Novel Biometric Approach for Authentication In Pervasive Computing Environments

    OpenAIRE

    Rachappa,; Divyajyothi M G; D H Rao

    2016-01-01

The paradigm of embedding computing devices in our surrounding environment has gained more interest in recent days. Along with contemporary technology come challenges, the most important being security and privacy. Keeping in mind the compactness and memory constraints of pervasive devices, the biometric techniques proposed for identification should be robust and dynamic. In this work, we propose an emerging scheme that is based on few exclusive human traits and characte...

  15. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of softw...

  16. Massive calculations of electrostatic potentials and structure maps of biopolymers in a distributed computing environment

    International Nuclear Information System (INIS)

    Akishina, T.P.; Ivanov, V.V.; Stepanenko, V.A.

    2013-01-01

    Among the key factors determining the processes of transcription and translation are the distributions of the electrostatic potentials of DNA, RNA and proteins. Calculations of electrostatic distributions and structure maps of biopolymers on computers are time consuming and require large computational resources. We developed the procedures for organization of massive calculations of electrostatic potentials and structure maps for biopolymers in a distributed computing environment (several thousands of cores).
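
    The distribution pattern this record relies on is simple: the potential at every grid point is independent of every other, so the charge list is broadcast and the grid is split among workers. A minimal sketch (hypothetical charges and grid; a thread pool stands in for the thousands of cluster cores mentioned in the record):

```python
import math
from concurrent.futures import ThreadPoolExecutor

# Hypothetical point charges (q, x, y, z) in reduced units (Coulomb k = 1).
CHARGES = [(1.0, 0.0, 0.0, 0.0), (-1.0, 1.0, 0.0, 0.0)]

def potential(point):
    """Coulomb potential at a single grid point."""
    px, py, pz = point
    return sum(q / math.dist((px, py, pz), (x, y, z))
               for q, x, y, z in CHARGES)

def potential_map(points, workers=4):
    # Every grid point is independent, so the map distributes trivially
    # across local workers; at cluster scale the same map would be
    # dispatched over MPI ranks or batch jobs rather than threads.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(potential, points))
```

    The embarrassingly parallel structure is what makes the "several thousands of cores" scaling in the record feasible: no communication is needed between workers beyond gathering the results.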

  17. Vanderbilt University: Campus Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    Despite the decentralized nature of computing at Vanderbilt, there is significant evidence of cooperation and use of each other's resources by the various computing entities. Planning for computing occurs in every school and department. Caravan, a campus-wide network, is described. (MLW)

  18. Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment

    Science.gov (United States)

    Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.

    2013-12-01

    Dust storms have serious negative impacts on the environment, human health, and assets. Continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high performance computing is widely adopted. By partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in a parallel fashion, computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, the task allocation method is the key factor that may determine the feasibility of parallelization. The allocation algorithm needs to carefully weigh the computing cost and communication cost of each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically, 1) In order to get optimized solutions, a…
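
    The abstract is truncated before it describes the two algorithms, but the baseline it compares against, plus a simple cost-aware alternative, can be sketched. The greedy rule below (longest-processing-time-first) is a generic load-balancing heuristic, not the authors' algorithm; the costs are hypothetical per-subdomain compute costs and communication cost is ignored:

```python
def even_allocation(costs, n_nodes):
    """Assign subdomains round-robin, ignoring per-subdomain cost."""
    nodes = [[] for _ in range(n_nodes)]
    for i, _ in enumerate(costs):
        nodes[i % n_nodes].append(i)
    return nodes

def greedy_allocation(costs, n_nodes):
    """Longest-processing-time-first: always give the next-largest
    subdomain to the currently least-loaded node."""
    nodes = [[] for _ in range(n_nodes)]
    load = [0.0] * n_nodes
    for i in sorted(range(len(costs)), key=lambda i: -costs[i]):
        j = load.index(min(load))
        nodes[j].append(i)
        load[j] += costs[i]
    return nodes

def makespan(alloc, costs):
    """Execution time of the slowest node, i.e. of the whole job."""
    return max(sum(costs[i] for i in node) for node in alloc)
```

    On costs [7, 5, 4, 3, 1] and two nodes, the even split yields a makespan of 12 while the greedy split yields 10, illustrating why cost-aware allocation matters even before communication cost is considered.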

  19. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific application scheduling technique using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement rate on the makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution, more suitable for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.
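
    The MinMin baseline mentioned in this record is easy to state precisely: repeatedly pick the task whose earliest possible completion time is smallest and bind it to the machine that achieves it. A sketch with a hypothetical expected-time-to-compute (ETC) matrix, not data from the paper:

```python
def minmin_schedule(etc):
    """MinMin heuristic for independent tasks on heterogeneous machines.
    etc[t][m] is the expected execution time of task t on machine m."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines        # when each machine becomes free
    unscheduled = set(range(n_tasks))
    order = []
    while unscheduled:
        # For every remaining task, find its best machine right now.
        best = {t: min(range(n_machines), key=lambda m: ready[m] + etc[t][m])
                for t in unscheduled}
        # Schedule the task whose minimum completion time is smallest.
        t = min(unscheduled, key=lambda t: ready[best[t]] + etc[t][best[t]])
        m = best[t]
        ready[m] += etc[t][m]
        order.append((t, m))
        unscheduled.remove(t)
    return order, max(ready)  # schedule and its makespan
```

    Metaheuristics like GBLCA, GA, or ACO search the space of such task-to-machine bindings globally instead of committing greedily, which is how they can beat MinMin on makespan.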

  1. Reaction time for trimolecular reactions in compartment-based reaction-diffusion models

    Science.gov (United States)

    Li, Fei; Chen, Minghan; Erban, Radek; Cao, Yang

    2018-05-01

    Trimolecular reaction models are investigated in the compartment-based (lattice-based) framework for stochastic reaction-diffusion modeling. The formulae for the first collision time and the mean reaction time are derived for the case where three molecules are present in the solution under periodic boundary conditions. For the case of reflecting boundary conditions, similar formulae are obtained using a computer-assisted approach. The accuracy of these formulae is further verified through comparison with numerical results. The presented derivation is based on the first passage time analysis of Montroll [J. Math. Phys. 10, 753 (1969)]. Montroll's results for two-dimensional lattice-based random walks are adapted and applied to compartment-based models of trimolecular reactions, which are studied in one-dimensional or pseudo one-dimensional domains.
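
    A brute-force Monte Carlo version of the quantity studied here, the first time three walkers meet in one compartment of a periodic one-dimensional lattice, is straightforward, and it is the kind of numerical result the derived formulae are verified against. A sketch, not the authors' code:

```python
import random

def first_collision_time(n_sites, steps=10_000, rng=random):
    """Simulate three independent random walkers on a periodic 1D
    lattice; return the first step at which all three occupy the same
    compartment (a 'trimolecular collision'), or None if none occurs."""
    pos = [rng.randrange(n_sites) for _ in range(3)]
    for step in range(1, steps + 1):
        for i in range(3):
            pos[i] = (pos[i] + rng.choice((-1, 1))) % n_sites
        if pos[0] == pos[1] == pos[2]:
            return step
    return None

def mean_collision_time(n_sites, trials=200, seed=1):
    """Monte Carlo estimate of the mean first collision time."""
    rng = random.Random(seed)
    times = [first_collision_time(n_sites, rng=rng) for _ in range(trials)]
    times = [t for t in times if t is not None]
    return sum(times) / len(times)
```

    The closed-form expressions in the paper replace exactly this kind of costly sampling with an exact first-passage-time calculation.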

  2. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume aims at a wide range of readers and researchers in the area of Big Data by presenting the recent advances in the field of Big Data Analysis, as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and recent Techniques and Environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting Parallel, Grid, and Cloud computing environments.

  3. Thermonuclear Reaction Rate Libraries and Software Tools for Nuclear Astrophysics Research

    International Nuclear Information System (INIS)

    Smith, Michael S.; Cyburt, Richard; Schatz, Hendrik; Smith, Karl; Warren, Scott; Ferguson, Ryan; Wiescher, Michael; Lingerfelt, Eric; Buckner, Kim; Nesaraja, Caroline D.

    2008-01-01

    Thermonuclear reaction rates are a crucial input for simulating a wide variety of astrophysical environments. A new collaboration has been formed to ensure that astrophysical modelers have access to reaction rates based on the most recent experimental and theoretical nuclear physics information. To reach this goal, a new version of the REACLIB library has been created by the Joint Institute for Nuclear Astrophysics (JINA), now available online at http://www.nscl.msu.edu/~nero/db. A complementary effort is the development of software tools in the Computational Infrastructure for Nuclear Astrophysics, online at nucastrodata.org, to streamline, manage, and access the workflow of the reaction evaluations from their initiation to peer review to incorporation into the library. Details of these new projects will be described
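
    For readers unfamiliar with REACLIB, rates in the library are stored as sets of seven coefficients of a standard temperature parameterization (T9 is the temperature in GK), and the total rate is a sum over such sets. A single set is evaluated roughly as below; any coefficients used here are synthetic, not taken from the library:

```python
import math

def reaclib_rate(a, t9):
    """Evaluate one seven-coefficient REACLIB rate set at temperature
    t9 (in GK): exp(a0 + a1/T9 + a2*T9^(-1/3) + a3*T9^(1/3)
                    + a4*T9 + a5*T9^(5/3) + a6*ln(T9))."""
    exponent = (a[0]
                + a[1] / t9
                + a[2] * t9 ** (-1.0 / 3.0)
                + a[3] * t9 ** (1.0 / 3.0)
                + a[4] * t9
                + a[5] * t9 ** (5.0 / 3.0)
                + a[6] * math.log(t9))
    return math.exp(exponent)
```

    The fixed functional form is what lets astrophysical network codes evaluate thousands of rates per zone per timestep cheaply.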

  4. Extension of a Kinetic-Theory Approach for Computing Chemical-Reaction Rates to Reactions with Charged Particles

    Science.gov (United States)

    Liechty, Derek S.; Lewis, Mark J.

    2010-01-01

    Recently introduced molecular-level chemistry models that predict equilibrium and nonequilibrium reaction rates using only kinetic theory and fundamental molecular properties (i.e., no macroscopic reaction rate information) are extended to include reactions involving charged particles and electronic energy levels. The proposed extensions include ionization reactions, exothermic associative ionization reactions, endothermic and exothermic charge exchange reactions, and other exchange reactions involving ionized species. The extensions are shown to agree favorably with the measured Arrhenius rates for near-equilibrium conditions.
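
    The measured rates the extended kinetic-theory models are compared against follow the modified Arrhenius form k(T) = A·T^n·exp(−Ea/(R·T)). Evaluating it is a one-liner; the parameters below are placeholders, not values from the paper:

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def arrhenius(A, n, Ea, T):
    """Modified Arrhenius rate k(T) = A * T**n * exp(-Ea / (R*T)).
    A: pre-exponential factor, n: temperature exponent,
    Ea: activation energy in J/mol, T: temperature in K."""
    return A * T**n * math.exp(-Ea / (R * T))
```

    For Ea > 0 the rate rises with temperature, which is the near-equilibrium behavior the molecular-level models are shown to reproduce.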

  5. Reaction Norms in Natural Conditions: How Does Metabolic Performance Respond to Weather Variations in a Small Endotherm Facing Cold Environments?

    Science.gov (United States)

    Petit, Magali; Vézina, François

    2014-01-01

    Reaction norms reflect an organism's capacity to adjust its phenotype to the environment and allow for identifying trait values associated with physiological limits. However, reaction norms of physiological parameters are mostly unknown for endotherms living in natural conditions. Black-capped chickadees (Poecile atricapillus) increase their metabolic performance during winter acclimatization and are thus a good model for measuring reaction norms in the wild. We repeatedly measured basal (BMR) and summit (Msum) metabolism in chickadees to characterize, for the first time in a free-living endotherm, reaction norms of these parameters across the natural range of weather variation. BMR varied between individuals and was weakly and negatively related to minimal temperature. Msum varied with minimal temperature following a Z-shaped curve, increasing linearly between 24°C and −10°C, and changed with absolute humidity following a U-shaped relationship. These results suggest that thermal exchanges with the environment have minimal effects on maintenance costs, which may be individual-dependent, while thermogenic capacity responds to body heat loss. Our results also suggest that BMR and Msum respond to different and likely independent constraints. PMID:25426860

  6. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.
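
    The core steering idea, stripped of the RealityGrid and Blue Gene specifics, is a simulation loop that re-reads its control parameters at run time instead of fixing them at launch. A toy file-polling sketch (all names hypothetical; the real framework uses a steering library and relay server, not a JSON file):

```python
import json
import os

def read_steering(path):
    """Poll a steering file for updated control parameters; a
    hypothetical stand-in for the steering-library calls."""
    if not os.path.exists(path):
        return {}
    with open(path) as f:
        return json.load(f)

def run_simulation(path, steps=5, dt=1.0):
    """Toy time-stepping loop that re-reads 'dt' at every step, the way
    a steered application re-reads its control parameters at run time."""
    t = 0.0
    for _ in range(steps):
        dt = read_steering(path).get("dt", dt)  # steered parameter
        t += dt
    return t
```

    A steering client changes the run's behavior simply by updating the shared parameter source while the loop is executing; the relay server described in the record plays that role for compute nodes on private networks.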

  7. Development and implementation of a critical pathway for prevention of adverse reactions to contrast media for computed tomography

    International Nuclear Information System (INIS)

    Jang, Keun Jo; Kweon, Dae Cheol; Kim, Myeong Goo; Yoo, Beong Gyu

    2007-01-01

    The purpose of this study is to develop a critical pathway (CP) for the prevention of adverse reactions to contrast media for computed tomography. The CP was developed and implemented by a multidisciplinary group at Seoul National University Hospital. The CP was applied to CT patients; patients who underwent CT scanning were included in the CP group from March 2004. The satisfaction of the patients in the CP group was compared with that of non-CP groups. We also investigated the degree of satisfaction among the radiological technologists and nurses. The degree of patient satisfaction with the care process increased for patient information (24%), prevention of adverse reactions to contrast media (19%), pre-cognitive effect of adverse reactions to contrast media (39%) and information about adverse reactions to contrast media (19%). This CP program can be used as a patient care tool for reducing adverse reactions to contrast media and increasing the efficiency of the care process in CT examination settings.

  8. Development and implementation of a critical pathway for prevention of adverse reactions to contrast media for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Keun Jo [Presbyterian Medical Center, Seoul (Korea, Republic of); Kweon, Dae Cheol; Kim, Myeong Goo [Seoul National University Hospital, Seoul (Korea, Republic of); Yoo, Beong Gyu [Wonkwang Health Science College, Iksan (Korea, Republic of)

    2007-03-15

    The purpose of this study is to develop a critical pathway (CP) for the prevention of adverse reactions to contrast media for computed tomography. The CP was developed and implemented by a multidisciplinary group at Seoul National University Hospital. The CP was applied to CT patients; patients who underwent CT scanning were included in the CP group from March 2004. The satisfaction of the patients in the CP group was compared with that of non-CP groups. We also investigated the degree of satisfaction among the radiological technologists and nurses. The degree of patient satisfaction with the care process increased for patient information (24%), prevention of adverse reactions to contrast media (19%), pre-cognitive effect of adverse reactions to contrast media (39%) and information about adverse reactions to contrast media (19%). This CP program can be used as a patient care tool for reducing adverse reactions to contrast media and increasing the efficiency of the care process in CT examination settings.

  9. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface
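
    The accumulation REMS performs can be caricatured in a few lines: at each simulation time step, look up the exposure rate at the worker's current position and add rate × dt to the running dose. The inverse-square point-source model below is a deliberate oversimplification of the transport-code databases the system actually uses:

```python
import math

def dose_rate(source_strength, distance, attenuation=1.0):
    """Dose rate from a point source: inverse-square falloff times a
    shielding attenuation factor (both illustrative simplifications)."""
    return attenuation * source_strength / (4.0 * math.pi * distance**2)

def accumulated_dose(path, source_strength, dt=1.0):
    """Integrate dose along a worker's path, one source distance per
    time step, mirroring how a simulated human model accrues dose."""
    return sum(dose_rate(source_strength, d) * dt for d in path)
```

    Timing, distance, and shielding each enter this sum directly, which is why modeling them accurately in the workcell simulation matters for the final dose estimate.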

  10. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.

  11. Reaction energetics on long-range corrected density functional theory: Diels-Alder reactions.

    Science.gov (United States)

    Singh, Raman K; Tsuneda, Takao

    2013-02-15

    The possibility of quantitative reaction analysis on the orbital energies of long-range corrected density functional theory (LC-DFT) is presented. First, we calculated the Diels-Alder reaction enthalpies that have been poorly given by conventional functionals including B3LYP functional. As a result, it is found that the long-range correction drastically improves the reaction enthalpies. The barrier height energies were also computed for these reactions. Consequently, we found that dispersion correlation correction is also crucial to give accurate barrier height energies. It is, therefore, concluded that both long-range exchange interactions and dispersion correlations are essentially required in conventional functionals to investigate Diels-Alder reactions quantitatively. After confirming that LC-DFT accurately reproduces the orbital energies of the reactant and product molecules of the Diels-Alder reactions, the global hardness responses, the halves of highest occupied molecular orbital (HOMO)-lowest unoccupied molecular orbital (LUMO) energy gaps, along the intrinsic reaction coordinates of two Diels-Alder reactions were computed. We noticed that LC-DFT results satisfy the maximum hardness rule for overall reaction paths while conventional functionals violate this rule on the reaction pathways. Furthermore, our results also show that the HOMO-LUMO gap variations are close to the reaction enthalpies for these Diels-Alder reactions. Based on these results, we foresee quantitative reaction analysis on the orbital energies. Copyright © 2012 Wiley Periodicals, Inc.
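
    The quantity tracked along the reaction coordinate, global hardness, is just half the HOMO-LUMO gap; checking the maximum hardness rule then amounts to asking where along the path the hardness profile dips. A schematic check (orbital energies below are invented, not LC-DFT results):

```python
def global_hardness(e_homo, e_lumo):
    """Global hardness as half the HOMO-LUMO gap (energies in eV)."""
    return (e_lumo - e_homo) / 2.0

def satisfies_maximum_hardness(path):
    """Loose reading of the maximum hardness rule: hardness should be
    lowest at an interior point of the path (the transition-state
    region), not at the reactant or product endpoints.
    path: list of (e_homo, e_lumo) pairs along the reaction coordinate."""
    hardness = [global_hardness(h, l) for h, l in path]
    ts = min(range(len(hardness)), key=hardness.__getitem__)
    return 0 < ts < len(hardness) - 1
```

    With accurate orbital energies, as LC-DFT provides, this cheap gap-based profile tracks the reaction energetics, which is what motivates reaction analysis on orbital energies alone.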

  12. The Plant-Window System: A framework for an integrated computing environment at advanced nuclear power plants

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1997-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The extensive use of computer technology in advanced reactor designs provides the opportunity to greatly expand the capability to obtain, analyze, and present data about the plant to station personnel. However, to support highly efficient and increasingly safe operation of nuclear power plants, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications, in a manner that can be easily understood and used, to the proper users throughout the plant. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed On Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications within a common computing environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces so as to define a flexible computing environment for both current generation nuclear power plants and advanced reactor designs

  13. A Drawing and Multi-Representational Computer Environment for Beginners' Learning of Programming Using C: Design and Pilot Formative Evaluation

    Science.gov (United States)

    Kordaki, Maria

    2010-01-01

    This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for programming using C using Geometrical Objects) for the learning of computer programming using C by beginners. In its design, constructivist and social learning theories were taken into…

  14. Integration of Computational and Preparative Techniques to Demonstrate Physical Organic Concepts in Synthetic Organic Chemistry: An Example Using Diels-Alder Reaction

    Science.gov (United States)

    Palmer, David R. J.

    2004-01-01

    The Diels-Alder reaction is used as an example for showing the integration of computational and preparative techniques, which help in demonstrating the physical organic concepts in synthetic organic chemistry. These experiments show that the students should not accept the computational results without questioning them and in many Diels-Alder…

  15. Using high performance interconnects in a distributed computing and mass storage environment

    International Nuclear Information System (INIS)

    Ernst, M.

    1994-01-01

    Detector Collaborations of the HERA Experiments typically involve more than 500 physicists from a few dozen institutes. These physicists require access to large amounts of data in a fully transparent manner. Important issues include Distributed Mass Storage Management Systems in a Distributed and Heterogeneous Computing Environment. At the very center of a distributed system, including tens of CPUs and network-attached mass storage peripherals, are the communication links. Today scientists are witnessing an integration of computing and communication technology with the 'network' becoming the computer. This contribution reports on a centrally operated computing facility for the HERA Experiments at DESY, including Symmetric Multiprocessor Machines (84 Processors), presently more than 400 GByte of magnetic disk and 40 TB of automated tape storage, tied together by a HIPPI 'network'. Focusing on the High Performance Interconnect technology, details will be provided about the HIPPI-based 'Backplane' configured around a 20 Gigabit/s Multi Media Router and the performance and efficiency of the related computer interfaces.

  16. Synthesis of antimicrobial silver nanoparticles through a photomediated reaction in an aqueous environment

    Directory of Open Access Journals (Sweden)

    Banasiuk R

    2016-01-01

    Full Text Available Rafał Banasiuk,1,* Joanna E Frackowiak,2,* Marta Krychowiak,1 Marta Matuszewska,1 Anna Kawiak,1 Magdalena Ziabka,3 Zofia Lendzion-Bielun,4 Magdalena Narajczyk,5 Aleksandra Krolicka1 1Department of Biotechnology, Intercollegiate Faculty of Biotechnology, University of Gdansk and Medical University of Gdansk, 2Department of Pathophysiology, Medical University of Gdansk, Gdansk, 3Faculty of Materials Science and Ceramics, Department of Ceramics and Refractories, AGH-University of Science and Technology, Kraków, 4Institute of Chemical and Environment Engineering, West Pomeranian University of Technology, Szczecin, 5Faculty of Biology, Laboratory of Electron Microscopy, University of Gdansk, Gdansk, Poland *These authors contributed equally to this work Abstract: A fast, economical, and reproducible method for nanoparticle synthesis has been developed in our laboratory. The reaction is performed in an aqueous environment and utilizes light emitted by commercially available 1 W light-emitting diodes (λ = 420 nm) as the catalyst. This method does not require nanoparticle seeds or toxic chemicals. The irradiation process is carried out for a period of up to 10 minutes, significantly reducing the time required for synthesis as well as the environmental impact. By modulating various reaction parameters, silver nanoparticles were obtained that were predominantly either spherical or cubic. The produced nanoparticles demonstrated strong antimicrobial activity toward the examined bacterial strains. Additionally, testing the effect of silver nanoparticles on the human keratinocyte cell line and human peripheral blood mononuclear cells revealed that their cytotoxicity may be limited by modulating the employed concentrations of nanoparticles. Keywords: antimicrobial activity, green synthesis, nanocubes, nanospheres

  17. Development of IAEA nuclear reaction databases and services

    Energy Technology Data Exchange (ETDEWEB)

    Zerkin, V.; Trkov, A. [International Atomic Energy Agency, Dept. of Nuclear Sciences and Applications, Vienna (Austria)

    2008-07-01

    From mid-2004 onwards, the major nuclear reaction databases (EXFOR, CINDA and ENDF) and services (Web and CD-ROM retrieval systems and specialized applications) have been functioning within a modern computing environment as multi-platform software, working under several operating systems with relational databases. Subsequent work at the IAEA has focused on three areas of development: revision and extension of the contents of the databases; extension and improvement of the functionality and integrity of the retrieval systems; and development of software for database maintenance and system deployment. (authors)

  18. Golden rule kinetics of transfer reactions in condensed phase: The microscopic model of electron transfer reactions in disordered solid matrices

    Energy Technology Data Exchange (ETDEWEB)

    Basilevsky, M. V.; Mitina, E. A. [Photochemistry Center, Russian Academy of Sciences, 7a, Novatorov ul., Moscow (Russian Federation); Odinokov, A. V. [Photochemistry Center, Russian Academy of Sciences, 7a, Novatorov ul., Moscow (Russian Federation); National Research Nuclear University “MEPhI,” 31, Kashirskoye shosse, Moscow (Russian Federation); Titov, S. V. [Karpov Institute of Physical Chemistry, 3-1/12, Building 6, Obuha pereulok, Moscow (Russian Federation)

    2013-12-21

    The algorithm for a theoretical calculation of transfer reaction rates for light quantum particles (i.e., the electron and H-atom transfers) in non-polar solid matrices is formulated and justified. The mechanism postulated involves a local mode (either an intra- or inter-molecular one) serving as a mediator which accomplishes the energy exchange between the reacting high-frequency quantum mode and the phonon modes belonging to the environment. This approach uses as a background the Fermi golden rule beyond the usually applied spin-boson approximation. The dynamical treatment rests on the one-dimensional version of the standard quantum relaxation equation for the reduced density matrix, which describes the frequency fluctuation spectrum for the local mode under consideration. The temperature dependence of a reaction rate is controlled by the dimensionless parameter ξ₀ = ℏω₀/(k_B T), where ω₀ is the frequency of the local mode and T is the temperature. The realization of the computational scheme is different for the high/intermediate (ξ₀ < 1–3) and low (ξ₀ ≫ 1) temperature ranges. For the first (quasi-classical) kinetic regime, the Redfield approximation to the solution of the relaxation equation proved to be sufficient and efficient in practical applications. The study of the essentially quantum-mechanical low-temperature kinetic regime in its asymptotic limit requires the implementation of the exact relaxation equation. The coherent mechanism providing a non-vanishing reaction rate has been revealed when T → 0. An accurate computational methodology for the cross-over kinetic regime needs further elaboration. The original model of the hopping mechanism for electronic conduction in photosensitive organic materials is considered, based on the above techniques. The electron transfer (ET) in active centers of such systems proceeds via local intra- and intermolecular modes. The active modes, as a rule, operate beyond the…
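
    The regime selection described here depends only on the dimensionless parameter ξ₀ = ℏω₀/(k_B T), which is trivial to evaluate; the threshold below mirrors the abstract's rough boundary between the quasi-classical and low-temperature regimes, and is illustrative rather than a value from the paper:

```python
def xi0(omega0, temperature):
    """Dimensionless parameter xi0 = hbar*omega0 / (kB*T) that selects
    the kinetic regime (omega0 in rad/s, temperature in K)."""
    HBAR = 1.054571817e-34  # reduced Planck constant, J*s
    KB = 1.380649e-23       # Boltzmann constant, J/K
    return HBAR * omega0 / (KB * temperature)

def regime(x):
    """Rough classification following the abstract's thresholds:
    quasi-classical for xi0 up to ~1-3, quantum for xi0 >> 1."""
    return "quasi-classical" if x < 3.0 else "quantum (low-temperature)"
```

    For a given local-mode frequency, cooling the matrix drives ξ₀ upward and pushes the system from the Redfield-tractable regime into the one requiring the exact relaxation equation.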

  19. A web-based, collaborative modeling, simulation, and parallel computing environment for electromechanical systems

    Directory of Open Access Journals (Sweden)

    Xiaoliang Yin

    2015-03-01

    Full Text Available A complex electromechanical system is usually composed of multiple components from different domains, including mechanical, electronic, hydraulic, control, and so on. Modeling and simulation of electromechanical systems on a unified platform is one of the research hotspots in systems engineering at present, and it is also the development trend in the design of complex electromechanical systems. The unified modeling techniques and tools based on the Modelica language provide a satisfactory solution. To meet the requirements of collaborative modeling, simulation, and parallel computing for complex electromechanical systems based on Modelica, a general web-based modeling and simulation prototype environment, namely WebMWorks, is designed and implemented. Based on rich Internet application technologies, an interactive graphical user interface for modeling and post-processing in the web browser was implemented; with the collaborative design module, the environment supports top-down, concurrent modeling and team cooperation; additionally, a service-oriented architecture was applied to supply compiling and solving services that run on cloud-like servers, so the environment can manage and dispatch large-scale simulation tasks in parallel on multiple computing servers simultaneously. An engineering application concerning a pure electric vehicle is tested on WebMWorks. The results of the simulation and parametric experiment demonstrate that the tested web-based environment can effectively shorten the design cycle of a complex electromechanical system.

  20. Secondary School Students' Attitudes towards Mathematics Computer-Assisted Instruction Environment in Kenya

    Science.gov (United States)

    Mwei, Philip K.; Wando, Dave; Too, Jackson K.

    2012-01-01

    This paper reports the results of research conducted in six classes (Form IV) with 205 students with a sample of 94 respondents. Data represent students' statements that describe (a) the role of Mathematics teachers in a computer-assisted instruction (CAI) environment and (b) effectiveness of CAI in Mathematics instruction. The results indicated…

  1. The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment

    Science.gov (United States)

    Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne

    2013-01-01

    The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text…

  2. Analysis of insulation material deterioration under the LOCA simulated environment on the basis of reaction kinetics

    Energy Technology Data Exchange (ETDEWEB)

    Okada, Sohei; Kusama, Yasuo; Ito, Masayuki; Yagi, Toshiaki; Yoshikawa, Masato (Japan Atomic Energy Research Inst., Takasaki, Gunma. Takasaki Radiation Chemistry Research Establishment)

    1982-12-01

    In the type testing of electric cables installed in reactor containment vessels, it is quite difficult to carry out, each time, testing lasting over a year to simulate the accident environment of radiation and high-temperature steam. The two requirements that seem more realistic than the above-mentioned testing method are inconsistent with each other. To solve this problem, a general rule of deterioration, expressed as an equation, is necessary; it enables the extrapolation showing that a short-term test stands on the safe side. The authors have tried to numerically analyze the change of mechanical characteristics of ethylene-propylene rubber (EPR) and Hypalon, which are important as materials for PH cables (fire-retardant, EP rubber-insulated, chlorosulfonated polyethylene-sheathed cables), in a complex environment of radiation, steam and chemical spray simulating PWR LOCA conditions. In this report, a method is proposed to analyze and estimate the properties by the regression analysis technique on the basis of reaction kinetics, and the analyzed results are described in the order of experiment, analysis method, results and discussion. The deterioration of the elongation P = e/e₀ of EPR and Hypalon in the above-described complex environment can be represented by the equation −dP/dt = KPⁿ. The exponent n varied depending on whether or not air was contained in that environment, suggesting that different reactions are dominant in the two conditions. For EPR, n was close to 2 if air was not contained and close to 1 if air was contained in the system.
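    The rate law −dP/dt = KPⁿ separates and integrates in closed form; a minimal sketch (the K and P₀ values are illustrative defaults, not fitted cable data):

```python
import math

def elongation(t, p0=1.0, k=0.01, n=1):
    """Integrate -dP/dt = K * P**n for the relative elongation P = e/e0.

    n = 1 gives exponential decay; n != 1 gives the power-law form
    P**(1-n) = P0**(1-n) + (n-1)*K*t.  k and p0 are illustrative.
    """
    if n == 1:
        return p0 * math.exp(-k * t)
    return (p0 ** (1 - n) + (n - 1) * k * t) ** (1.0 / (1 - n))
```

    For n = 2 (the air-free EPR case) halving P from P₀ = 1 takes t = 1/K, while for n = 1 (with air) the half-life is ln 2 / K.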

  3. Nuclides.net: A computational environment for nuclear data and applications in radioprotection and radioecology

    International Nuclear Information System (INIS)

    Berthou, V.; Galy, J.; Leutzenkirchen, K.

    2004-01-01

    An interactive multimedia tool, Nuclides.net, has been developed at the Institute for Transuranium Elements. The Nuclides.net 'integrated environment' is a suite of computer programs ranging from a powerful user-friendly interface, which allows the user to navigate the nuclides chart and explore the properties of nuclides, to various computational modules for decay calculations, dosimetry and shielding calculations, etc. The product is particularly suitable for environmental radioprotection and radioecology. (authors)

  4. Noncanonical Reactions of Flavoenzymes

    Directory of Open Access Journals (Sweden)

    Pablo Sobrado

    2012-11-01

    Full Text Available Enzymes containing flavin cofactors are predominantly involved in redox reactions in numerous cellular processes, where the protein environment modulates the chemical reactivity of the flavin to transfer either one or two electrons. Some flavoenzymes catalyze reactions with no net redox change. In these reactions, the protein environment modulates the reactivity of the flavin to perform novel chemistries. Recent mechanistic and structural data supporting novel flavin functionalities in reactions catalyzed by chorismate synthase, type II isopentenyl diphosphate isomerase, UDP-galactopyranose mutase, and alkyl-dihydroxyacetonephosphate synthase are presented in this review. In these enzymes, the flavin acts either directly in acid/base reactions or as a nucleophile or electrophile. In addition, the flavin cofactor is proposed to function as a “molecular scaffold” in the formation of UDP-galactofuranose and alkyl-dihydroxyacetonephosphate by forming a covalent adduct with reaction intermediates.

  5. CLINICAL SURFACES - Activity-Based Computing for Distributed Multi-Display Environments in Hospitals

    Science.gov (United States)

    Bardram, Jakob E.; Bunde-Pedersen, Jonathan; Doryab, Afsaneh; Sørensen, Steffen

    A multi-display environment (MDE) is made up of co-located and networked personal and public devices that form an integrated workspace enabling co-located group work. Traditionally, however, MDEs have mainly been designed to support a single “smart room”, and have had little sense of the tasks and activities that the MDE is being used for. This paper presents a novel approach to supporting activity-based computing in distributed MDEs, where displays are physically distributed across a large building. CLINICAL SURFACES was designed for clinical work in hospitals, and enables context-sensitive retrieval and browsing of patient data on public displays. We present the design and implementation of CLINICAL SURFACES, and report on an evaluation of the system at a large hospital. The evaluation shows that using distributed public displays to support activity-based computing inside a hospital is very useful for clinical work, and that the apparent contradiction between maintaining the privacy of medical data and showing it in a public display environment can be mitigated by the use of CLINICAL SURFACES.

  6. The Integrated Computational Environment for Airbreathing Hypersonic Flight Vehicle Modeling and Design Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated computational environment for multidisciplinary, physics-based simulation and analyses of airbreathing hypersonic flight vehicles will be developed....

  7. Experimental technique for the study of three-particle reactions in kinematically complete experiments using a two-processor complex based on the M-400 computer

    International Nuclear Information System (INIS)

    Berezin, F.N.; Kisurin, V.A.; Nemets, O.F.; Ofengenden, R.G.; Pugach, V.M.; Pavlenko, Yu.N.; Patlan', Yu.V.; Savrasov, S.S.

    1981-01-01

    An experimental technique for the investigation of three-particle nuclear reactions in kinematically complete experiments is described. The technique provides the storage of one-dimensional and two-dimensional energy spectra from several detectors. A block diagram of the measuring system using this technique is presented. The measuring system consists of analog equipment for fast-slow coincidences and of a two-processor complex based on the M-400 computer with a common bus. The application of a two-processor complex, in which each computer can directly access the memory of the other, makes it possible to separate the functions of data collection and operational data presentation and to perform the necessary physical calculations. The software of the measuring complex, which includes programs written in ASSEMBLER for the first computer and functional programs written in BASIC for the second computer, is considered. The software of the first computer includes the DISPETCHER dialog control program, a driver package for the control of external devices, an applied program package and system modules. The technique described was tested in an experiment investigating the d + ¹⁰B → α + α + α three-particle reaction at a deuteron energy of 13.6 MeV. The two-dimensional energy spectrum of the reaction obtained with the help of the described technique is presented.

  8. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven to be a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically, based on first principles. Such models are increasingly considered a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel, computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparison against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms and ECGs at the body surface with high fidelity, and offers computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies that attempt to personalize electrophysiological model features.

  9. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    Science.gov (United States)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, on March 18 and 19, 2003, and was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to (a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and (b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview presentations and three exhibits by vendors.

  10. Reaction Decoder Tool (RDT): extracting features from chemical reactions.

    Science.gov (United States)

    Rahman, Syed Asad; Torrance, Gilliean; Baldacci, Lorenzo; Martínez Cuesta, Sergio; Fenninger, Franz; Gopal, Nimish; Choudhary, Saket; May, John W; Holliday, Gemma L; Steinbeck, Christoph; Thornton, Janet M

    2016-07-01

    Extracting chemical features like Atom-Atom Mapping (AAM), Bond Changes (BCs) and Reaction Centres from biochemical reactions helps us understand the chemical composition of enzymatic reactions. Reaction Decoder is a robust command-line tool which performs this task with high accuracy. It supports standard chemical input/output exchange formats, i.e., RXN/SMILES; computes AAM; highlights BCs; and creates images of the mapped reaction. This aids the analysis of metabolic pathways and enables comparative studies of chemical reactions based on these features. The software is implemented in Java, supported on Windows, Linux and Mac OSX, and freely available at https://github.com/asad/ReactionDecoder. Contact: asad@ebi.ac.uk or s9asad@gmail.com. © The Author 2016. Published by Oxford University Press.
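    The reaction SMILES input mentioned above encodes a reaction as 'reactants>agents>products', with '.'-separated molecules in each part; a toy parser illustrating only the layout (this does no chemistry and is not RDT's own implementation):

```python
def parse_reaction_smiles(rxn):
    """Split a reaction SMILES into reactant, agent and product lists.

    Toy illustration of the 'reactants>agents>products' layout only;
    no atom-atom mapping is performed here.
    """
    reactants, agents, products = rxn.split(">")
    part = lambda s: s.split(".") if s else []
    return part(reactants), part(agents), part(products)
```

    For example, parse_reaction_smiles("CCO.O=O>>CC=O.O") yields (['CCO', 'O=O'], [], ['CC=O', 'O']): two reactants, no agents, two products.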

  11. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

    The size of high-throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications have started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay-as-you-go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model, and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability, with competitive assembly quality, compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure instead of a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of the traditional HPC cluster.
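    The de Bruijn graph at the core of such assemblers links each k-mer's (k−1)-length prefix to its suffix; a sequential toy of the structure that GiGA distributes across Giraph workers:

```python
from collections import defaultdict

def de_bruijn_graph(reads, k=3):
    """Collect edges prefix -> suffix for every k-mer in the reads.

    A sequential toy; a distributed assembler partitions these edges
    across compute nodes and walks them to reconstruct contigs.
    """
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return dict(graph)
```

    A single read "ACGT" with k = 3 produces the two edges AC→CG and CG→GT.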

  12. Specialized Computer Systems for Environment Visualization

    Science.gov (United States)

    Al-Oraiqat, Anas M.; Bashkov, Evgeniy A.; Zori, Sergii A.

    2018-06-01

    The need for real-time image generation of landscapes arises in various fields as part of the tasks solved by virtual and augmented reality systems, as well as geographic information systems. Such systems provide opportunities for collecting, storing, analyzing and graphically visualizing geographic data. Algorithmic and hardware/software tools for increasing the realism and efficiency of environment visualization in 3D visualization systems are proposed. This paper discusses a modified path tracing algorithm with a two-level hierarchy of bounding volumes that finds intersections with axis-aligned bounding boxes. The proposed algorithm eliminates branching and hence is more suitable for implementation on multi-threaded CPUs and GPUs. A modified ROAM algorithm is used to solve the problem of high-quality visualization of reliefs and landscapes. The algorithm is implemented on parallel systems: clusters and Compute Unified Device Architecture networks. Results show that the implementation on MPI clusters is more efficient than on Graphics Processing Units/Graphics Processing Clusters and allows real-time synthesis. The organization and algorithms of a parallel GPU system for 3D pseudo-stereo image/video synthesis are proposed. The possibility of realizing each stage of 3D pseudo-stereo synthesis on a parallel GPU architecture is analyzed, and the synthesis is performed accordingly. An experimental prototype of a specialized hardware-software system for 3D pseudo-stereo imaging and video was developed on the CPU/GPU. The experimental results show that the proposed adaptation of 3D pseudo-stereo imaging to the architecture of GPU systems is efficient. It also accelerates the computational procedures of 3D pseudo-stereo synthesis for the anaglyph and anamorphic formats of the 3D stereo frame without performing optimization procedures. The acceleration is on average 11 and 54 times for the test GPUs.
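    The branch elimination claimed for the intersection test comes from replacing per-axis if/else logic with min/max in the classic slab method; a hypothetical sketch of that test (not the paper's implementation):

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: clip the ray against each axis-aligned slab and keep
    the running [tmin, tmax] overlap.

    min/max replace branching, which is what makes the test friendly to
    SIMD/GPU execution.  inv_dir holds the precomputed reciprocals 1/d
    per axis (assumed nonzero here).
    """
    tmin, tmax = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t1, t2 = (lo - o) * inv, (hi - o) * inv
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax
```

    Precomputing inv_dir once per ray also removes three divisions from every box test along the bounding-volume hierarchy traversal.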

  13. EQ6, a computer program for reaction path modeling of aqueous geochemical systems: Theoretical manual, user's guide, and related documentation (Version 7.0); Part 4

    Energy Technology Data Exchange (ETDEWEB)

    Wolery, T.J.; Daveler, S.A.

    1992-10-09

    EQ6 is a FORTRAN computer program in the EQ3/6 software package (Wolery, 1979). It calculates reaction paths (chemical evolution) in reacting water-rock and water-rock-waste systems. Speciation in aqueous solution is an integral part of these calculations. EQ6 computes models of titration processes (including fluid mixing), irreversible reaction in closed systems, irreversible reaction in some simple kinds of open systems, and heating or cooling processes, as well as solving "single-point" thermodynamic equilibrium problems. A reaction path calculation normally involves a sequence of thermodynamic equilibrium calculations. Chemical evolution is driven by a set of irreversible reactions (i.e., reactions out of equilibrium) and/or changes in temperature and/or pressure. These irreversible reactions usually represent the dissolution or precipitation of minerals or other solids. The code computes the appearance and disappearance of phases in solubility equilibrium with the water and finds the identities of these phases automatically. The user may specify which potential phases are allowed to form and which are not. There is an option to fix the fugacities of specified gas species, simulating contact with a large external reservoir. Rate laws for irreversible reactions may be either relative rates or actual rates. If any actual rates are used, the calculation has a time frame. Several forms of actual rate laws are programmed into the code. EQ6 is presently able to model both mineral dissolution and growth kinetics.
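    The automatic detection of appearing and disappearing phases rests on comparing the ion activity product Q with the equilibrium constant K; a minimal saturation-index sketch with illustrative values (not EQ6's thermodynamic database):

```python
import math

def saturation_index(activities, log_k):
    """SI = log10(Q) - log10(K), Q being the ion activity product.

    activities maps species -> (activity, stoichiometric coefficient)
    for the dissolution reaction; SI > 0 flags supersaturation, i.e. a
    phase the code would allow to precipitate.
    """
    log_q = sum(nu * math.log10(a) for a, nu in activities.values())
    return log_q - log_k

# Calcite, CaCO3 = Ca(2+) + CO3(2-); log K = -8.48 is illustrative.
si = saturation_index({"Ca++": (1e-4, 1), "CO3--": (1e-5, 1)}, -8.48)
```

    Here log Q = −9, so SI = −0.52 and the water is undersaturated: calcite would dissolve rather than precipitate along the reaction path.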

  14. Efficient Computational Research Protocol to Survey Free Energy Surface for Solution Chemical Reaction in the QM/MM Framework: The FEG-ER Methodology and Its Application to Isomerization Reaction of Glycine in Aqueous Solution.

    Science.gov (United States)

    Takenaka, Norio; Kitamura, Yukichi; Nagaoka, Masataka

    2016-03-03

    In solution chemical reactions, we often need to consider a multidimensional free energy (FE) surface (FES), which is analogous to a Born-Oppenheimer potential energy surface. To survey the FES, an efficient computational research protocol is proposed within the QM/MM framework: (i) we first obtain some stable states (or transition states) involved, by optimizing their structures on the FES in a stepwise fashion, finally using the free energy gradient (FEG) method, and then (ii) we directly obtain the FE differences among arbitrary states on the FES, efficiently, by employing the QM/MM method with energy representation (ER), i.e., the QM/MM-ER method. To validate the calculation accuracy and efficiency, we applied the above FEG-ER methodology to a typical isomerization reaction of glycine in aqueous solution, and reproduced the experimental value of the reaction FE quite satisfactorily. Further, it was found that the structural relaxation of the solute in the QM/MM force field is not negligible for a correct estimate of the FES. We believe that the present research protocol should become a prevailing computational strategy and will play a promising and important role in solution chemistry toward solution reaction ergodography.
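    The FE differences in step (ii) come from exponential averages over sampled configurations; as a generic stand-in for that idea, the Zwanzig free energy perturbation relation ΔF = −β⁻¹ ln⟨exp(−βΔU)⟩₀ can be sketched as follows (a toy estimator, not the QM/MM-ER method itself):

```python
import math

def fep_delta_f(delta_u_samples, beta=1.0):
    """Zwanzig relation: Delta F = -(1/beta) * ln < exp(-beta*dU) >_0.

    delta_u_samples are energy differences evaluated on configurations
    sampled in the reference state; beta = 1/(kB*T).  Illustrative only.
    """
    n = len(delta_u_samples)
    avg = sum(math.exp(-beta * du) for du in delta_u_samples) / n
    return -math.log(avg) / beta
```

    With a constant energy gap the estimator returns that gap exactly; with fluctuating samples it weights low-ΔU configurations exponentially more, which is why adequate sampling of the reference state matters.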

  15. Attitudes and gender differences of high school seniors within one-to-one computing environments in South Dakota

    Science.gov (United States)

    Nelson, Mathew

    In today's age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to the gender gap. This study utilized a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and the environmental factors of exposure, personal interests, and parental influence that impact gender differences among high school students within a one-to-one computing environment in South Dakota. The population that completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The data from the survey were analyzed using descriptive and inferential statistics for the determined variables. Several conclusions were drawn from the literature review and data analysis. Among them are that, overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females utilized computers similarly, but males spent more time using their computers to play online games. Early exposure to computers, or the age at which the student was first exposed to a computer, and the number of computers present in the home (computer ownership) impacted computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study found that both mothers and fathers encouraged their male children more than their female children to work with computing and to pursue careers in computer science fields.

  16. Calculation of reaction energies and adiabatic temperatures for waste tank reactions

    International Nuclear Information System (INIS)

    Burger, L.L.

    1993-03-01

    Continual concern has been expressed over potentially hazardous exothermic reactions that might occur in underground Hanford waste tanks. These tanks contain many different oxidizable compounds covering a wide range of concentrations. Several may be present in concentrations and quantities great enough to be considered a hazard, in that they could undergo rapid and energetic chemical reactions with the nitrate and nitrite salts that are present. The tanks also contain many inorganic compounds inert to oxidation. In this report the energy that may be released when various organic and inorganic compounds react is computed as a function of the reaction mix composition and the temperature. The enthalpy, or integrated heat capacity, of these compounds and various reaction products is presented as a function of temperature; the enthalpy of a given mixture can then be equated to the energy release from various reactions to predict the maximum temperature that may be reached. This is estimated for several different compositions. Alternatively, the amounts of various diluents required to prevent the temperature from reaching a critical value can be estimated
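    Equating the released reaction energy to the mixture enthalpy gives the maximum temperature directly; a sketch assuming a constant heat capacity and no phase changes (the report itself uses temperature-dependent enthalpies):

```python
def adiabatic_temperature(q_released, mass, cp, t0=298.15):
    """Solve q = m * cp * (T_ad - T0) for the adiabatic temperature.

    q_released in kJ, mass in kg, cp in kJ/(kg*K); constant cp is a
    simplification of the report's enthalpy-based method.
    """
    return t0 + q_released / (mass * cp)

def diluent_mass(q_released, cp_diluent, t_max, t0=298.15):
    """Inert diluent mass needed to absorb q and cap the rise at t_max."""
    return q_released / (cp_diluent * (t_max - t0))
```

    The second function is the inverse use named in the abstract: fixing the critical temperature and solving for the diluent quantity instead.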

  17. Computing the cross sections of nuclear reactions with nuclear clusters emission for proton energies between 30 MeV and 2.6 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Korovin, Yu. A.; Maksimushkina, A. V., E-mail: AVMaksimushkina@mephi.ru; Frolova, T. A. [Obninsk Institute for Nuclear Power Engineering, National Research Nuclear University MEPhI (Moscow Engineering Physics Institute) (Russian Federation)

    2016-12-15

    The cross sections of nuclear reactions involving emission of clusters of light nuclei in proton collisions with a heavy-metal target are computed for incident-proton energies between 30 MeV and 2.6 GeV. The calculation relies on the ALICE/ASH and CASCADE/INPE computer codes. The parameters determining the pre-equilibrium cluster emission are varied in the computation.

  18. Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis

    NARCIS (Netherlands)

    van der Kleij, Fabienne; Feskens, Remco C.W.; Eggen, Theodorus Johannes Hendrikus Maria

    2015-01-01

    In this meta-analysis, we investigated the effects of methods for providing item-based feedback in a computer-based environment on students’ learning outcomes. From 40 studies, 70 effect sizes were computed, which ranged from −0.78 to 2.29. A mixed model was used for the data analysis. The results
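    The effect sizes aggregated in such a meta-analysis are standardized mean differences; a sketch of the usual pooled-SD form (Cohen's d) with hypothetical group statistics, not values from the 40 reviewed studies:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using the pooled standard deviation.

    mean/sd/n for the treatment (t) and control (c) groups; dividing the
    raw mean difference by the pooled SD puts studies with different
    outcome scales on one common scale.
    """
    pooled = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                       / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled
```

    For example, two groups of 30 with means 12 and 10 and a common SD of 2 give d = 1.0, a large effect on the conventional scale.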

  19. ReputationPro: The Efficient Approaches to Contextual Transaction Trust Computation in E-Commerce Environments

    OpenAIRE

    Zhang, Haibin; Wang, Yan; Zhang, Xiuzhen; Lim, Ee-Peng

    2013-01-01

    In e-commerce environments, the trustworthiness of a seller is critically important to potential buyers, especially when the seller is unknown to them. Most existing trust evaluation models compute a single value to reflect the general trust level of a seller without taking any transaction context information into account. In this paper, we first present a trust vector consisting of three values for Contextual Transaction Trust (CTT). In the computation of the three CTT values, the identified three ...

  20. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    Science.gov (United States)

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  1. Introduction to chemical reaction engineering

    International Nuclear Information System (INIS)

    Kim, Yeong Geol

    1990-10-01

    This book deals with chemical reaction engineering in thirteen chapters. The contents are: an introduction to reaction engineering, chemical kinetics, thermodynamics and chemical reactions, the abnormal reactor, the non-isothermal reactor, the nonideal reactor, catalysis in nonuniform systems, diffusion and reaction in porous catalysts, design of heterogeneous solid-bed catalytic reactors, high-polymer polymerization, bioreaction engineering, reaction engineering in materials processing, and control of multivariable reactor processes using digital computers.

  2. Reactions of cisplatin with cysteine and methionine at constant pH; a computational study.

    Science.gov (United States)

    Zimmermann, Tomás; Burda, Jaroslav V

    2010-02-07

    Interactions of the hydrated cisplatin complexes cis-[Pt(NH₃)₂Cl(H₂O)]⁺ and cis-[Pt(NH₃)₂(OH)(H₂O)]⁺ with cysteine and methionine in aqueous solution at constant pH were explored using computational methods. Thermodynamic parameters of the considered reactions were studied over a broad pH range, taking up to four protonation states of each molecule into account. Reaction free energies at constant pH were obtained from standard Gibbs free energies using the Legendre transformation. Solvation free energies and pKa values were calculated using the PCM model with UAHF cavities, recently adapted by us for transition metal complexes. The root mean square error of the pKa values on a set of model platinum complexes and amino acids was 0.74. At pH 7, the transformed Gibbs free energies differ by up to 15 kcal mol⁻¹ from the Gibbs free energies of model reactions with a constant number of protons. As for cysteine, the calculations confirmed a strong preference for κS monodentate bonding over a broad pH range. The most stable product of the second reaction step, which proceeds from the monodentate to the chelate complex, is the κ²S,N-coordinated chelate. The reaction with methionine is more complex. In the first step, all three considered methionine donor atoms (N, S and O) yield thermodynamically preferred products depending on the platinum complex and the pH. This is in accordance with the experimental observation of a pH-dependent migration between N and S donor atoms in a chemically related system. The most stable chelates of platinum with methionine are the κ²S,N- and κ²N,O-bonded complexes. The comparison of the reaction free energies of both amino acids suggests that the bidentate methionine ligand can be displaced even by the monodentate cysteine ligand under certain conditions.
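    One common form of the Legendre transformation used here adds RT ln 10 × pH per proton released to the bath; a sketch of just that correction (illustrative, not the paper's full multi-protonation-state treatment):

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol*K)

def transformed_dg(dg_standard, n_protons_released, ph, temperature=298.15):
    """Constant-pH (Legendre-transformed) reaction free energy.

    dG' = dG0 + n_H * R*T*ln(10) * pH, with n_H the number of protons
    released; a sketch of the pH correction term only.
    """
    return dg_standard + n_protons_released * R * temperature * math.log(10) * ph
```

    At 298.15 K and pH 7 the per-proton correction is RT ln 10 × 7 ≈ 40 kJ/mol (about 9.5 kcal/mol), the same order as the up-to-15 kcal/mol shifts reported above.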

  3. Scheduling Method of Data-Intensive Applications in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Xiong Fu

    2015-01-01

    Full Text Available The virtualization of cloud computing improves the utilization of resources and energy, and a cloud user can deploy his/her own applications and related data on a pay-as-you-go basis. The communications between an application and a data storage node, as well as within the application, have a great impact on the execution efficiency of the application. The locations of the subtasks of an application, and of the data transferred between the subtasks, are the main reason why communication delay exists. This communication delay can affect the completion time of the application. In this paper, we take into account the data transmission time and the communications between subtasks and propose a heuristic optimal virtual machine (VM) placement algorithm. Related simulations demonstrate that this algorithm can reduce the completion time of user tasks and ensure the feasibility and effectiveness of the overall network performance of applications running in a cloud computing environment.
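    A toy greedy heuristic in the same spirit co-locates heavily communicating subtasks on one host (hypothetical sketch, not the paper's algorithm):

```python
def greedy_place(tasks, traffic, hosts, capacity):
    """Place each task on the host with the most traffic to its
    already-placed neighbors, under a per-host slot capacity.

    traffic[(a, b)] is the data volume exchanged by tasks a and b; a toy
    communication-aware VM placement heuristic.
    """
    placement = {}
    load = {h: 0 for h in hosts}
    for t in tasks:
        best, best_score = None, -1.0
        for h in hosts:
            if load[h] >= capacity:
                continue  # host has no free VM slots
            score = 0.0
            for (a, b), vol in traffic.items():
                other = b if a == t else a if b == t else None
                if other is not None and placement.get(other) == h:
                    score += vol
            if score > best_score:
                best, best_score = h, score
        placement[t] = best
        load[best] += 1
    return placement
```

    With traffic {("t1","t2"): 10, ("t1","t3"): 1} and two 2-slot hosts, t1 and t2 land together and t3 overflows to the other host, so the dominant communication stays local.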

  4. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, the mechanism of cloud computing is a promising solution; the most popular option is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can be successfully used to infer networks with the desired behaviors, and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high
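A toy version of the hybrid GA-PSO loop might look like the sketch below, with the fitness evaluation written as a map() so it could be farmed out to workers (MapReduce in the paper's setting). The operator choices, the constants, and the stand-in sphere objective are our assumptions, not the paper's actual configuration, which fits gene-network parameters.

```python
import random

def sphere(x):  # stand-in fitness; the paper fits gene-network parameters
    return sum(v * v for v in x)

def ga_pso(dim=5, pop=20, iters=100, seed=1):
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    vs = [[0.0] * dim for _ in range(pop)]
    best = min(xs, key=sphere)[:]
    for _ in range(iters):
        fit = list(map(sphere, xs))          # the parallelizable "map" step
        order = sorted(range(pop), key=fit.__getitem__)
        if fit[order[0]] < sphere(best):
            best = xs[order[0]][:]
        elite, rest = order[:pop // 2], order[pop // 2:]
        for i in elite:                      # PSO velocity update on the elite
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * rng.random() * (best[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
        for i in rest:                       # GA crossover/mutation on the rest
            p1, p2 = rng.sample(elite, 2)
            xs[i] = [xs[p1][d] if rng.random() < 0.5 else xs[p2][d]
                     for d in range(dim)]
            if rng.random() < 0.2:
                xs[i][rng.randrange(dim)] += rng.gauss(0.0, 0.5)
    return best

solution = ga_pso()
```

On this stand-in objective the search contracts toward the optimum; in the paper's setting the same loop structure is distributed so that the fitness map runs on cloud workers.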

  5. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Networking (SDN) is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE) that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain a thorough understanding of service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies, and is thus applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.
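The idea of modeling networking and computing as one composite service can be caricatured with the simplest possible instance, two M/M/1 stages in tandem. The paper's model is more general and technology-agnostic; the rates below are invented for illustration.

```python
def mm1_delay(arrival, service):
    """Mean sojourn time of an M/M/1 queue: 1/(mu - lambda)."""
    if arrival >= service:
        raise ValueError("unstable queue: arrival rate >= service rate")
    return 1.0 / (service - arrival)

def composite_delay(arrival, net_rate, compute_rate):
    """Network stage followed by compute stage, traversed by every request."""
    return mm1_delay(arrival, net_rate) + mm1_delay(arrival, compute_rate)

# A fast network cannot rescue an overloaded compute stage, and vice versa:
print(round(composite_delay(arrival=8.0, net_rate=20.0, compute_rate=10.0), 3))
```

Even this toy shows why end-to-end guarantees require coordinating both resource types: the slower stage dominates the composite response time.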

  6. A Computer Model for Analyzing Volatile Removal Assembly

    Science.gov (United States)

    Guo, Boyun

    2010-01-01

    A computer model simulates reacting gas/liquid two-phase flow processes in porous media. A typical process is the oxygen/wastewater flow in the Volatile Removal Assembly (VRA) in the Closed Environment Life Support System (CELSS) installed in the International Space Station (ISS). The volatile organics in the wastewater are combusted by oxygen gas to form clean water and carbon dioxide, which dissolves in the water phase. The model predicts the oxygen gas concentration profile in the reactor, which is an indicator of reactor performance. In this innovation, a mathematical model is included in the computer model for calculating the mass transfer from the gas phase to the liquid phase. The amount of mass transfer depends on several factors, including gas-phase concentration, distribution, and reaction rate. For a given reactor dimension, these factors depend on the pressure and temperature in the reactor and the composition and flow rate of the influent.
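The gas-to-liquid mass transfer term such a model needs can be sketched with a textbook two-film closure; the coefficient values below are invented for illustration and are not taken from the VRA model.

```python
def o2_flux(p_gas, henry, k_l, a, c_liquid):
    """Volumetric transfer rate k_L*a*(C* - C), with Henry's law C* = p/H.

    p_gas: oxygen partial pressure; henry: Henry constant; k_l: liquid-film
    coefficient; a: interfacial area per volume; c_liquid: dissolved O2.
    All coefficient values used below are made-up illustrative numbers.
    """
    c_star = p_gas / henry
    return k_l * a * (c_star - c_liquid)

# The driving force vanishes as the liquid saturates:
sat = o2_flux(p_gas=21.0, henry=76.0, k_l=4e-4, a=200.0, c_liquid=21.0 / 76.0)
print(sat == 0.0)
```

Coupling such a closure to the local gas-phase concentration is what lets the full model predict the oxygen profile along the reactor.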

  7. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Science.gov (United States)

    Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon

    2013-01-01

    Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906

  8. Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Weizhe Zhang

    2013-01-01

    Full Text Available Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved.

  10. Student Advising and Retention Application in Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2016-11-01

    Full Text Available This paper proposes a new user-friendly application enhancing and expanding the current advising services of Gradesfirst, currently used for advising and retention by the Athletic Department of UMES, with a view to implementing new performance activities such as mentoring, tutoring, scheduling, and study hall hours into the existing tools. This application includes various measurements that can be used to monitor and improve the performance of the students in the Athletic Department of UMES by monitoring students' weekly study hall hours and tutoring schedules. It also supervises tutors' login and logout activities in order to monitor their effectiveness, supervises tutor-tutee interaction, and stores and analyzes the overall academic progress of each student. A dedicated server for providing services will be developed at the local site. The work has been implemented in three steps. The first step involves the creation of an independent cloud computing environment that provides resources such as database creation, query-based statistical data, performance measure activities, and automated support of performance measures such as advising, mentoring, monitoring and tutoring. The second step involves the creation of an application known as Student Advising and Retention (SAR) in this cloud computing environment. This application has been designed to be a comprehensive database management system which contains relevant data regarding student academic development and supports various strategic advising and monitoring of students. The third step involves the creation of systematic advising charts and frameworks which help advisors. The paper shows ways of creating the most appropriate advising technique based on the student's academic needs. The proposed application runs in a Windows-based system. As stated above, the proposed application is expected to enhance and expand the current advising service of the Gradesfirst tool. A brief

  11. Reaction path analysis of sodium-water reaction phenomena in support of chemical reaction model development

    International Nuclear Information System (INIS)

    Kikuchi, Shin; Ohshima, Hiroyuki; Hashimoto, Kenro

    2011-01-01

    Computational study of the sodium-water reaction at the gas (water) - liquid (sodium) interface has been carried out using an ab initio (first-principles) method. A possible reaction channel has been identified for the stepwise OH bond dissociations of a single water molecule. The energetics, including the binding energy of a water molecule to the sodium surface, the activation energies of the bond cleavages, and the reaction energies, have been evaluated, and the rate constants of the first and second OH bond-breakings have been compared. The results are used as the basis for constructing the chemical reaction model used in a multi-dimensional sodium-water reaction code, SERAPHIM, being developed by JAEA toward the safety assessment of the steam generator (SG) in a sodium-cooled fast reactor (SFR). (author)

  12. Modeling of chemical reactions in micelle: water-mediated keto-enol interconversion as a case study.

    Science.gov (United States)

    Marracino, Paolo; Amadei, Andrea; Apollonio, Francesca; d'Inzeo, Guglielmo; Liberti, Micaela; di Crescenzo, Antonello; Fontana, Antonella; Zappacosta, Romina; Aschi, Massimiliano

    2011-06-30

    The effect of a zwitterionic micelle environment on the efficiency of the keto-enol interconversion of 2-phenylacetylthiophene has been investigated by means of a joint application of experimental and theoretical/computational approaches. Results have revealed a reduction of the reaction rate constant if compared with bulk water essentially because of the different solvation conditions experienced by the reactant species, including water molecules, in the micelle environment. The slight inhibiting effect due to the application of a static electric field has also been theoretically investigated and presented.

  13. Ab initio computational study of -N-C and -O-C bond formation: functional group modification reactions of chitosan

    Science.gov (United States)

    Siahaan, P.; Salimah, S. N. M.; Sipangkar, M. J.; Hudiyanti, D.; Djunaidi, M. C.; Laksitorini, M. D.

    2018-04-01

    Chitosan application in the pharmaceutical and cosmeceutical industries is limited by its poor solubility. Modification of the -NH2 and -OH functional groups of chitosan by adding a carboxyl group has been shown to improve its solubility and applicability. Attempts to synthesize carboxymethyl chitosan (CMC) from monochloroacetic acid (MCAA) were made prior to this report. However, no information is available on whether -OH (-O-C bond formation) or -NH2 (-N-C bond formation) is the preferred attachment site for -CH2COOH. In the current study, the reaction mechanism converting the chitosan and MCAA reactants into carboxymethyl chitosan (CMC) was examined by a computational approach, using a chitosan dimer as the molecular model. All molecular structures involved in the reaction mechanism were optimized by ab initio computation at the HF/6-31G(d,p) level of theory. The results showed that -N-C bond formation via SN2 is kinetically preferred over -O-C bond formation via SN2, with activation energies of 469.437 kJ/mol and 533.219 kJ/mol, respectively. However, -O-C bond formation is more spontaneous than -N-C bond formation, because the ΔG of the O-CMC-2 formation reaction is more negative than that of the N-CMC-2 formation reaction (-4.353 kJ/mol versus -1.095 kJ/mol, respectively). The synthesis of N,O-CMC thus first forms -O-CH2COOH and then continues to form -NH-CH2COOH. This information is valuable for further optimizing the reaction conditions for CMC synthesis.
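The kinetic side of this comparison follows directly from the two activation energies via an Arrhenius estimate. Assuming equal pre-exponential factors (our simplification, not stated in the paper), the rate ratio at room temperature can be computed in a couple of lines:

```python
import math

R = 8.314e-3  # gas constant in kJ/(mol*K)

def rate_ratio(ea_low, ea_high, temp=298.15):
    """Arrhenius estimate of k_low/k_high for two barriers in kJ/mol,
    assuming equal pre-exponential factors (an illustrative assumption)."""
    return math.exp((ea_high - ea_low) / (R * temp))

# Barriers from the abstract: 469.437 kJ/mol (-N-C) vs 533.219 kJ/mol (-O-C).
# The lower barrier wins kinetically by many orders of magnitude:
print(rate_ratio(469.437, 533.219) > 1e10)
```

This illustrates why the kinetic preference (-N-C) and the thermodynamic preference (-O-C, from the more negative ΔG) can point in opposite directions.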

  14. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    Science.gov (United States)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library Systems (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.
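A minimal sketch of vector-product (cosine) ranking, the technique at the core of systems like NELS, is shown below. None of this code is from NELS itself; real systems also add term weighting (e.g. tf-idf) and stemming.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(query, docs):
    """Return docs ordered by decreasing cosine similarity to the query."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    return [d for s, d in sorted(scored, reverse=True)]

docs = ["orbit transfer maneuver", "library catalog search", "orbit insertion"]
print(rank("orbit transfer", docs)[0])
```

The "ranked list of the most likely relevant objects" in the abstract is exactly this ordering, computed over the full collection.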

  15. The Plant-Window system: A flexible, expandable computing environment for the integration of power plant activities

    International Nuclear Information System (INIS)

    Wood, R.T.; Mullens, J.A.; Naser, J.A.

    1994-01-01

    Power plant data, and the information that can be derived from it, provide the link to the plant through which the operations, maintenance and engineering staff understand and manage plant performance. The increasing use of computer technology in the US nuclear power industry has greatly expanded the capability to obtain, analyze, and present data about the plant to station personnel. However, it is necessary to transform the vast quantity of available data into clear, concise, and coherent information that can be readily accessed and used throughout the plant. This need can be met by an integrated computer workstation environment that provides the necessary information and software applications to the proper users throughout the plant, in a manner that can be easily understood and used. As part of a Cooperative Research and Development Agreement with the Electric Power Research Institute, the Oak Ridge National Laboratory has developed functional requirements for a Plant-Wide Integrated Environment Distributed on Workstations (Plant-Window) System. The Plant-Window System (PWS) can serve the needs of operations, engineering, and maintenance personnel at nuclear power stations by providing integrated data and software applications (e.g., monitoring, analysis, diagnosis, and control applications) within a common environment. The PWS requirements identify functional capabilities and provide guidelines for standardized hardware, software, and display interfaces to define a flexible computer environment that permits a tailored implementation of workstation capabilities and facilitates future upgrades.

  16. Improving science and mathematics education with computational modelling in interactive engagement environments

    Science.gov (United States)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  17. Teaching organic reactions using computational chemistry: I. Electrophilic addition reactions to alkenes [O ensino de reações orgânicas usando química computacional: I. reações de adição eletrofílica a alquenos]

    Directory of Open Access Journals (Sweden)

    Arquimedes Mariano

    2008-01-01

    Full Text Available Basic concepts that play an important role in some organic reactions are revisited in this paper, which reports a pedagogical experience involving undergraduate and graduate students. A systematic procedure has been applied in order to use widely available computational tools. This paper aims to discuss the use of computers in teaching electrophilic addition reactions to alkenes. Two classical examples have been investigated: addition to non-conjugated alkenes and addition to conjugated dienes. The results were compared with those normally discussed in organic textbooks. Several important concepts, such as conformational analysis and energy control (kinetic and thermodynamic) involved in reaction mechanisms, can be taught more efficiently if one connects theoretical and practical tools.

  18. Investigations on an environment friendly chemical reaction process (eco-chemistry). 2; Kankyo ni yasashii kagaku hanno process (eko chemistry) ni kansuru chosa. 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    In order to structure a chemical reaction process that does not discharge large amounts of waste by-products or harmful chemical substances (a so-called environmentally friendly process), investigations and discussions were carried out based on the results derived in the previous fiscal year. A proposal was made to reduce environmental load through the development of oxidation and dehydrogenation catalysts that can selectively produce ethylene, propylene and isobutylene in an oxidation process. In liquid-phase oxidation, redox-based oxidation and solid catalyzation of the autoxidation reaction were enumerated. For acid-base catalyst reactions, the development of ultra-strong solid acids was described as a route to a process that discharges no pollution. In the fine chemical and pharmaceutical fields, the optically active substance method and the position-selective aromatic substitution reaction were evaluated as ways to reduce environmental load. A questionnaire survey of major chemical corporations inside and outside the country revealed the following processes as ones that can cause hidden environmental problems: processes discharging large amounts of waste, processes handling dangerous materials, and processes consuming large amounts of energy. The development of catalysts that can realize high yield, high selectivity and reactions under mild conditions is important for a future environment-harmonizing chemical process. 117 refs., 23 figs., 22 tabs.

  19. Computational Analyses of Complex Flows with Chemical Reactions

    Science.gov (United States)

    Bae, Kang-Sik

    Micro-scale heat and mass transfer phenomena have been studied numerically for three problems: drug mass transfer in a cylindrical matrix system, the simulation of oxygen/drug diffusion in a three-dimensional capillary network, and reduced chemical kinetic modeling of gas turbine combustion for Jet Propellant-10. For the numerical analysis of drug mass transfer in the cylindrical matrix system, the governing equations are derived from the Krogh cylinder model, which comprises a capillary and a surrounding cylinder of tissue extending along the arterial-to-venous distance. The ADI (Alternating Direction Implicit) scheme and the Thomas algorithm are applied to solve the nonlinear partial differential equations (PDEs). This study shows that the important factors affecting the drug penetration depth into the tissue are the mass diffusivity and the consumption of relevant species during the time allowed for diffusion into the brain tissue. Also, a computational fluid dynamics (CFD) model has been developed to simulate blood flow and oxygen/drug diffusion in a three-dimensional capillary network within the physiological range of a typical capillary. A three-dimensional geometry has been constructed to replicate the one studied by Secomb et al. (2000), and the computational framework features a non-Newtonian viscosity model for blood, an oxygen transport model including oxygen-hemoglobin dissociation and wall flux due to tissue absorption, as well as the ability to study the diffusion of drugs and other materials in the capillary streams. Finally, a chemical kinetic mechanism of JP-10 has been compiled and validated for a wide range of combustion regimes, covering pressures of 1 atm to 40 atm with temperatures of 1,200 K to 1,700 K; JP-10 is being studied as a possible jet propellant for the Pulse Detonation Engine (PDE) and other high-speed flight applications such as hypersonic

  20. Encountering the Expertise Reversal Effect with a Computer-Based Environment on Electrical Circuit Analysis

    Science.gov (United States)

    Reisslein, Jana; Atkinson, Robert K.; Seeling, Patrick; Reisslein, Martin

    2006-01-01

    This study examined the effectiveness of a computer-based environment employing three example-based instructional procedures (example-problem, problem-example, and fading) to teach series and parallel electrical circuit analysis to learners classified by two levels of prior knowledge (low and high). Although no differences between the…

  1. Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments

    Directory of Open Access Journals (Sweden)

    Jose M. Moya

    2012-08-01

    Full Text Available Ubiquitous sensor network deployments, such as those found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, it is the supercomputing facilities that present the higher economic and environmental impact, due to their very high power consumption, and this problem has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.

  2. Ubiquitous green computing techniques for high demand applications in Smart environments.

    Science.gov (United States)

    Zapater, Marina; Sanchez, Cesar; Ayala, Jose L; Moya, Jose M; Risco-Martín, José L

    2012-01-01

    Ubiquitous sensor network deployments, such as those found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, it is the supercomputing facilities that present the higher economic and environmental impact, due to their very high power consumption, and this problem has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.

  3. Minimum Energy Pathways for Chemical Reactions

    Science.gov (United States)

    Walch, S. P.; Langhoff, S. R. (Technical Monitor)

    1995-01-01

    Computed potential energy surfaces are often required for computation of such parameters as rate constants as a function of temperature, product branching ratios, and other detailed properties. We have found that computation of the stationary points/reaction pathways using CASSCF/derivative methods, followed by use of the internally contracted CI method to obtain accurate energetics, gives useful results for a number of chemically important systems. The talk will focus on a number of applications to reactions leading to NOx and soot formation in hydrocarbon combustion.
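Turning a computed barrier into a rate constant, the step the abstract alludes to, is commonly done with transition-state theory. The sketch below is a generic Eyring-equation estimate with illustrative barrier heights, not values from the talk:

```python
import math

KB_OVER_H = 2.0837e10  # Boltzmann constant over Planck constant, 1/(s*K)
R = 1.987e-3           # gas constant in kcal/(mol*K)

def eyring(dg_act_kcal, temp=298.15):
    """Transition-state-theory rate constant k = (k_B*T/h)*exp(-dG/RT)."""
    return KB_OVER_H * temp * math.exp(-dg_act_kcal / (R * temp))

# A 15 kcal/mol barrier is crossed more than ten orders of magnitude
# faster than a 30 kcal/mol one at room temperature:
print(eyring(15.0) / eyring(30.0) > 1e10)
```

This exponential sensitivity to the barrier height is why the abstract stresses obtaining accurate energetics (internally contracted CI) on top of the CASSCF-located stationary points.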

  4. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    Energy Technology Data Exchange (ETDEWEB)

    Aderholdt, Ferrol [Tennessee Technological Univ., Cookeville, TN (United States); Caldwell, Blake A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hicks, Susan Elaine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Koch, Scott M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Naughton, III, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pelfrey, Daniel S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pogge, James R [Tennessee Technological Univ., Cookeville, TN (United States); Scott, Stephen L [Tennessee Technological Univ., Cookeville, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sorrillo, Lawrence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves poses significant challenges for the use of shared infrastructure in HPC environments. This report details current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  5. Aeroflex Single Board Computers and Instrument Circuit Cards for Nuclear Environments Measuring and Monitoring

    International Nuclear Information System (INIS)

    Stratton, Sam; Stevenson, Dave; Magnifico, Mateo

    2013-06-01

    A Single Board Computer (SBC) is an entire computer, including all of the required components and I/O interfaces, built on a single circuit board. SBCs are used across numerous industrial, military and space flight applications. In the case of military and space implementations, SBCs employ advanced high-reliability processors designed for rugged thermal, mechanical and even radiation environments. These processors, in turn, rely on equally advanced support components such as memory, interface, and digital logic. When all of these components are put together on a printed circuit card, the result is a highly reliable Single Board Computer that can perform a wide variety of tasks in very harsh environments. In the area of instrumentation, peripheral circuit cards can be developed that directly interface to the SBC and to various radiation measuring devices and systems. Designers use signal conditioning and high-reliability Analog to Digital Converters (ADCs) to convert the measuring device signals to digital data suitable for a microprocessor. The data can then be sent to the SBC via high-speed communication protocols such as Ethernet or a similar type of serial bus. Data received by the SBC can then be manipulated and processed into a form readily available to users. Recent events are causing some in the NPP industry to consider devices and systems with better radiation and temperature performance capability. Systems designed for space application are built for the harsh environment of space, which under certain conditions would be similar to what the electronics will see during a severe nuclear reactor event. The NPP industry should be considering higher-reliability electronics for certain critical applications. (authors)

  6. Development of a computational environment for the General Curvilinear Ocean Model

    International Nuclear Information System (INIS)

    Thomas, Mary P; Castillo, Jose E

    2009-01-01

    The General Curvilinear Ocean Model (GCOM) differs significantly from the traditional approach, where the use of Cartesian coordinates forces the model to simulate terrain as a series of steps. GCOM utilizes a full three-dimensional curvilinear transformation, which has been shown to have greater accuracy than similar models and to achieve results more efficiently. The GCOM model has been validated for several types of water bodies, different coastlines and bottom shapes, including the Alarcon Seamount, the Southern California Coastal Region, the Valencia Lake in Venezuela, and more recently the Monterey Bay. In this paper, enhancements to the GCOM model and an overview of the computational environment (GCOM-CE) are presented. Model improvements include migration from F77 to F90, a move to a component design, and initial steps towards parallelization of the model. Through the use of the component design, new models are being incorporated, including biogeochemical, pollution, and sediment transport models. The computational environment is designed to allow various client interactions via secure Web applications (portal, Web services, and Web 2.0 gadgets). Features include building jobs; managing and interacting with long-running jobs; managing input and output files; quick visualization of results; and publishing of Web services to be used by other systems such as larger climate models. The CE is based mainly on Python tools, including a grid-enabled Pylons Web application framework for Web services, pyWSRF (python-Web Services-Resource Framework), pyGlobus-based web services, SciPy, and Google code tools.

  7. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Science.gov (United States)

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.
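The paper's resource-management module is not specified in this abstract, but its core idea, changing the allocated resources on the fly with a corresponding change in price, can be illustrated with a toy threshold autoscaler. The capacity per VM, utilization thresholds, and price below are invented for the sketch, not taken from the paper.

```python
def autoscale(load_trace, capacity_per_vm=100, low=0.3, high=0.8,
              min_vms=1, price_per_vm=0.10):
    """Toy threshold autoscaler: one scaling decision per time step.
    Returns the VM count after each step and the accumulated cost."""
    vms, history, cost = min_vms, [], 0.0
    for load in load_trace:
        utilization = load / (vms * capacity_per_vm)
        if utilization > high:
            vms += 1                       # scale out under pressure
        elif utilization < low and vms > min_vms:
            vms -= 1                       # scale in when idle
        history.append(vms)
        cost += vms * price_per_vm         # price tracks allocation
    return history, cost

history, cost = autoscale([50, 120, 260, 300, 180, 60, 40])
```

A real module would, as the abstract notes, also have to decide *which* resource dimension (CPU, memory, bandwidth) to scale, which is exactly what the authors' configuration analysis addresses.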

  8. Performance Evaluation of Resource Management in Cloud Computing Environments.

    Directory of Open Access Journals (Sweden)

    Bruno Guazzelli Batista

    Full Text Available Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  9. Study of Propagation Mechanisms in Dynamical Railway Environment to Reduce Computation Time of 3D Ray Tracing Simulator

    Directory of Open Access Journals (Sweden)

    Siham Hairoud

    2013-01-01

    Full Text Available In order to better assess the behaviour of the propagation channel in a confined environment such as a railway tunnel for subway applications, we present an optimization method for a deterministic channel simulator based on 3D ray tracing combined with the laws of geometrical optics and the uniform theory of diffraction. This tool requires a detailed description of the environment, so the complexity of the model is directly tied to the complexity of the environment, and specifically to the number of facets that compose it. In this paper, we propose an algorithm to identify facets that have no significant impact on the wave propagation. Removing them simplifies the geometric description of the modelled environment and thereby reduces the complexity of our model and its computation time. A comparative study between the full and simplified environments is conducted and shows the impact of the proposed method on the characteristic parameters of the propagation channel: the computation time for the simplified environment is six times lower than that of the full model, without significant degradation of simulation accuracy.
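The selection step, dropping facets whose contribution to the received field is negligible before running the full ray tracer, can be sketched as a simple filter. The contribution scores below are placeholders for whatever metric the simulator actually computes (for example, the power carried by test rays hitting each facet); the threshold is likewise an illustrative assumption.

```python
def prune_facets(facets, threshold):
    """Keep only facets whose estimated contribution exceeds the threshold.
    Every discarded facet removes its ray-facet intersection tests from
    the subsequent full simulation."""
    return [f for f in facets if f["contribution"] >= threshold]

facets = [{"id": i, "contribution": c}
          for i, c in enumerate([0.90, 0.02, 0.40, 0.001, 0.60])]
kept = prune_facets(facets, threshold=0.05)
```

Since ray-tracing cost grows with the number of candidate facets per reflection order, removing insignificant facets is what yields the reported speedup.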

  10. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    Science.gov (United States)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  11. Computer modeling of the dynamics of surface tension on rotating fluids in low and microgravity environments

    Science.gov (United States)

    Hung, R. J.; Tsao, Y. D.; Hong, B. B.; Leslie, Fred W.

    1989-01-01

    Time-dependent evolutions of the profile of the free surface (bubble shapes) for a cylindrical container partially filled with a Newtonian fluid of constant density, rotating about its axis of symmetry, have been studied. Numerical computations have been carried out with the following situations: (1) linear functions of spin-up and spin-down in low- and microgravity environments, (2) linear functions of increasing and decreasing gravity environments at high- and low-rotating cylinder speeds, and (3) step functions of spin-up and spin-down in a low-gravity environment.

  12. Improving Quality of Service and Reducing Power Consumption with WAN accelerator in Cloud Computing Environments

    OpenAIRE

    Shin-ichi Kuribayashi

    2013-01-01

    The widespread use of cloud computing services is expected to degrade the Quality of Service and to increase the power consumption of ICT devices, since the distance to a server becomes longer than before. Migration of virtual machines over a wide area can solve many problems such as load balancing and power saving in cloud computing environments. This paper proposes to dynamically apply WAN accelerator within the network when a virtual machine is moved to a distant center, in order to preve...

  13. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  14. Hybrid Quantum Mechanics/Molecular Mechanics Solvation Scheme for Computing Free Energies of Reactions at Metal-Water Interfaces.

    Science.gov (United States)

    Faheem, Muhammad; Heyden, Andreas

    2014-08-12

    We report the development of a quantum mechanics/molecular mechanics free energy perturbation (QM/MM-FEP) method for modeling chemical reactions at metal-water interfaces. This novel solvation scheme combines plane-wave density functional theory (DFT), periodic electrostatic embedded cluster method (PEECM) calculations using Gaussian-type orbitals, and classical molecular dynamics (MD) simulations to obtain a free energy description of a complex metal-water system. We derive a potential of mean force (PMF) of the reaction system within the QM/MM framework. A fixed-size, finite ensemble of MM conformations is used to permit precise evaluation of the PMF of QM coordinates and its gradient defined within this ensemble. Local conformations of adsorbed reaction moieties are optimized using sequential MD-sampling and QM-optimization steps. An approximate reaction coordinate is constructed using a number of interpolated states, and the free energy difference between adjacent states is calculated using the QM/MM-FEP method. By avoiding on-the-fly QM calculations and by circumventing the challenges associated with statistical averaging during MD sampling, a computational speedup of multiple orders of magnitude is realized. The method is systematically validated against the results of ab initio QM calculations and demonstrated for C-C cleavage in double-dehydrogenated ethylene glycol on a Pt(111) model surface.
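At the heart of any FEP scheme is the Zwanzig estimator for the free energy difference between adjacent states, dF = -kT ln < exp(-dU/kT) >, with the average taken over configurations sampled in the reference state. The sketch below is the generic estimator in reduced units, not the authors' QM/MM-specific implementation:

```python
import math

def fep_delta_f(du_samples, kT=1.0):
    """Zwanzig free energy perturbation estimate from energy differences
    dU = U_B - U_A evaluated on an ensemble sampled in state A."""
    boltzmann_avg = sum(math.exp(-du / kT) for du in du_samples) / len(du_samples)
    return -kT * math.log(boltzmann_avg)

# If every sampled configuration sees the same dU, the estimate is exactly dU.
delta_f = fep_delta_f([2.0] * 100)
```

By Jensen's inequality the estimate never exceeds the plain average of dU, which is one reason FEP between *adjacent* interpolated states (small dU, good overlap) is preferred over a single large perturbation.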

  15. Computational modeling of chemical reactions and interstitial growth and remodeling involving charged solutes and solid-bound molecules.

    Science.gov (United States)

    Ateshian, Gerard A; Nims, Robert J; Maas, Steve; Weiss, Jeffrey A

    2014-10-01

    Mechanobiological processes are rooted in mechanics and chemistry, and such processes may be modeled in a framework that couples their governing equations starting from fundamental principles. In many biological applications, the reactants and products of chemical reactions may be electrically charged, and these charge effects may produce driving forces and constraints that significantly influence outcomes. In this study, a novel formulation and computational implementation are presented for modeling chemical reactions in biological tissues that involve charged solutes and solid-bound molecules within a deformable porous hydrated solid matrix, coupling mechanics with chemistry while accounting for electric charges. The deposition or removal of solid-bound molecules contributes to the growth and remodeling of the solid matrix; in particular, volumetric growth may be driven by Donnan osmotic swelling, resulting from charged molecular species fixed to the solid matrix. This formulation incorporates the state of strain as a state variable in the production rate of chemical reactions, explicitly tying chemistry with mechanics for the purpose of modeling mechanobiology. To achieve these objectives, this treatment identifies the specific theoretical and computational challenges faced in modeling complex systems of interacting neutral and charged constituents while accommodating any number of simultaneous reactions where reactants and products may be modeled explicitly or implicitly. Several finite element verification problems are shown to agree with closed-form analytical solutions. An illustrative tissue engineering analysis demonstrates tissue growth and swelling resulting from the deposition of chondroitin sulfate, a charged solid-bound molecular species. This implementation is released in the open-source program FEBio (www.febio.org). The availability of this framework may be particularly beneficial to optimizing tissue engineering culture systems by examining the…
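The Donnan osmotic swelling mechanism invoked above has a simple ideal-solution form: for a fixed-charge density c_F bathed in a monovalent salt solution of concentration c*, the osmotic pressure difference is RT(sqrt(c_F^2 + 4c*^2) - 2c*). A sketch of that relation follows; the temperature and concentrations are illustrative, and osmotic coefficients are idealized to 1, so this is only the textbook limit, not the paper's full mixture formulation.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def donnan_pressure(c_fixed, c_bath, temp=310.0):
    """Ideal Donnan osmotic pressure (Pa) for a fixed-charge density c_fixed
    and an external monovalent salt concentration c_bath, both in mol/m^3."""
    return R * temp * (math.sqrt(c_fixed ** 2 + 4.0 * c_bath ** 2) - 2.0 * c_bath)

# Cartilage-like illustration: ~200 mol/m^3 fixed charge in physiological saline
swelling_pressure = donnan_pressure(200.0, 150.0)
```

The pressure vanishes when the fixed-charge density is zero and grows monotonically with it, which is why deposition of a charged species such as chondroitin sulfate drives the swelling described in the abstract.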

  16. The New Learning Ecology of One-to-One Computing Environments: Preparing Teachers for Shifting Dynamics and Relationships

    Science.gov (United States)

    Spires, Hiller A.; Oliver, Kevin; Corn, Jenifer

    2012-01-01

    Despite growing research and evaluation results on one-to-one computing environments, how these environments affect learning in schools remains underexamined. The purpose of this article is twofold: (a) to use a theoretical lens, namely a new learning ecology, to frame the dynamic changes as well as challenges that are introduced by a one-to-one…

  17. Computation Offloading Algorithm for Arbitrarily Divisible Applications in Mobile Edge Computing Environments: An OCR Case

    Directory of Open Access Journals (Sweden)

    Bo Li

    2018-05-01

    Full Text Available Divisible applications are a class of tasks whose loads can be partitioned into some smaller fractions, and each part can be executed independently by a processor. A wide variety of divisible applications have been found in the area of parallel and distributed processing. This paper addresses the problem of how to partition and allocate divisible applications to available resources in mobile edge computing environments with the aim of minimizing the completion time of the applications. A theoretical model was proposed for partitioning an entire divisible application according to the load of the application and the capabilities of available resources, and the solutions were derived in closed form. Both simulations and real experiments were carried out to justify this model.
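When communication cost is ignored, the closed-form optimum for a divisible load is the allocation that makes every worker finish at the same instant, i.e. a fraction proportional to each worker's speed. A minimal sketch of that rule follows; the speeds are illustrative, and the paper's full model additionally accounts for transfer delays in the edge network.

```python
def partition_load(total_load, speeds):
    """Split a divisible load so that all workers finish simultaneously:
    part_i = total * s_i / sum(s). Valid when communication cost is ignored."""
    total_speed = sum(speeds)
    return [total_load * s / total_speed for s in speeds]

speeds = [1.0, 2.0, 3.0]                      # relative processing rates
parts = partition_load(120.0, speeds)
finish_times = [p / s for p, s in zip(parts, speeds)]  # all equal
```

Equal finish times are exactly the optimality condition of divisible load theory: if any worker finished earlier than another, shifting load toward it would reduce the makespan.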

  18. Orion Exploration Flight Test Reaction Control System Jet Interaction Heating Environment from Flight Data

    Science.gov (United States)

    White, Molly E.; Hyatt, Andrew J.

    2016-01-01

    The Orion Multi-Purpose Crew Vehicle (MPCV) Reaction Control System (RCS) is critical for guiding the vehicle along the desired trajectory during re-entry. However, this system has a significant impact on the convective heating environment to the spacecraft. Heating augmentation from the jet interaction (JI) drives thermal protection system (TPS) material selection and thickness requirements for the spacecraft. This paper describes the heating environment from the RCS on the afterbody of the Orion MPCV during Orion's first flight test, Exploration Flight Test 1 (EFT-1). These jet plumes interact with the wake of the crew capsule and cause an increase in the convective heating environment. Not only is there widespread influence from the jet banks, there may also be very localized effects. The firing history during EFT-1 will be summarized to assess which jet bank interaction was measured during flight. Heating augmentation factors derived from the reconstructed flight data will be presented. Furthermore, flight instrumentation across the afterbody provides the highest spatial resolution of the region of influence of the individual jet banks of any spacecraft yet flown. This distribution of heating augmentation across the afterbody will be derived from the flight data. Additionally, trends with possible correlating parameters will be investigated to assist future designs and ground testing programs. Finally, the challenges of measuring JI, applying this data to future flights, and lessons learned will be discussed.

  19. A computational study of pyrolysis reactions of lignin model compounds

    Science.gov (United States)

    Thomas Elder

    2010-01-01

    Enthalpies of reaction for the initial steps in the pyrolysis of lignin have been evaluated at the CBS-4m level of theory using fully substituted β-O-4 dilignols. Values for competing unimolecular decomposition reactions are consistent with results previously published for phenethyl phenyl ether models, but with lowered selectivity. Chain propagating reactions of free...

  20. Evaluation of the acute adverse reaction of contrast medium with high and moderate iodine concentration in patients undergoing computed tomography

    International Nuclear Information System (INIS)

    Nagamoto, Masashi; Gomi, Tatsuya; Terada, Hitoshi; Terada, Shigehiko; Kohda, Eiichi

    2006-01-01

    The aim of this prospective study was to evaluate and compare acute adverse reactions between contrast media containing moderate and high concentrations of iodine in patients undergoing computed tomography (CT). A total of 945 patients undergoing enhanced CT were randomly assigned to receive one of two doses of contrast medium, and we prospectively investigated the incidence of adverse reactions. Iopamidol was used as the contrast medium, at a high concentration of 370 mgI/ml and a moderate concentration of 300 mgI/ml. The frequency of adverse reactions, such as pain at the injection site and heat sensation, was determined. Acute adverse reactions were observed in 2.40% (11/458) of the moderate-concentration group compared to 3.11% (15/482) of the high-concentration group; there was no significant difference in incidence between the two groups. Most adverse reactions were mild, and there was no significant difference in severity. One patient in the high-concentration group had a moderate adverse reaction. No correlation existed between the incidence of adverse reactions and patient characteristics such as sex, age, weight, flow amount, and flow rate. The incidence of pain was not significantly different between the two groups. In contrast, the incidence of heat sensation was significantly higher in the high-concentration group. The incidence and severity of acute adverse reactions were not significantly different between the two groups, and there were no severe adverse reactions in either group. (author)
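The reported non-significance of the overall incidence can be sanity-checked from the counts given (11/458 vs. 15/482) with a standard two-proportion z-test. The pooled-variance normal approximation below is a generic choice, not necessarily the test the authors applied:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z statistic using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1.0 - pooled) * (1.0 / n1 + 1.0 / n2))
    return (p2 - p1) / se

z = two_proportion_z(11, 458, 15, 482)  # well below the 1.96 cutoff at alpha = 0.05
```

The statistic comes out around 0.66, far from the two-sided 5% critical value of 1.96, consistent with the abstract's conclusion of no significant difference in overall incidence.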

  1. Improving Communicative Competence through Synchronous Communication in Computer-Supported Collaborative Learning Environments: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Xi Huang

    2018-01-01

    Full Text Available Computer-supported collaborative learning facilitates the extension of second language acquisition into social practice. Studies on its achievement effects speak directly to the pedagogical notion of treating communicative practice in synchronous computer-mediated communication (SCMC: real-time communication between human beings via computers, in the form of text, audio and video communication such as live chat and chatrooms) as socially-oriented meaning construction. This review begins by considering the adoption of social interactionist views to identify key paradigms and supportive principles of computer-supported collaborative learning. A special focus on two components of communicative competence is then presented to explore interactional variables in synchronous computer-mediated communication, along with a review of research. There follows a discussion of a synthesis of interactional variables in negotiated interaction and co-construction of knowledge from psycholinguistic and social cohesion perspectives. This review reveals both the possibilities and the disparities of language socialization in promoting intersubjective learning and diversifying the salient use of interactively creative language in computer-supported collaborative learning environments in service of communicative competence.

  2. CyberPsychological Computation on Social Community of Ubiquitous Learning

    Science.gov (United States)

    Zhou, Xuan; Dai, Genghui; Huang, Shuang; Sun, Xuemin; Hu, Feng; Hu, Hongzhi; Ivanović, Mirjana

    2015-01-01

    Under the modern network environment, ubiquitous learning has been a popular way for people to study knowledge, exchange ideas, and share skills in the cyberspace. Existing research findings indicate that the learners' initiative and community cohesion play vital roles in the social communities of ubiquitous learning, and therefore how to stimulate the learners' interest and participation willingness so as to improve their enjoyable experiences in the learning process should be the primary consideration on this issue. This paper aims to explore an effective method to monitor the learners' psychological reactions based on their behavioral features in cyberspace and therefore provide useful references for adjusting the strategies in the learning process. In doing so, this paper firstly analyzes the psychological assessment of the learners' situations as well as their typical behavioral patterns and then discusses the relationship between the learners' psychological reactions and their observable features in cyberspace. Finally, this paper puts forward a CyberPsychological computation method to estimate the learners' psychological states online. Considering the diversity of learners' habitual behaviors in the reactions to their psychological changes, a BP-GA neural network is proposed for the computation based on their personalized behavioral patterns. PMID:26557846
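The BP-GA idea pairs backpropagation with a genetic algorithm. The GA half alone can be sketched as an elitist population search over model parameters; the toy model below (a single linear unit learning x1 + x2) stands in for the paper's network, and every hyperparameter is an illustrative guess rather than a value from the paper.

```python
import random

random.seed(0)

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 2)]  # learn y = x1 + x2

def mse(w):
    """Mean squared error of the linear unit w0*x1 + w1*x2 + w2."""
    return sum((w[0] * x[0] + w[1] * x[1] + w[2] - y) ** 2
               for x, y in DATA) / len(DATA)

def evolve(pop_size=30, generations=60, sigma=0.3):
    """Elitist GA: keep the best third, refill by mutating random elites."""
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    initial_best = min(mse(w) for w in pop)
    for _ in range(generations):
        pop.sort(key=mse)
        elite = pop[: pop_size // 3]
        pop = elite + [[g + random.gauss(0, sigma) for g in random.choice(elite)]
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=mse), initial_best

best, initial_best = evolve()
```

Because the elite survives every generation unchanged, the best fitness is monotonically non-increasing; a BP-GA hybrid would additionally refine each candidate with gradient steps between generations.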

  3. STEEP4 code for computation of specific thermonuclear reaction rates from pointwise cross sections

    International Nuclear Information System (INIS)

    Harris, D.R.; Dei, D.E.; Husseiny, A.A.; Sabri, Z.A.; Hale, G.M.

    1976-05-01

    A code module, STEEP4, is developed to calculate fusion reaction rates in terms of the specific reactivity ⟨σv⟩, which is the product of cross section and relative velocity averaged over the actual ion distributions of the interacting particles in the plasma. The module is structured in a way suitable for incorporation in thermonuclear burn codes, to provide rapid and yet relatively accurate on-line computation of ⟨σv⟩ as a function of plasma parameters. Ion distributions are modified to include slowing-down contributions, which are characterized in terms of plasma parameters. Rapid and accurate algorithms are used for integrating ⟨σv⟩ from cross sections and spectra. The main program solves for ⟨σv⟩ by the method of steepest descent. However, options are provided to use Gauss-Hermite and dense trapezoidal quadrature integration techniques. Options are also provided for rapid calculation of screening effects on specific reaction rates. Although such effects are not significant in plasmas of laboratory interest, the options are included to increase the range of applicability of the code. Gamow penetration form, log-log interpolation, and cubic interpolation routines are included to provide interpolated values of cross sections.
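For Maxwellian ions, ⟨σv⟩ reduces to a one-dimensional integral of σ(E)·v(E) over the relative-energy distribution. The sketch below uses the dense-trapezoidal option mentioned in the abstract, with a constant toy cross section so the result can be checked against the exact mean relative speed sqrt(8kT/(pi·mu)); units are reduced, and the grid parameters are illustrative rather than STEEP4's.

```python
import math

def sigma_v(sigma, kT, mu, n=4000, emax_factor=30.0):
    """<sigma v> by trapezoidal quadrature over a Maxwellian relative-energy
    distribution; sigma(E) is a pointwise cross section and mu the reduced
    mass, all in consistent reduced units."""
    emax = emax_factor * kT           # truncate the exponentially small tail
    de = emax / n
    total = 0.0
    for i in range(n + 1):
        e = i * de
        maxwell = 2.0 * math.sqrt(e / math.pi) * kT ** -1.5 * math.exp(-e / kT)
        v_rel = math.sqrt(2.0 * e / mu)
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * sigma(e) * v_rel * maxwell * de
    return total

# Constant cross section: <sigma v> must equal sigma0 * sqrt(8 kT / (pi mu)).
kT, mu, sigma0 = 1.0, 2.0, 1.0
approx = sigma_v(lambda e: sigma0, kT, mu)
exact = sigma0 * math.sqrt(8.0 * kT / (math.pi * mu))
```

A realistic fusion cross section is sharply peaked through the Gamow window, which is why STEEP4 also offers the method of steepest descent and Gauss-Hermite quadrature as faster alternatives to dense trapezoids.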

  4. Neutrino reactions in hot and dense matter

    Energy Technology Data Exchange (ETDEWEB)

    Lohs, Andreas

    2015-04-13

    In this thesis, neutrino reactions in hot and dense matter are studied. In particular, this work is concerned with neutrino-matter interactions that are relevant for neutrino transport in core-collapse supernovae (CCSNe). The majority of the energy from a CCSN is released in the form of neutrinos. Accurate understanding and computation of these interactions is most relevant to achieve sufficiently reliable predictions for the evolution of CCSNe and other related questions such as the production of heavy elements or neutrino oscillations. For this purpose this work follows the combined approach of searching for new important neutrino reactions and improving the computation of those reactions that are already implemented. First we estimate the relevance of charged-current weak interactions that include muon-neutrinos or muons, as well as the role of neutron decay for neutrino transport in CCSNe. All of these reactions were previously neglected in CCSN simulations. We derive and compute the matrix element and subsequent semi-analytic expressions for transport properties like the inverse mean free path of the new reactions. It is found that these reactions are important for muon neutrinos and low-energy electron antineutrinos at very high densities in the protoneutron star surface. Consequently their implementation might lead to several changes in the prediction of CCSNe signatures such as the nucleosynthesis yields. Second we improve the precision in the computation of well-known neutrino-nucleon reactions like neutrino absorption on neutrons. We derive semi-analytic expressions for transport properties that use less restrictive approximations while keeping the computational demand constant. Therefore we consider the full relativistic kinematics of all participating particles, i.e. allowing for relativistic nucleons and finite lepton masses. Also the weak magnetism terms of the matrix elements are explicitly included to all orders. From our results we suggest that the…

  5. Neutrino reactions in hot and dense matter

    International Nuclear Information System (INIS)

    Lohs, Andreas

    2015-01-01

    In this thesis, neutrino reactions in hot and dense matter are studied. In particular, this work is concerned with neutrino-matter interactions that are relevant for neutrino transport in core-collapse supernovae (CCSNe). The majority of the energy from a CCSN is released in the form of neutrinos. Accurate understanding and computation of these interactions is most relevant to achieve sufficiently reliable predictions for the evolution of CCSNe and other related questions such as the production of heavy elements or neutrino oscillations. For this purpose this work follows the combined approach of searching for new important neutrino reactions and improving the computation of those reactions that are already implemented. First we estimate the relevance of charged-current weak interactions that include muon-neutrinos or muons, as well as the role of neutron decay for neutrino transport in CCSNe. All of these reactions were previously neglected in CCSN simulations. We derive and compute the matrix element and subsequent semi-analytic expressions for transport properties like the inverse mean free path of the new reactions. It is found that these reactions are important for muon neutrinos and low-energy electron antineutrinos at very high densities in the protoneutron star surface. Consequently their implementation might lead to several changes in the prediction of CCSNe signatures such as the nucleosynthesis yields. Second we improve the precision in the computation of well-known neutrino-nucleon reactions like neutrino absorption on neutrons. We derive semi-analytic expressions for transport properties that use less restrictive approximations while keeping the computational demand constant. Therefore we consider the full relativistic kinematics of all participating particles, i.e. allowing for relativistic nucleons and finite lepton masses. Also the weak magnetism terms of the matrix elements are explicitly included to all orders. From our results we suggest that the…

  6. Applications of computer simulation, nuclear reactions and elastic scattering to surface analysis of materials

    Directory of Open Access Journals (Sweden)

    Pacheco de Carvalho, J. A.

    2008-08-01

    Full Text Available This article involves computer simulation and surface analysis by nuclear techniques, which are non-destructive. Both the "energy method of analysis" for nuclear reactions and elastic scattering are used. Energy spectra are computer-simulated and compared with experimental data, giving information about target composition and concentration profiles. The method is successfully applied to thick flat targets of graphite, quartz and sapphire, and to targets containing thin films of aluminium oxide. Depth profiles of 12C and 16O nuclei are determined using (d,p) and (d,α) deuteron-induced reactions. Rutherford and resonance elastic scattering of 4He+ ions are also used.

    This article deals with computer simulation and surface analysis using nuclear techniques, which are non-destructive. The "energy method of analysis" for nuclear reactions is used, as well as elastic scattering. Energy spectra are simulated by computer and compared with experimental data, yielding information on the composition and concentration profiles of the sample. The method is successfully applied to thick, flat samples of graphite, quartz and sapphire, and to samples containing thin films of aluminium oxide. Depth profiles of 12C and 16O nuclei are calculated through (d,p) and (d,α) reactions induced by deuterons. Elastic scattering of 4He+ ions, both Rutherford and resonant, is also used.
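The Rutherford part of the elastic-scattering analysis has a closed form worth recording: dσ/dΩ = (Z1·Z2·e²/4E)² / sin⁴(θ/2). A sketch in practical units (MeV in, mb/sr out) follows; it is non-relativistic, assumes point charges and the centre-of-mass frame, and the 4He-on-carbon numbers are purely illustrative.

```python
import math

E2 = 1.44  # e^2 / (4 pi eps0) in MeV * fm

def rutherford(z1, z2, energy_mev, theta_deg):
    """Rutherford differential cross section in mb/sr (1 fm^2 = 10 mb),
    centre-of-mass frame, point charges, non-relativistic."""
    half_theta = math.radians(theta_deg) / 2.0
    a = z1 * z2 * E2 / (4.0 * energy_mev)       # length scale in fm
    return 10.0 * a ** 2 / math.sin(half_theta) ** 4

# e.g. 2 MeV 4He+ (Z=2) on carbon (Z=6) at a backscattering angle
ds = rutherford(2, 6, 2.0, 165.0)
```

The 1/E² and 1/sin⁴(θ/2) scalings are the useful checks in practice: halving the beam energy quadruples the cross section, and moving from 180° to 90° likewise quadruples it.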

  7. Modeling Chemical Reactions by QM/MM Calculations: The Case of the Tautomerization in Fireflies Bioluminescent Systems.

    Science.gov (United States)

    Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle

    2018-01-01

    In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems. Nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, like the wavelength of emission. Here, we have tackled the issue of modeling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and deciphered whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no consensus has been reached. One hypothesis is to consider a possible keto-enol tautomerization to settle this issue, as it has already been observed in water. A joint approach combining extensive MD simulations with computation of key intermediates, like transition states (TS), using QM/MM calculations is presented in this publication. We also describe the procedure and difficulties met during this approach, in order to provide a guide for this kind of chemical reaction using QM/MM methods.

  8. Modelling chemical reactions by QM/MM calculations: the case of the tautomerization in fireflies bioluminescent systems

    Science.gov (United States)

    Berraud-Pache, Romain; Garcia-Iriepa, Cristina; Navizet, Isabelle

    2018-04-01

    In less than half a century, the hybrid QM/MM method has become one of the most widely used techniques to model molecules embedded in a complex environment. A well-known application of the QM/MM method is to biological systems. Nowadays, one can understand how enzymatic reactions work or compute spectroscopic properties, like the wavelength of emission. Here, we have tackled the issue of modelling chemical reactions inside proteins. We have studied a bioluminescent system, fireflies, and deciphered whether a keto-enol tautomerization is possible inside the protein. The two tautomers are candidates to be the emissive molecule of the bioluminescence, but no consensus has been reached. One hypothesis is to consider a possible keto-enol tautomerization to settle this issue, as it has already been observed in water. A joint approach combining extensive MD simulations with computation of key intermediates, like transition states (TS), using QM/MM calculations is presented in this publication. We also describe the procedure and difficulties met during this approach, in order to provide a guide for this kind of chemical reaction using QM/MM methods.

  9. Social interaction in type 2 diabetes computer-mediated environments: How inherent features of the channels influence peer-to-peer interaction.

    Science.gov (United States)

    Lewinski, Allison A; Fisher, Edwin B

    2016-06-01

    Interventions via the internet provide support to individuals managing chronic illness. The purpose of this integrative review was to determine how the features of a computer-mediated environment influence social interactions among individuals with type 2 diabetes (T2D). A combination of MeSH and keyword terms, based on the cognates of three broad groupings: social interaction, computer-mediated environments, and chronic illness, was used to search the PubMed, PsycINFO, Sociology Research Database, and Cumulative Index to Nursing and Allied Health Literature databases. Eleven articles met the inclusion criteria. Computer-mediated environments enhance an individual's ability to interact with peers while increasing the convenience of obtaining personalized support. A matrix, focused on social interaction among peers, identified themes across all articles, and five characteristics emerged: (1) the presence of synchronous and asynchronous communication, (2) the ability to connect with similar peers, (3) the presence or absence of a moderator, (4) personalization of feedback regarding individual progress and self-management, and (5) the ability of individuals to maintain choice during participation. Individuals interact with peers to obtain relevant, situation-specific information and knowledge about managing their own care. Computer-mediated environments facilitate the ability of individuals to exchange this information despite temporal or geographical barriers that may be present, thus improving T2D self-management. © The Author(s) 2015.

  10. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA)-based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses few parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces the makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
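As a rough illustration of the idea behind SASOS, the sketch below combines a simplified SOS-style population update (copying genes from the current best organism, plus a random perturbation) with a simulated-annealing acceptance rule, using makespan as the fitness. This is a minimal sketch of the general technique, not the authors' exact algorithm; all function names and parameter values are illustrative.

```python
import math
import random

def makespan(assignment, task_lengths, vm_speeds):
    """Finish time of the busiest VM under a task -> VM assignment."""
    load = [0.0] * len(vm_speeds)
    for task, vm in enumerate(assignment):
        load[vm] += task_lengths[task] / vm_speeds[vm]
    return max(load)

def mutate(assignment, n_vms):
    """Move one randomly chosen task to a randomly chosen VM."""
    child = assignment[:]
    child[random.randrange(len(child))] = random.randrange(n_vms)
    return child

def sa_accept(old_cost, new_cost, temp):
    """SA rule: always accept improvements, sometimes accept worse moves."""
    if new_cost <= old_cost:
        return True
    return random.random() < math.exp((old_cost - new_cost) / temp)

def sasos(task_lengths, vm_speeds, pop_size=20, iters=300,
          temp=1.0, cooling=0.99):
    """Simplified SA-filtered Symbiotic Organisms Search for task scheduling."""
    n_tasks, n_vms = len(task_lengths), len(vm_speeds)
    cost = lambda a: makespan(a, task_lengths, vm_speeds)
    pop = [[random.randrange(n_vms) for _ in range(n_tasks)]
           for _ in range(pop_size)]
    best = min(pop, key=cost)[:]
    for _ in range(iters):
        for i in range(pop_size):
            trial = pop[i][:]
            # Mutualism-like step: copy some genes from the best organism.
            for g in random.sample(range(n_tasks), max(1, n_tasks // 4)):
                trial[g] = best[g]
            # Parasitism-like step: random local perturbation.
            trial = mutate(trial, n_vms)
            # SA acceptance adds controlled exploitation of local regions.
            if sa_accept(cost(pop[i]), cost(trial), temp):
                pop[i] = trial
            if cost(pop[i]) < cost(best):
                best = pop[i][:]
        temp *= cooling
    return best, cost(best)
```

A fuller implementation would add the commensalism phase of SOS and a fitness term penalizing the degree of imbalance among VMs, as the abstract describes.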

  11. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Directory of Open Access Journals (Sweden)

    Mohammed Abdullahi

    Full Text Available Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of the task scheduling problem has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment using a proposed Simulated Annealing (SA)-based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses few parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces the makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.

  12. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  13. Effects Of Social Networking Sites (SNSs) On Hyper Media Computer Mediated Environments (HCMEs)

    OpenAIRE

    Yoon C. Cho

    2011-01-01

    Social Networking Sites (SNSs) are known as tools to interact and build relationships between users/customers in Hyper Media Computer Mediated Environments (HCMEs). This study explored how social networking sites play a significant role in communication between users. While numerous researchers have examined the effectiveness of social networking websites, few studies have investigated which factors affect customers' attitudes and behavior toward social networking sites. In this paper, the authors inv...

  14. Anaerobic Microbial Degradation of Hydrocarbons: From Enzymatic Reactions to the Environment.

    Science.gov (United States)

    Rabus, Ralf; Boll, Matthias; Heider, Johann; Meckenstock, Rainer U; Buckel, Wolfgang; Einsle, Oliver; Ermler, Ulrich; Golding, Bernard T; Gunsalus, Robert P; Kroneck, Peter M H; Krüger, Martin; Lueders, Tillmann; Martins, Berta M; Musat, Florin; Richnow, Hans H; Schink, Bernhard; Seifert, Jana; Szaleniec, Maciej; Treude, Tina; Ullmann, G Matthias; Vogt, Carsten; von Bergen, Martin; Wilkes, Heinz

    2016-01-01

    Hydrocarbons are abundant in anoxic environments and pose biochemical challenges to their anaerobic degradation by microorganisms. Within the framework of the Priority Program 1319, investigations funded by the Deutsche Forschungsgemeinschaft on the anaerobic microbial degradation of hydrocarbons ranged from isolation and enrichment of hitherto unknown hydrocarbon-degrading anaerobic microorganisms, discovery of novel reactions, detailed studies of enzyme mechanisms and structures to process-oriented in situ studies. Selected highlights from this program are collected in this synopsis, with more detailed information provided by theme-focused reviews of the special topic issue on 'Anaerobic biodegradation of hydrocarbons' [this issue, pp. 1-244]. The interdisciplinary character of the program, involving microbiologists, biochemists, organic chemists and environmental scientists, is best exemplified by the studies on alkyl-/arylalkylsuccinate synthases. Here, research topics ranged from in-depth mechanistic studies of archetypical toluene-activating benzylsuccinate synthase, substrate-specific phylogenetic clustering of alkyl-/arylalkylsuccinate synthases (toluene plus xylenes, p-cymene, p-cresol, 2-methylnaphthalene, n-alkanes), stereochemical and co-metabolic insights into n-alkane-activating (methylalkyl)succinate synthases to the discovery of bacterial groups previously unknown to possess alkyl-/arylalkylsuccinate synthases by means of functional gene markers and in situ field studies enabled by state-of-the-art stable isotope probing and fractionation approaches. Other topics are Mo-cofactor-dependent dehydrogenases performing O2-independent hydroxylation of hydrocarbons and alkyl side chains (ethylbenzene, p-cymene, cholesterol, n-hexadecane), degradation of p-alkylated benzoates and toluenes, glycyl radical-bearing 4-hydroxyphenylacetate decarboxylase, novel types of carboxylation reactions (for acetophenone, acetone, and potentially also benzene and

  15. Load Balancing in Cloud Computing Environment Using Improved Weighted Round Robin Algorithm for Nonpreemptive Dependent Tasks

    OpenAIRE

    Devi, D. Chitra; Uthariaraj, V. Rhymend

    2016-01-01

    Cloud computing uses the concepts of scheduling and load balancing to migrate tasks to underutilized VMs for effectively sharing the resources. The scheduling of nonpreemptive tasks in the cloud computing environment is an irrecoverable restraint, and hence they have to be assigned to the most appropriate VMs at the initial placement itself. Practically, the arriving jobs consist of multiple interdependent tasks, and they may execute the independent tasks in multiple VMs or in the same VM’s mul...
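The core of weighted load balancing can be sketched as a greedy rule that places each nonpreemptive task on the VM with the smallest capacity-weighted load after placement. This is a minimal illustration of the weighting idea only; the paper's improved weighted round robin additionally handles interdependent task groups, which this sketch omits, and the function and parameter names are illustrative.

```python
def weighted_round_robin(task_lengths, vm_capacities):
    """Greedy weighted placement: each task goes to the VM whose
    capacity-weighted load (work / capacity) would be lowest after
    receiving it.  Faster VMs therefore absorb proportionally more work."""
    loads = [0.0] * len(vm_capacities)   # raw work assigned to each VM
    assignment = []                      # VM index chosen for each task
    for length in task_lengths:
        vm = min(range(len(vm_capacities)),
                 key=lambda i: (loads[i] + length) / vm_capacities[i])
        loads[vm] += length
        assignment.append(vm)
    return assignment, loads
```

For example, with capacities `[2.0, 1.0]` the first VM is treated as twice as fast and ends up carrying roughly two thirds of the submitted work.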

  16. Modeling non-adiabatic photoexcited reaction dynamics in condensed phases

    International Nuclear Information System (INIS)

    Coker, D.F.

    2003-01-01

    Reactions of photoexcited molecules, ions, and radicals in condensed phase environments involve non-adiabatic dynamics over coupled electronic surfaces. We focus on how local environmental symmetries can affect non-adiabatic coupling between excited electronic states and thus influence, in a possibly controllable way, the outcome of photo-excited reactions. Semi-classical and mixed quantum-classical non-adiabatic molecular dynamics methods, together with semi-empirical excited state potentials, are used to probe the dynamical mixing of electronic states in different environments, from molecular clusters to simple liquids and solids, and photo-excited reactions in complex reaction environments such as zeolites

  17. Genotype by environment interaction for 450-day weight of Nelore cattle analyzed by reaction norm models

    Directory of Open Access Journals (Sweden)

    Newton T. Pégolo

    2009-01-01

    Full Text Available Genotype by environment interactions (GEI) have attracted increasing attention in tropical breeding programs because of the variety of production systems involved. In this work, we assessed GEI in 450-day adjusted weight (W450) Nelore cattle from 366 Brazilian herds by comparing traditional univariate single-environment model analysis (UM) and random regression first order reaction norm models for six environmental variables: standard deviations of herd-year (RRMw) and herd-year-season-management (RRMw-m) groups for mean W450, standard deviations of herd-year (RRMg) and herd-year-season-management (RRMg-m) groups adjusted for 365-450 days weight gain (G450) averages, and two iterative algorithms using herd-year-season-management group solution estimates from a first RRMw-m and RRMg-m analysis (RRMITw-m and RRMITg-m, respectively). The RRM results showed similar tendencies in the variance components and heritability estimates along the environmental gradient. Some of the variation among RRM estimates may have been related to the precision of the predictor and to correlations between environmental variables and the likely components of the weight trait. GEI, which was assessed by estimating the genetic correlation surfaces, had values < 0.5 between extreme environments in all models. Regression analyses showed that the correlation between the expected progeny differences for UM and the corresponding differences estimated by RRM was higher in intermediate and favorable environments than in unfavorable environments (p < 0.0001).
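Under a first-order (linear) reaction norm, u(x) = a + b*x, the genetic variance along the environmental gradient and the genetic correlation between any two environments follow in closed form from the intercept and slope (co)variances. The sketch below shows that calculation; the variance components used in the test are illustrative, not estimates from the study.

```python
import math

def rn_genetic_correlation(var_a, var_b, cov_ab, x1, x2):
    """Genetic correlation between environments x1 and x2 under a linear
    reaction norm u(x) = a + b*x, where var_a and var_b are the intercept
    and slope variances and cov_ab their covariance.

    var(x)      = var_a + 2*x*cov_ab + x^2 * var_b
    cov(x1, x2) = var_a + (x1 + x2)*cov_ab + x1*x2 * var_b
    """
    def var(x):
        return var_a + 2 * x * cov_ab + x * x * var_b
    cov = var_a + (x1 + x2) * cov_ab + x1 * x2 * var_b
    return cov / math.sqrt(var(x1) * var(x2))
```

With a nonzero slope variance, the correlation is near 1 for close gradients and drops (even turning negative) between extreme environments, which is the re-ranking signature of GEI described in the abstract.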

  18. The evolution of environmental and genetic sex determination in fluctuating environments.

    Science.gov (United States)

    Van Dooren, Tom J M; Leimar, Olof

    2003-12-01

    Twenty years ago, Bulmer and Bull suggested that disruptive selection, produced by environmental fluctuations, can result in an evolutionary transition from environmental sex determination (ESD) to genetic sex determination (GSD). We investigated the feasibility of such a process, using mutation-limited adaptive dynamics and individual-based computer simulations. Our model describes the evolution of a reaction norm for sex determination in a metapopulation setting with partial migration and variation in an environmental variable both within and between local patches. The reaction norm represents the probability of becoming a female as a function of environmental state and was modeled as a sigmoid function with two parameters, one giving the location (i.e., the value of the environmental variable for which an individual has equal chance of becoming either sex) and the other giving the slope of the reaction norm for that environment. The slope can be interpreted as being set by the level of developmental noise in morph determination, with less noise giving a steeper slope and a more switchlike reaction norm. We found convergence stable reaction norms with intermediate to large amounts of developmental noise for conditions characterized by low migration rates, small differential competitive advantages between the sexes over environments, and little variation between individual environments within patches compared to variation between patches. We also considered reaction norms with the slope parameter constrained to a high value, corresponding to little developmental noise. For these we found evolutionary branching in the location parameter and a transition from ESD toward GSD, analogous to the original analysis by Bulmer and Bull. Further evolutionary change, including dominance evolution, produced a polymorphism acting as a GSD system with heterogamety. Our results point to the role of developmental noise in the evolution of sex determination.
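The two-parameter sigmoid reaction norm described above can be written down directly: `location` is the environmental value at which an individual has an equal chance of becoming either sex, and `slope` reflects developmental noise (a steeper slope gives a more switch-like norm). The parameter values in the usage below are illustrative.

```python
import math

def prob_female(env, location, slope):
    """Sigmoid reaction norm for sex determination: probability of
    developing as female given environmental state `env`.  At
    env == location the probability is exactly 0.5; larger `slope`
    means less developmental noise and a more switch-like response."""
    return 1.0 / (1.0 + math.exp(-slope * (env - location)))
```

A very large `slope` makes sex nearly deterministic in the environment (approaching a threshold ESD switch), while a shallow slope leaves substantial noise in morph determination, the regime in which the authors find convergence-stable norms.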

  19. Students' Perceptions of Computer-Based Learning Environments, Their Attitude towards Business Statistics, and Their Academic Achievement: Implications from a UK University

    Science.gov (United States)

    Nguyen, ThuyUyen H.; Charity, Ian; Robson, Andrew

    2016-01-01

    This study investigates students' perceptions of computer-based learning environments, their attitude towards business statistics, and their academic achievement in higher education. Guided by learning environments concepts and attitudinal theory, a theoretical model was proposed with two instruments, one for measuring the learning environment and…

  20. Computational and experimental studies on stabilities, reactions and reaction rates of cations and ion-dipole complexes

    NARCIS (Netherlands)

    Ervasti, H.K.

    2008-01-01

    In this thesis, ion stability, ion-molecule reactions and reaction rates are studied using mass spectrometry and molecular modelling. In Chapter 2 the effect of functional group substitution on neutral and ionised ketene are studied. Electron-donating substituents show a stabilising positive

  1. Genotype-environment interaction of maternal influence characteristics in Nellore cattle bred in the Brazilian humid tropical regions by reaction norm

    Directory of Open Access Journals (Sweden)

    Jorge Luís Ferreira

    2015-08-01

    Full Text Available A reaction norm (RN) describes genotype-environment interaction (GxE) in terms of how genotypes perform across different environments. This study was carried out to verify GxE by a reaction norm model for weights at 120 (W120) and 210 (W210) days of age in Nellore cattle raised in the humid tropical regions of Brazil. Environmental gradients were obtained from solutions of contemporary groups, which were fitted as covariables in the random regression model via reaction norms. Mean weight at 120 days of age was 127.97 kg, and environmental gradients ranged between -27 and +26 kg. The average was 185.60 kg at 210 days of age, and gradients ranged from -54 to +55 kg. Scale changes in the breeding values and heritability estimates occurred along the gradients for the two weights; the genetic correlations between breeding values were also similar for both weights. These correlations were high between close gradients, and low to even negative between extreme environments. Slopes representing environmental sensitivity were high, with changes of scale and changes in the ranking of ten bulls with a great number of calves for the two traits. When the regression slopes of the ten bulls with the highest breeding values were evaluated, these values differed between W120 and W210, perhaps due to the greater influence of the maternal effect on W120. These results characterize the influence of GxE on the pre-weaning weights of animals in the humid tropical regions of Brazil. Taking it into account makes it possible to predict the animals' breeding values with greater precision, resulting in less biased selection and greater genetic progress.

  2. Deciphering Selectivity in Organic Reactions: A Multifaceted Problem.

    Science.gov (United States)

    Balcells, David; Clot, Eric; Eisenstein, Odile; Nova, Ainara; Perrin, Lionel

    2016-05-17

    Computational chemistry has made a sustained contribution to the understanding of chemical reactions. In earlier times, half a century ago, the goal was to distinguish allowed from forbidden reactions (e.g., Woodward-Hoffmann rules), that is, reactions with low or high to very high activation barriers. A great achievement of computational chemistry was also to contribute to the determination of structures, with the bonus of proposing a rationalization (e.g., anomeric effect, isolobal analogy, Gillespie's valence shell electron pair repulsion rules and counterexamples, Wade-Mingos rules for molecular clusters). With the development of new methods and the constant increase in computing power, computational chemists have moved to more challenging problems, close to the daily concerns of the experimental chemists: determining the factors that make a reaction both efficient and selective, a key issue in organic synthesis. For this purpose, experimental chemists use advanced synthetic and analytical techniques, to which computational chemists added other ways of determining reaction pathways. The transition states and intermediates contributing to the transformation of reactants into the desired and undesired products can now be determined, including their geometries, energies, charges, spin densities, spectroscopic properties, etc. Such studies remain challenging due to the large number of chemical species commonly present in the reactive media whose role may have to be determined. Calculating chemical systems as they are in the experiment is not always possible, bringing its own share of complexity through the large number of atoms and the associated large number of conformers to consider. Modeling the chemical species with smaller systems is an alternative that historically led to artifacts. Another important topic is the choice of the computational method. While DFT is widely used, the vast diversity of functionals available is both an opportunity and a challenge. Though

  3. Development of an international matrix-solver prediction system on a French-Japanese international grid computing environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kushida, Noriyuki; Tatekawa, Takayuki; Teshima, Naoya; Caniou, Yves; Guivarch, Ronan; Dayde, Michel; Ramet, Pierre

    2010-01-01

    The 'Research and Development of International Matrix-Solver Prediction System (REDIMPS)' project aimed at improving the TLSE sparse linear algebra expert website by establishing an international grid computing environment between Japan and France. To help users identify the best solver or sparse linear algebra tool for their problems, we have developed an interoperable environment between the French and Japanese grid infrastructures (managed by DIET and AEGIS, respectively). Two main issues were considered. The first issue is how to submit a job from DIET to AEGIS. The second issue is how to bridge the difference in security between DIET and AEGIS. To overcome these issues, we developed APIs to communicate between the different grid infrastructures by improving the client API of AEGIS. By developing a server daemon program (SeD) of DIET which behaves like an AEGIS user, DIET can call functions in AEGIS: authentication, file transfer, job submission, and so on. To strengthen security, we also developed functionalities to authenticate DIET sites and DIET users in order to access AEGIS computing resources. Through this study, the set of software and computers available within TLSE for finding an appropriate solver has been enlarged across France (DIET) and Japan (AEGIS). (author)

  4. [On the influence of local molecular environment on the redox potential of electron transfer cofactors in bacterial photosynthetic reaction centers].

    Science.gov (United States)

    Krasil'nikov, P M; Noks, P P; Rubin, A B

    2011-01-01

    The addition of cryosolvents (glycerol, dimethylsulfoxide) to a water solution containing bacterial photosynthetic reaction centers changes the redox potential of the bacteriochlorophyll dimer, but does not affect the redox potential of the quinone primary acceptor. It has been shown that the change in redox potential can be produced by changes in the electrostatic interactions between the cofactors and the local molecular environment modified by the additives introduced into the solution. The degree to which a solvent influences the redox potential of a given cofactor is determined by the accessibility of that cofactor to solvent molecules, which depends on the arrangement of the cofactors in the reaction center structure.

  5. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  6. Understanding Student Retention in Computer Science Education: The Role of Environment, Gains, Barriers and Usefulness

    Science.gov (United States)

    Giannakos, Michail N.; Pappas, Ilias O.; Jaccheri, Letizia; Sampson, Demetrios G.

    2017-01-01

    Researchers have been working to understand the high dropout rates in computer science (CS) education. Despite the great demand for CS professionals, little is known about what influences individuals to complete their CS studies. We identify gains of studying CS, the (learning) environment, degree's usefulness, and barriers as important predictors…

  7. The Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Argumentation Skills

    Science.gov (United States)

    Hsu, P. -S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2015-01-01

    The purpose of this quasi-experimental study was to explore how seventh graders in a suburban school in the United States developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application. A total of 54 students (three classes) comprised this treatment…

  8. Proceedings of the 1993 Conference on Intelligent Computer-Aided Training and Virtual Environment Technology, Volume 1

    Science.gov (United States)

    Hyde, Patricia R.; Loftin, R. Bowen

    1993-01-01

    These proceedings are organized in the same manner as the conference's contributed sessions, with the papers grouped by topic area. These areas are as follows: VE (virtual environment) training for Space Flight, Virtual Environment Hardware, Knowledge Acquisition for ICAT (Intelligent Computer-Aided Training) & VE, Multimedia in ICAT Systems, VE in Training & Education (1 & 2), Virtual Environment Software (1 & 2), Models in ICAT systems, ICAT Commercial Applications, ICAT Architectures & Authoring Systems, ICAT Education & Medical Applications, Assessing VE for Training, VE & Human Systems (1 & 2), ICAT Theory & Natural Language, ICAT Applications in the Military, VE Applications in Engineering, Knowledge Acquisition for ICAT, and ICAT Applications in Aerospace.

  9. Large Reactional Osteogenesis in Maxillary Sinus Associated with Secondary Root Canal Infection Detected Using Cone-beam Computed Tomography.

    Science.gov (United States)

    Estrela, Carlos; Porto, Olavo César Lyra; Costa, Nádia Lago; Garrote, Marcel da Silva; Decurcio, Daniel Almeida; Bueno, Mike R; Silva, Brunno Santos de Freitas

    2015-12-01

    Inflammatory injuries in the maxillary sinus may originate from root canal infections and lead to bone resorption or regeneration. This report describes the radiographic findings of 4 asymptomatic clinical cases of large reactional osteogenesis in the maxillary sinus (MS) associated with secondary root canal infection detected using cone-beam computed tomographic (CBCT) imaging. Apical periodontitis, a consequence of root canal infection, may lead to a periosteal reaction in the MS and osteogenesis seen as a radiopaque structure on imaging scans. The use of a map-reading strategy for the longitudinal and sequential slices of CBCT images may contribute to the definition of diagnoses and treatment plans. Root canal infections may lead to reactional osteogenesis in the MS. High-resolution CBCT images may reveal changes that go unnoticed when using conventional imaging. Findings may help define initial diagnoses and therapeutic plans, but only histopathology provides a definitive diagnosis. Surgical enucleation of the periapical lesion is recommended if nonsurgical root canal treatment fails to control apical periodontitis. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  10. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    Science.gov (United States)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that serves as the basis for a functional registry of handicapped players supporting gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and

  11. FAST - A multiprocessed environment for visualization of computational fluid dynamics

    International Nuclear Information System (INIS)

    Bancroft, G.V.; Merritt, F.J.; Plessel, T.C.; Kelaita, P.G.; Mccabe, R.K.

    1991-01-01

    The paper presents the Flow Analysis Software Toolset (FAST) to be used for fluid-mechanics analysis. The design criteria for FAST including the minimization of the data path in the computational fluid-dynamics (CFD) process, consistent user interface, extensible software architecture, modularization, and the isolation of three-dimensional tasks from the application programmer are outlined. Each separate process communicates through the FAST Hub, while other modules such as FAST Central, NAS file input, CFD calculator, surface extractor and renderer, titler, tracer, and isolev might work together to generate the scene. An interprocess communication package making it possible for FAST to operate as a modular environment where resources could be shared among different machines as well as a single host is discussed. 20 refs

  12. Intramolecular Diels-Alder Reactions in Organic Synthesis

    OpenAIRE

    Sizemore, Nicholas Blandford Luke

    2014-01-01

    Intramolecular Diels-Alder (IMDA) reactions are an important class of reactions in synthetic organic chemistry for the rapid construction of polycyclic frameworks. Three classes of IMDA reactions were investigated synthetically and computationally: 1) all-carbon type 1 IMDA reactions, 2) N-acylnitroso type 2 IMDA reactions, and 3) cyano-azadiene IMDA reactions. The first class was implemented in research toward the total synthesis of maoecrystal Z and isopalhinine A. The second class was stud...

  13. Direct and inverse reactions of LiH+ with He(1S) from quantum calculations: mechanisms and rates.

    Science.gov (United States)

    Tacconi, M; Bovino, S; Gianturco, F A

    2012-01-14

    The gas-phase reaction of LiH(+) (X(2)Σ) with He((1)S) atoms, yielding Li(+)He with a small endothermicity for the rotovibrational ground state of the reagents, is analysed using the quantum reactive approach that employs the Negative Imaginary Potential (NIP) scheme discussed earlier in the literature. The dependence of low-T rates on the initial vibrational state of LiH(+) is analysed and the role of low-energy Feshbach resonances is also discussed. The inverse destruction reaction of LiHe(+), a markedly exothermic process, is also investigated and the rates are computed in the same range of temperatures. The possible roles of these reactions in early universe astrophysical networks, in He droplets environments or in cold traps are briefly discussed.

  14. Materials and Life Science Experimental Facility at the Japan Proton Accelerator Research Complex III: Neutron Devices and Computational and Sample Environments

    Directory of Open Access Journals (Sweden)

    Kaoru Sakasai

    2017-08-01

    Neutron devices such as neutron detectors, optical devices including supermirror devices and 3He neutron spin filters, and choppers have been successfully developed and installed at the Materials and Life Science Experimental Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC), Tokai, Japan. Four software components of the MLF computational environment (instrument control, data acquisition, data analysis, and a database) have been developed and deployed at MLF. MLF also provides a wide variety of sample environment options, including high and low temperatures, high magnetic fields, and high pressures. This paper describes the current status of the neutron devices and the computational and sample environments at MLF.

  15. Optimizing the Use of Storage Systems Provided by Cloud Computing Environments

    Science.gov (United States)

    Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.

    2013-12-01

    Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it were a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: a software tool that emulates a traditional file system in order to store data in S3 performs poorly compared to storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be accessed directly, without relying on a specific software tool. To provide a hierarchical organization for the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and
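
    The catalog scheme described above lends itself to a short sketch. The element and attribute names below (`catalog`, `file`, `catalog-ref`) and the S3 keys are illustrative assumptions, not the authors' actual schema:

```python
import xml.etree.ElementTree as ET

def build_catalog(entries, sub_catalogs=()):
    """Build an XML catalog overlaying a hierarchy on S3's flat key space.

    `entries` maps a logical file name to its S3 access key; `sub_catalogs`
    lists the S3 keys of child catalogs, playing the role of directories.
    (Element and attribute names here are illustrative.)
    """
    root = ET.Element("catalog")
    for name, s3_key in entries.items():
        ET.SubElement(root, "file", name=name, key=s3_key)
    for key in sub_catalogs:
        ET.SubElement(root, "catalog-ref", key=key)
    return ET.tostring(root, encoding="unicode")

# Hypothetical example: one data file plus a reference to a child catalog.
xml_text = build_catalog(
    {"sst_2013.nc": "data/ab12cd"},
    sub_catalogs=["catalogs/2012.xml"],
)
```

    Because the output is plain XML, it is readable both by the data server and by a web browser, matching the dual-access property the abstract describes.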

  16. Intramolecular Diels-Alder reactions of pyrimidines, a synthetic and computational study

    NARCIS (Netherlands)

    Stolle, W.A.W.

    1992-01-01

    This thesis deals with an investigation of the ring transformation reactions of 2- and 5-(ω-alkynyl)pyrimidine derivatives, which upon heating undergo an intramolecular Diels-Alder reaction and subsequently a spontaneous retro Diels-Alder reaction. To get a better insight into the

  17. A simple interface to computational fluid dynamics programs for building environment simulations

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, III, C R; Chen, Q [Massachusetts Institute of Technology, Cambridge, MA (United States)

    2000-07-01

    It is becoming popular practice for architects and HVAC engineers to simulate airflow in and around buildings by computational fluid dynamics (CFD) methods in order to predict indoor and outdoor environments. However, many CFD programs are hampered by historically poor and inefficient user interfaces, particularly for users with little training in numerical simulation. This investigation endeavors to create a simplified CFD interface (SCI) that allows architects and building engineers to use CFD without excessive training. The SCI can be easily integrated into new CFD programs. (author)

  18. Transport processes in exothermic gas-solid reactions

    International Nuclear Information System (INIS)

    Vijay, P.L.; Sathiyamoorthy, D.

    1997-01-01

    The variation of the gaseous reactant concentration, the temperature distribution for an exothermic reaction, the diffusivity factor and the reaction ratio profiles at various radial positions within a solid reactant have been computed and illustrated for the specific case of the reduction of UO3 by hydrogen

  19. Multiphasic Reaction Modeling for Polypropylene Production in a Pilot-Scale Catalytic Reactor

    Directory of Open Access Journals (Sweden)

    Mohammad Jakir Hossain Khan

    2016-06-01

    In this study, a novel multiphasic model for the calculation of polypropylene production in a complicated hydrodynamic and physiochemical environment has been formulated, confirmed and validated. This is the first research attempt to describe the development of the dual-phasic phenomena, the impact of the optimal process conditions on the production rate of polypropylene and the fluidized bed dynamic details, which could be concurrently obtained after solving the model coupled with the CFD (computational fluid dynamics) model, the basic mathematical model and the moment equations. Furthermore, we have established a quantitative relationship between the operational conditions and the dynamic gas–solid behavior in actual reaction environments. Our results indicate that the proposed model can be applied to generalize the production rate of the polymer from a chemical procedure to pilot-scale chemical reaction engineering. The multiphasic model was improved by assuming that solids are present in the bubble phase and reactant gas in the emulsion phase, thus taking into account that polymerization takes place in the emulsion as well as in the bubble phase. It was observed that, over the experimental range of superficial gas velocity and Ziegler-Natta feed rate, the ratio of the polymer produced to the overall rate of production was approximately in the range of 9%–11%. This is a significant amount and it should not be ignored. We also carried out simulation studies comparing the CFD-dependent dual-phasic model, the emulsion phase model, the dynamic bubble model and the experimental results. It was noted that the improved dual-phasic model and the CFD model predicted narrower and safer operating windows under similar conditions compared with the experimental results. Our work is unique, as the integrated developed model is able to offer clearer ideas

  20. A review of reaction rates and thermodynamic and transport properties for the 11-species air model for chemical and thermal nonequilibrium calculations to 30000 K

    Science.gov (United States)

    Gupta, Roop N.; Yos, Jerrold M.; Thompson, Richard A.

    1989-01-01

    Reaction rate coefficients and thermodynamic and transport properties are provided for the 11-species air model which can be used for analyzing flows in chemical and thermal nonequilibrium. Such flows will likely occur around currently planned and future hypersonic vehicles. Guidelines for determining the state of the surrounding environment are provided. Approximate and more exact formulas are provided for computing the properties of partially ionized air mixtures in such environments.

  1. A Multilevel Adaptive Reaction-splitting Simulation Method for Stochastic Reaction Networks

    KAUST Repository

    Moraes, Alvaro; Tempone, Raul; Vilanova, Pedro

    2016-01-01

    In this work, we present a novel multilevel Monte Carlo method for kinetic simulation of stochastic reaction networks characterized by having simultaneously fast and slow reaction channels. To produce efficient simulations, our method adaptively classifies the reaction channels into fast and slow channels. To this end, we first introduce a state-dependent quantity named the level of activity of a reaction channel. Then, we propose a low-cost heuristic that allows us to adaptively split the set of reaction channels into two subsets characterized by either a high or a low level of activity. Based on a time-splitting technique, the increments associated with high-activity channels are simulated using the tau-leap method, while those associated with low-activity channels are simulated using an exact method. This path simulation technique is amenable to coupled path generation and a corresponding multilevel Monte Carlo algorithm. To estimate expected values of observables of the system at a prescribed final time, our method bounds the global computational error to be below a prescribed tolerance, TOL, within a given confidence level. This goal is achieved with a computational complexity of order O(TOL^(-2)), the same as with a pathwise-exact method, but with a smaller constant. We also present a novel low-cost control variate technique based on the stochastic time change representation by Kurtz, showing its performance on a numerical example. We present two numerical examples extracted from the literature that show how the reaction-splitting method obtains substantial gains with respect to the standard stochastic simulation algorithm and the multilevel Monte Carlo approach by Anderson and Higham. © 2016 Society for Industrial and Applied Mathematics.
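
    The time-splitting idea can be sketched in a few lines. The sketch below uses a channel's expected number of firings over the step, a_j(x)·tau, as a stand-in for the paper's level-of-activity quantity; the threshold, rates and stoichiometries are illustrative assumptions, not values from the paper:

```python
import numpy as np

def hybrid_step(x, props, stoich, tau, threshold, rng):
    """One time-splitting step of a reaction-splitting scheme (sketch).

    Channels whose expected firing count a_j(x) * tau exceeds `threshold`
    are treated as high-activity and advanced with a tau-leap (Poisson
    firing counts); the remaining low-activity channels are then advanced
    exactly with SSA over the same interval.
    """
    x = x.copy()
    a = np.array([p(x) for p in props])
    fast = a * tau > threshold
    for j in np.where(fast)[0]:          # tau-leap for high-activity channels
        x += rng.poisson(a[j] * tau) * stoich[j]
    t = 0.0                              # exact SSA for low-activity channels
    while True:
        a_slow = np.array([0.0 if fast[j] else props[j](x)
                           for j in range(len(props))])
        a0 = a_slow.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)
        if t > tau:
            break
        j = rng.choice(len(props), p=a_slow / a0)
        x += stoich[j]
    return np.maximum(x, 0)

# Illustrative birth-death network: fast constant production, slow degradation.
rng = np.random.default_rng(0)
x0 = np.array([0])
props = [lambda x: 50.0, lambda x: 0.1 * x[0]]
stoich = [np.array([1]), np.array([-1])]
x1 = hybrid_step(x0, props, stoich, tau=1.0, threshold=5.0, rng=rng)
```

    The full method additionally couples such paths across levels for the multilevel estimator; this sketch shows only the single-step splitting.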

  3. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  4. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over
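
    The core mechanism, packing user code together with its run-time variables and shipping both to a worker, can be sketched generically. The following is a local Python stand-in for illustration only, not mGrid's actual Matlab/PHP implementation:

```python
import pickle

def pack_task(func_source, func_name, variables):
    """Serialize user-defined code together with its run-time variables,
    the way mGrid packs a user's toolbox and workspace for a remote call
    (names and payload format here are illustrative)."""
    return pickle.dumps({"source": func_source, "name": func_name,
                         "vars": variables})

def run_task(payload):
    """What a remote worker would do: unpack, compile the shipped code,
    and execute it with the shipped variables."""
    task = pickle.loads(payload)
    namespace = {}
    exec(task["source"], namespace)  # define the user function locally
    return namespace[task["name"]](**task["vars"])

payload = pack_task("def scale(x, k):\n    return [k * v for v in x]\n",
                    "scale", {"x": [1, 2, 3], "k": 10})
result = run_task(payload)  # → [10, 20, 30]
```

    The point of the sketch is that the worker needs no pre-installed user library: code and data travel together, which is exactly the gap mGrid fills relative to the other environments listed above.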

  5. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, because data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  6. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work is commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), all while executing everything in between in a scalable, task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.

  7. Agent-based simulation of reactions in the crowded and structured intracellular environment: Influence of mobility and location of the reactants

    Directory of Open Access Journals (Sweden)

    Lapin Alexei

    2011-05-01

    Background: In this paper we apply a novel agent-based simulation method in order to model intracellular reactions in detail. The simulations are performed within a virtual cytoskeleton enriched with further crowding elements, which allows the analysis of molecular crowding effects on intracellular diffusion and reaction rates. The cytoskeleton network leads to a reduction in the mobility of molecules. Molecules can also unspecifically bind to membranes or the cytoskeleton, affecting (i) the fraction of unbound molecules in the cytosol and (ii) further reducing the mobility. Binding of molecules to intracellular structures or scaffolds can in turn lead to a microcompartmentalization of the cell. Especially the formation of enzyme complexes promoting metabolic channeling, e.g. in glycolysis, depends on the co-localization of the proteins. Results: While the co-localization of enzymes leads to faster reaction rates, the reduced mobility decreases the collision rate of reactants, hence reducing the reaction rate, as expected. This effect is most prominent in diffusion-limited reactions. Furthermore, anomalous diffusion can occur due to molecular crowding in the cell. In the context of diffusion-controlled reactions, anomalous diffusion leads to fractal reaction kinetics. The simulation framework is used to quantify and separate the effects originating from molecular crowding or the reduced mobility of the reactants. We were able to define three factors which describe the effective reaction rate, namely f_diff for the diffusion effect, f_volume for the crowding, and f_access for the reduced accessibility of the molecules. Conclusions: Molecule distributions, reaction rate constants and structural parameters can be adjusted separately in the simulation, allowing a comprehensive study of individual effects in the context of a realistic cell environment. As such, the present simulation can help to bridge the gap between in vivo and in vitro
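
    The three-factor description of the effective rate can be written down directly. The sketch assumes, as one simple reading of the abstract, that the factors combine multiplicatively; the numerical values are purely illustrative:

```python
def effective_rate(k_free, f_diff, f_volume, f_access):
    """Effective in vivo rate constant from the dilute-solution constant
    k_free and the three correction factors named in the abstract.
    Assumes the factors combine multiplicatively (an assumption of this
    sketch, not a statement from the paper); values are illustrative."""
    return k_free * f_diff * f_volume * f_access

# Hypothetical crowded-cytosol correction of a bimolecular rate constant.
k_eff = effective_rate(k_free=1.0e6, f_diff=0.4, f_volume=1.8, f_access=0.7)
```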

  8. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    Science.gov (United States)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations on commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Service (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
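
    The scaling claim (quadrupling cores from 16 to 64 cuts wall-clock time by more than half, with nearly linear speedup up to 64 cores) is easy to quantify from timing data. The wall-clock hours below are illustrative placeholders, not measurements from the paper:

```python
def scaling_summary(times):
    """Speedup and parallel efficiency relative to the smallest core count,
    from a mapping {cores: wall-clock hours}. Efficiency near 1.0 means
    nearly linear scaling; it drops as communication latency dominates."""
    base_cores = min(times)
    base_t = times[base_cores]
    return {c: {"speedup": base_t / t,
                "efficiency": (base_t / t) / (c / base_cores)}
            for c, t in sorted(times.items())}

# Illustrative timings for one simulated year on a virtual cluster.
summary = scaling_summary({16: 4.0, 32: 2.1, 64: 1.2})
```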

  9. Computer-assisted design for scaling up systems based on DNA reaction networks.

    Science.gov (United States)

    Aubert, Nathanaël; Mosca, Clément; Fujii, Teruo; Hagiya, Masami; Rondelez, Yannick

    2014-04-06

    In the past few years, there have been many exciting advances in the field of molecular programming, reaching a point where implementation of non-trivial systems, such as neural networks or switchable bistable networks, is a reality. Such systems require nonlinearity, be it through signal amplification, digitalization or the generation of autonomous dynamics such as oscillations. The biochemistry of DNA systems provides such mechanisms, but assembling them in a constructive manner is still a difficult and sometimes counterintuitive process. Moreover, realistic prediction of the actual evolution of concentrations over time requires a number of side reactions, such as leaks, cross-talks or competitive interactions, to be taken into account. In this case, the design of a system targeting a given function takes much trial and error before the correct architecture can be found. To speed up this process, we have created DNA Artificial Circuits Computer-Assisted Design (DACCAD), a computer-assisted design software that supports the construction of systems for the DNA toolbox. DACCAD is ultimately aimed to design actual in vitro implementations, which is made possible by building on the experimental knowledge available on the DNA toolbox. We illustrate its effectiveness by designing various systems, from Montagne et al.'s Oligator or Padirac et al.'s bistable system to new and complex networks, including a two-bit counter or a frequency divider as well as an example of very large system encoding the game Mastermind. In the process, we highlight a variety of behaviours, such as enzymatic saturation and load effect, which would be hard to handle or even predict with a simpler model. We also show that those mechanisms, while generally seen as detrimental, can be used in a positive way, as functional part of a design. Additionally, the number of parameters included in these simulations can be large, especially in the case of complex systems. For this reason, we included the

  10. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graphs: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).
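
    The kind of comparison the paper performs can be mimicked on a toy scale: sample a rate whose uncertainty is lognormal, report the recommended (median) rate with a 68% coverage interval, and normalize to a classical recommended rate. All parameter values below are illustrative, not values from the paper:

```python
import numpy as np

def mc_rate_summary(mu, sigma, classical_rate, n=100_000, seed=1):
    """Monte Carlo rate summary: sample a lognormally distributed reaction
    rate, return the median rate, a 68% coverage interval (16th-84th
    percentiles), and the ratio to a classical recommended rate.
    Parameters are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    samples = rng.lognormal(mean=mu, sigma=sigma, size=n)
    low, med, high = np.percentile(samples, [16, 50, 84])
    return med, (low, high), med / classical_rate

med, (low, high), ratio = mc_rate_summary(mu=0.0, sigma=0.5,
                                          classical_rate=1.2)
```

    The ratio plays the role of the paper's second graph type (new rate normalized to the previously recommended rate), while the interval width corresponds to the first (change in rate uncertainties).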

  11. One–pot synthesis and electrochemical properties of polyaniline nanofibers through simply tuning acid–base environment of reaction medium

    International Nuclear Information System (INIS)

    Li, Tao; Zhou, Yi; Liang, Banglei; Jin, Dandan; Liu, Na; Qin, Zongyi; Zhu, Meifang

    2017-01-01

    Highlights: • Presenting a facile one-pot approach to prepare polyaniline nanofibers through simply tuning the acid-base environment of the reaction medium. • Determining the role aniline oligomers play in the formation of polyaniline nanofibers. • Demonstrating the feasibility of polyaniline nanofibers as high-performance electrode materials for supercapacitors. -- Abstract: A facile and efficient one-pot approach is presented to prepare polyaniline (PANi) nanofibers through simply tuning the acid-base environment of the reaction medium, without the assistance of templates or use of organic solvents, in which aniline oligomers formed in the alkaline solution were used as "seeds" for the oriented growth of PANi chains under acidic conditions. The as-prepared PANi nanofibers were investigated by field-emission scanning electron microscopy, ultraviolet-visible spectroscopy, Fourier transform infrared spectroscopy and X-ray diffraction. Furthermore, the electrochemical properties were evaluated by cyclic voltammetry, galvanostatic charge-discharge tests, and electrochemical impedance spectroscopy. Particular attention was paid to the influence of the aniline concentrations in the alkaline and acidic reaction media on the morphology, microstructure and properties of the PANi nanofibers. It was found that the aniline concentration in the alkaline medium has a stronger impact on the electrical and electrochemical properties of the final products, whereas their morphologies depend mainly on the aniline concentration in the acidic solution. Moreover, PANi nanofibers prepared at aniline concentrations of 48 mM in the alkaline medium and 0.2 M in the acidic medium exhibit the largest specific capacitance of 857.2 F g(-1) at a scan rate of 5 mV s(-1), and a capacitance retention of 63.8% after 500 cycles. It is demonstrated that such a one-pot approach can provide a low-cost and environmentally friendly route to fabricate PANi nanofibers in fully aqueous solution as high

  12. Visual Perspectives within Educational Computer Games: Effects on Presence and Flow within Virtual Immersive Learning Environments

    Science.gov (United States)

    Scoresby, Jon; Shelton, Brett E.

    2011-01-01

    The mis-categorizing of cognitive states involved in learning within virtual environments has complicated instructional technology research. Further, most educational computer game research does not account for how learning activity is influenced by factors of game content and differences in viewing perspectives. This study is a qualitative…

  13. All-oxide Raman-active traps for light and matter: probing redox homeostasis model reactions in aqueous environment.

    Science.gov (United States)

    Alessandri, Ivano; Depero, L E

    2014-04-09

    Core-shell colloidal crystals can act as very efficient traps for light and analytes. Here it is shown that Raman-active probes can be realized using SiO2-TiO2 core-shell beads. These systems are successfully tested in monitoring the glutathione redox cycle at physiological concentration in an aqueous environment, without the need for any interfering enhancers. These materials represent a promising alternative to conventional, metal-based SERS probes for investigating chemical and biochemical reactions under real working conditions. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    Science.gov (United States)

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study in order to validate our proposal, which consists of monitoring footballers' heart rates during a football match. The real-time data acquired by these devices present a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  15. A Real-Time Reaction Obstacle Avoidance Algorithm for Autonomous Underwater Vehicles in Unknown Environments.

    Science.gov (United States)

    Yan, Zheping; Li, Jiyun; Zhang, Gengshi; Wu, Yi

    2018-02-02

    A novel real-time reaction obstacle avoidance algorithm (RRA) is proposed for autonomous underwater vehicles (AUVs) that must adapt to unknown complex terrains, based on forward looking sonar (FLS). To accomplish this algorithm, obstacle avoidance rules are planned, and the RRA process is split into five steps so that AUVs can rapidly respond to various environmental obstacles. The largest polar angle algorithm (LPAA) is designed to change a detected obstacle's irregular outline into a convex polygon, which simplifies the obstacle avoidance process. A solution based on an outline memory algorithm is designed to solve the trapping problem that arises when avoiding U-shaped obstacles. Finally, simulations in three unknown obstacle scenes are carried out to demonstrate the performance of this algorithm; the obtained obstacle avoidance trajectories are safe, smooth and near-optimal.
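    The LPAA outline-simplification step can be illustrated with a generic polar-angle convex hull (a textbook Graham scan over assumed 2-D sonar points); this is a stand-in for the idea, not the authors' implementation:

```python
import math

def convex_outline(points):
    """Collapse an irregular obstacle outline into a convex polygon.
    Graham scan: sort points by polar angle about the lowest point,
    then keep only left turns. Generic stand-in for the paper's LPAA."""
    pts = sorted(set(points))
    anchor = min(pts, key=lambda p: (p[1], p[0]))  # lowest, then leftmost
    rest = sorted((p for p in pts if p != anchor),
                  key=lambda p: math.atan2(p[1] - anchor[1], p[0] - anchor[0]))
    hull = [anchor]
    for p in rest:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
            if cross <= 0:      # right turn or collinear: drop the middle point
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

# A square outline with one interior point: the interior point is discarded
hull = convex_outline([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
```

Any point strictly inside the hull never appears in the simplified polygon, which is exactly why the convex outline makes avoidance planning cheaper.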

  16. Computer simulation games in population and education.

    Science.gov (United States)

    Moreland, R S

    1988-01-01

    Computer-based simulation games are effective training tools that have several advantages. They enable players to learn in a nonthreatening manner and develop strategies to achieve goals in a dynamic environment. They also provide visual feedback on the effects of players' decisions, encourage players to explore and experiment with options before making final decisions, and develop players' skills in analysis, decision making, and cooperation. 2 games have been developed by the Research Triangle Institute for public-sector planning agencies interested in or dealing with developing countries. The UN Population and Development Game teaches players about the interaction between population variables and the national economy and how population policies complement other national policies, such as education. The BRIDGES Education Planning Game focuses on the effects education has on national policies. In both games, the computer simulates the reactions of a fictional country's socioeconomic system to players' decisions. Players can change decisions after seeing their effects on a computer screen and thus can improve their performance in achieving goals.

  17. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  18. Computational Methods for Conformational Sampling of Biomolecules

    DEFF Research Database (Denmark)

    Bottaro, Sandro

    Proteins play a fundamental role in virtually every process within living organisms. For example, some proteins act as enzymes, catalyzing a wide range of reactions necessary for life, others mediate the cell's interaction with the surrounding environment, and still others have regulatory functions ... Firstly, we have developed a mathematical approach to a classic geometrical problem in protein simulations, and demonstrated its superiority compared to existing approaches. Secondly, we have constructed a more accurate implicit model of the aqueous environment, which is of fundamental importance in protein chemistry. This model is computationally much faster than models where water molecules are represented explicitly. Finally, in collaboration with the group of structural bioinformatics at the Department of Biology (KU), we have applied these techniques in the context of modeling of protein structure and flexibility from low...

  19. Development of a reaction cell for in-situ/operando studies of surface of a catalyst under a reaction condition and during catalysis

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Luan; Tao, Franklin, E-mail: franklin.tao.2011@gmail.com [Department of Chemistry and Biochemistry, University of Notre Dame, Notre Dame, Indiana 46556 (United States); Department of Chemical and Petroleum Engineering, University of Kansas, Lawrence, Kansas 66045 (United States)

    2016-06-15

    Tracking the surface chemistry of a catalyst during catalysis is significant for a fundamental understanding of its catalytic performance, since it allows establishing an intrinsic correlation between the surface chemistry of a catalyst at its working status and its corresponding catalytic performance. Ambient pressure X-ray photoelectron spectroscopy can be used for in-situ studies of surfaces of different materials or devices in a gas. To simulate the gaseous environment of a catalyst in a fixed-bed reactor, a flowing gaseous environment of reactants around the catalyst is necessary. Here, we report the development of a new flowing reaction cell for the in-situ study of a catalyst surface under a reaction condition in the gas of one reactant, or during catalysis in a mixture of reactants of a catalytic reaction. The homemade reaction cell is installed in the high vacuum (HV) or ultrahigh vacuum (UHV) environment of a chamber. The flowing gas in the reaction cell is separated from the HV or UHV environment through seals at three interfaces: between the reaction cell and the X-ray window, the sample door, and the aperture of the front cone of an energy analyzer. The catalyst in the cell is heated by an infrared laser beam introduced through fiber optics interfaced with the reaction cell through a homemade feedthrough. The highly localized heating on the sample holder and the Au-passivated internal surface of the reaction cell effectively minimize any unwanted reactions potentially catalyzed by the reaction cell itself. The incorporated laser heating allows fast heating and high thermal stability of the sample at high temperature. With this cell, a catalyst at 800 °C in a flowing gas can be tracked readily.

  20. Development of a reaction cell for in-situ/operando studies of surface of a catalyst under a reaction condition and during catalysis

    International Nuclear Information System (INIS)

    Nguyen, Luan; Tao, Franklin

    2016-01-01

    Tracking the surface chemistry of a catalyst during catalysis is significant for a fundamental understanding of its catalytic performance, since it allows establishing an intrinsic correlation between the surface chemistry of a catalyst at its working status and its corresponding catalytic performance. Ambient pressure X-ray photoelectron spectroscopy can be used for in-situ studies of surfaces of different materials or devices in a gas. To simulate the gaseous environment of a catalyst in a fixed-bed reactor, a flowing gaseous environment of reactants around the catalyst is necessary. Here, we report the development of a new flowing reaction cell for the in-situ study of a catalyst surface under a reaction condition in the gas of one reactant, or during catalysis in a mixture of reactants of a catalytic reaction. The homemade reaction cell is installed in the high vacuum (HV) or ultrahigh vacuum (UHV) environment of a chamber. The flowing gas in the reaction cell is separated from the HV or UHV environment through seals at three interfaces: between the reaction cell and the X-ray window, the sample door, and the aperture of the front cone of an energy analyzer. The catalyst in the cell is heated by an infrared laser beam introduced through fiber optics interfaced with the reaction cell through a homemade feedthrough. The highly localized heating on the sample holder and the Au-passivated internal surface of the reaction cell effectively minimize any unwanted reactions potentially catalyzed by the reaction cell itself. The incorporated laser heating allows fast heating and high thermal stability of the sample at high temperature. With this cell, a catalyst at 800 °C in a flowing gas can be tracked readily.

  1. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.
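    The accumulated-dose bookkeeping described above (continuous chronic exposure to a decaying deposit, with a first-year committed dose as a special case of the multi-year integral) has a simple closed form under exponential decay. The activity, half-life and dose factor below are invented for illustration and are not PABLM's tabulated values:

```python
import math

def accumulated_dose(initial_activity_bq, half_life_y, dose_factor_sv_per_bq_y, years):
    """Dose integrated over `years` from a deposit decaying with constant
    lambda = ln(2)/T_half:  D = f * A0 * (1 - exp(-lambda*t)) / lambda.
    Illustrative only; PABLM's pathway models are far more detailed."""
    lam = math.log(2) / half_life_y
    return dose_factor_sv_per_bq_y * initial_activity_bq * (1 - math.exp(-lam * years)) / lam

# First-year committed dose vs. a 50-year integrated dose, Cs-137-like half-life
first_year = accumulated_dose(1e6, 30.1, 1e-12, 1)
fifty_year = accumulated_dose(1e6, 30.1, 1e-12, 50)
```

Because the integrand decays, each additional year contributes less than the last, which is why the report distinguishes the first-year dose from the longer integrated dose.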

  2. PABLM: a computer program to calculate accumulated radiation doses from radionuclides in the environment

    International Nuclear Information System (INIS)

    Napier, B.A.; Kennedy, W.E. Jr.; Soldat, J.K.

    1980-03-01

    A computer program, PABLM, was written to facilitate the calculation of internal radiation doses to man from radionuclides in food products and external radiation doses from radionuclides in the environment. This report contains details of mathematical models used and calculational procedures required to run the computer program. Radiation doses from radionuclides in the environment may be calculated from deposition on the soil or plants during an atmospheric or liquid release, or from exposure to residual radionuclides in the environment after the releases have ended. Radioactive decay is considered during the release of radionuclides, after they are deposited on the plants or ground, and during holdup of food after harvest. The radiation dose models consider several exposure pathways. Doses may be calculated for either a maximum-exposed individual or for a population group. The doses calculated are accumulated doses from continuous chronic exposure. A first-year committed dose is calculated as well as an integrated dose for a selected number of years. The equations for calculating internal radiation doses are derived from those given by the International Commission on Radiological Protection (ICRP) for body burdens and MPC's of each radionuclide. The radiation doses from external exposure to contaminated water and soil are calculated using the basic assumption that the contaminated medium is large enough to be considered an infinite volume or plane relative to the range of the emitted radiations. The equations for calculations of the radiation dose from external exposure to shoreline sediments include a correction for the finite width of the contaminated beach.

  3. The Influence of Trainee Gaming Experience and Computer Self-Efficacy on Learner Outcomes of Videogame-Based Learning Environments

    National Research Council Canada - National Science Library

    Orvis, Karin A; Orvis, Kara L; Belanich, James; Mullin, Laura N

    2005-01-01

    The purpose of the current research was to investigate the influence of two trainee characteristics, prior videogame experience and computer self-efficacy, on learner outcomes of a videogame-based training environment...

  4. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    Science.gov (United States)

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. 
Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at
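    The per-voxel building block of an RDME-style simulator like the one described above is a stochastic simulation algorithm for a well-mixed reaction system. A minimal textbook direct-method SSA (not URDME's actual routines) can be sketched as:

```python
import random

def gillespie(propensities, stoich, x0, t_end, seed=1):
    """Direct-method SSA for a well-mixed reaction system.
    propensities[i](x) -> rate of reaction i in state x;
    stoich[i] -> the state change applied when reaction i fires.
    Generic textbook sketch, not URDME's internal code."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    while t < t_end:
        props = [a(x) for a in propensities]
        total = sum(props)
        if total == 0.0:                 # no reaction can fire any more
            break
        t += rng.expovariate(total)      # exponentially distributed waiting time
        u, acc = rng.random() * total, 0.0
        for i, ai in enumerate(props):   # choose reaction i with prob a_i/total
            acc += ai
            if u < acc:
                x = [xj + sj for xj, sj in zip(x, stoich[i])]
                break
    return x

# Pure-death process A -> 0 with propensity 0.5*A: runs to extinction
final_state = gillespie([lambda x: 0.5 * x[0]], [(-1,)], [100], t_end=1e9)
```

A spatial RDME solver runs many such voxels and adds diffusive jump "reactions" between neighbouring mesh cells, which is where the unstructured-mesh machinery comes in.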

  5. Aerosol transport simulations in indoor and outdoor environments using computational fluid dynamics (CFD)

    Science.gov (United States)

    Landazuri, Andrea C.

    This dissertation focuses on aerosol transport modeling in occupational environments and mining sites in Arizona using computational fluid dynamics (CFD). The impacts of human exposure in both environments are explored with the emphasis on turbulence, wind speed, wind direction and particle sizes. Final emissions simulations involved the digitalization process of available elevation contour plots of one of the mining sites to account for realistic topographical features. The digital elevation map (DEM) of one of the sites was imported to COMSOL Multiphysics® for subsequent turbulence and particle simulations. Simulation results that include realistic topography show considerable deviations of wind direction. Inter-element correlation results using metal and metalloid size-resolved concentration data from a Micro-Orifice Uniform Deposit Impactor (MOUDI) under given wind speeds and directions provided guidance on groups of metals that coexist throughout mining activities. Groups between Fe-Mg, Cr-Fe, Al-Sc, Sc-Fe, and Mg-Al are strongly correlated for unrestricted wind directions and speeds, suggesting that the source may be of soil origin (e.g. ore and tailings); also, groups of elements where Cu is present, in the coarse fraction range, may come from mechanical-action mining activities and the saltation phenomenon. Besides, MOUDI data under low wind speeds ... Computational Fluid Dynamics can be used as a source apportionment tool to identify areas that have an effect over specific sampling points and susceptible regions under certain meteorological conditions, and these conclusions can be supported with inter-element correlation matrices and lead isotope analysis, especially since there is limited access to the mining sites. 
Additional results concluded that grid adaptation is a powerful tool that allows refining specific regions that require a lot of detail and therefore better resolves flow detail, and provides a higher number of locations with monotonic convergence than the

  6. Chemistry in interstellar space. [environment characteristics influencing reaction dynamics

    Science.gov (United States)

    Donn, B.

    1973-01-01

    The particular characteristics of chemistry in interstellar space are determined by the unique environmental conditions involved. Interstellar matter is present at extremely low densities. Large deviations from thermodynamic equilibrium are, therefore, to be expected. A relatively intense ultraviolet radiation is present in many regions. The temperatures are in the range from 5 to 200 K. Data concerning the inhibiting effect of small activation energies in interstellar clouds are presented in a table. A summary of measured activation energies or barrier heights for exothermic exchange reactions is also provided. Problems of molecule formation are discussed, taking into account gas phase reactions and surface catalyzed processes.

  7. Efficient and Adaptive Methods for Computing Accurate Potential Surfaces for Quantum Nuclear Effects: Applications to Hydrogen-Transfer Reactions.

    Science.gov (United States)

    DeGregorio, Nicole; Iyengar, Srinivasan S

    2018-01-09

    We present two sampling measures to gauge critical regions of potential energy surfaces. These sampling measures employ (a) the instantaneous quantum wavepacket density, an approximation to the (b) potential surface, its (c) gradients, and (d) a Shannon information theory based expression that estimates the local entropy associated with the quantum wavepacket. These four criteria together enable a directed sampling of potential surfaces that appears to correctly describe the local oscillation frequencies, or the local Nyquist frequency, of a potential surface. The sampling functions are then utilized to derive a tessellation scheme that discretizes the multidimensional space to enable efficient sampling of potential surfaces. The sampled potential surface is then combined with four different interpolation procedures, namely, (a) local Hermite curve interpolation, (b) low-pass filtered Lagrange interpolation, (c) the monomial symmetrization approximation (MSA) developed by Bowman and co-workers, and (d) a modified Shepard algorithm. The sampling procedure and the fitting schemes are used to compute (a) potential surfaces in highly anharmonic hydrogen-bonded systems and (b) study hydrogen-transfer reactions in biogenic volatile organic compounds (isoprene) where the transferring hydrogen atom is found to demonstrate critical quantum nuclear effects. In the case of isoprene, the algorithm discussed here is used to derive multidimensional potential surfaces along a hydrogen-transfer reaction path to gauge the effect of quantum-nuclear degrees of freedom on the hydrogen-transfer process. Based on the decreased computational effort, facilitated by the optimal sampling of the potential surfaces through the use of sampling functions discussed here, and the accuracy of the associated potential surfaces, we believe the method will find great utility in the study of quantum nuclear dynamics problems, of which application to hydrogen-transfer reactions and hydrogen
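    Criterion (d) above, a Shannon-information-style weight computed from the wavepacket density, can be sketched over a discretized density; the exact expression used by the authors is not reproduced here, so the weight below (the pointwise entropy contribution -p log p) is an illustrative assumption:

```python
import math

def local_entropy_weight(density, eps=1e-12):
    """Per-cell Shannon entropy contribution -p*log(p) of a discretized
    (normalized) wavepacket density. Cells with small occupation get a
    larger weight, directing samples toward poorly resolved regions.
    Illustrative stand-in for the paper's criterion (d)."""
    total = sum(density)
    probs = [d / total for d in density]
    return [-p * math.log(p + eps) for p in probs]

flat = local_entropy_weight([1, 1, 1, 1])   # uniform density: equal weights
peaked = local_entropy_weight([9, 1])       # low-density cell weighted more
```

Combined with the surface value and gradient criteria, such a weight can steer samples toward regions where the local oscillation of the potential is under-resolved.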

  8. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI tool which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed‐language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool is complementing other ongoing projects such as IBM’s High‐Performance Compiler for Java (HPCJ and IceT’s metacomputing environment.

  9. Detecting and Understanding the Impact of Cognitive and Interpersonal Conflict in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Prata, David Nadler; Baker, Ryan S. J. d.; Costa, Evandro d. B.; Rose, Carolyn P.; Cui, Yue; de Carvalho, Adriana M. J. B.

    2009-01-01

    This paper presents a model which can automatically detect a variety of student speech acts as students collaborate within a computer supported collaborative learning environment. In addition, an analysis is presented which gives substantial insight as to how students' learning is associated with students' speech acts, knowledge that will…

  10. SkyNet: A Modular Nuclear Reaction Network Library

    Science.gov (United States)

    Lippuner, Jonas; Roberts, Luke F.

    2017-12-01

    Almost all of the elements heavier than hydrogen that are present in our solar system were produced by nuclear burning processes either in the early universe or at some point in the life cycle of stars. In all of these environments, there are dozens to thousands of nuclear species that interact with each other to produce successively heavier elements. In this paper, we present SkyNet, a new general-purpose nuclear reaction network that evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. SkyNet is free and open source, and aims to be easy to use and flexible. Any list of isotopes can be evolved, and SkyNet supports different types of nuclear reactions. SkyNet is modular so that new or existing physics, like nuclear reactions or equations of state, can easily be added or modified. Here, we present in detail the physics implemented in SkyNet with a focus on a self-consistent transition to and from nuclear statistical equilibrium to non-equilibrium nuclear burning, our implementation of electron screening, and coupling of the network to an equation of state. We also present comprehensive code tests and comparisons with existing nuclear reaction networks. We find that SkyNet agrees with published results and other codes to an accuracy of a few percent. Discrepancies, where they exist, can be traced to differences in the physics implementations.
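    The abundance-evolution idea, a stiff ODE system advanced with implicit time steps, can be illustrated on a toy one-reaction network; a real network like SkyNet couples thousands of species and reactions, so this is only a sketch under invented rates:

```python
def evolve_abundances(ya, yb, rate, dt, steps):
    """Toy single-reaction network A -> B with constant rate r:
    dYa/dt = -r*Ya, dYb/dt = +r*Ya, advanced with the backward-Euler
    step typical of stiff reaction networks. Illustrative only; this
    is not SkyNet's actual solver."""
    total = ya + yb                      # total abundance is conserved
    for _ in range(steps):
        ya = ya / (1.0 + rate * dt)      # implicit: ya_new = ya_old - r*dt*ya_new
        yb = total - ya                  # conservation fixes yb
    return ya, yb

ya_final, yb_final = evolve_abundances(1.0, 0.0, rate=1.0, dt=0.1, steps=100)
```

The implicit update stays stable even when `rate * dt` is large, which is the practical reason reaction networks use implicit integrators rather than explicit ones.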

  11. An experimental and theoretical study of reaction steps relevant to the methanol-to-hydrocarbons reaction

    Energy Technology Data Exchange (ETDEWEB)

    Svelle, Stian

    2004-07-01

    The primary objective of the present work is to obtain new insight into the reaction mechanism of the zeolite-catalyzed methanol-to-hydrocarbons (MTH) reaction. It was decided to use both experimental and computational techniques to reach this goal. An investigation of the n-butene + methanol system was therefore initiated. Over time, it became apparent that it was possible to determine the rate for the methylation of n-butene by methanol. The ethene and propene systems were therefore reexamined in order to collect kinetic information also for those cases. With the development of user-friendly quantum chemistry programs such as the Gaussian suite of programs, the possibility of applying quantum chemical methods to many types of problems has become readily available even for non-experts. When performing mechanistic studies, there is quite often a considerable synergy effect when combining experimental and computational approaches. The methylation reactions mentioned above turned out to be an issue well suited for quantum chemical investigations. The incentive for examining the halomethane reactivity was the clear analogy to the MTH reaction system. Alkene dimerization was also a reaction readily examined with quantum chemistry. As discussed in the introduction of this thesis, polymethylbenzenes, or their cationic counterparts, are suspected to be key intermediates in the MTH reaction. It was therefore decided to investigate the intrinsic reactivity of these species in the gas phase by employing sophisticated mass spectrometric (MS) techniques in collaboration with the MS group at the Department of Chemistry, University of Oslo. The data thus obtained will also be compared with results from an ongoing computational study on gas phase polymethylbenzenium reactivity. Six papers presenting various studies are included. The titles are: 1) A Theoretical Investigation of the Methylation of Alkenes with Methanol over Acidic Zeolites. 2) A Theoretical Investigation of the

  12. Characterization of Aerodynamic Interactions with the Mars Science Laboratory Reaction Control System Using Computation and Experiment

    Science.gov (United States)

    Schoenenberger, Mark; VanNorman, John; Rhode, Matthew; Paulson, John

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory (MSL) entry capsule successfully entered Mars' atmosphere and landed the Curiosity rover in Gale Crater. The capsule used a reaction control system (RCS) consisting of four pairs of hydrazine thrusters to fly a guided entry. The RCS provided bank control to fly along a flight path commanded by an onboard computer and also damped unwanted rates due to atmospheric disturbances and any dynamic instabilities of the capsule. A preliminary assessment of the MSL's flight data from entry showed that the capsule flew much as predicted. This paper will describe how the MSL aerodynamics team used engineering analyses, computational codes and wind tunnel testing in concert to develop the RCS system and certify it for flight. Over the course of MSL's development, the RCS configuration underwent a number of design iterations to accommodate mechanical constraints, aeroheating concerns and excessive aero/RCS interactions. A brief overview of the MSL RCS configuration design evolution is provided. Then, a brief description is presented of how the computational predictions of RCS jet interactions were validated. The primary work to certify that the RCS interactions were acceptable for flight was centered on validating computational predictions at hypersonic speeds. A comparison of computational fluid dynamics (CFD) predictions to wind tunnel force and moment data gathered in the NASA Langley 31-Inch Mach 10 Tunnel was the linchpin to validating the CFD codes used to predict aero/RCS interactions. Using the CFD predictions and experimental data, an interaction model was developed for Monte Carlo analyses using 6-degree-of-freedom trajectory simulation. The interaction model used in the flight simulation is presented.

  13. Modelling combustion reactions for gas flaring and its resulting emissions

    Directory of Open Access Journals (Sweden)

    O. Saheed Ismail

    2016-07-01

    Full Text Available Flaring of associated petroleum gas is an age-long environmental concern which remains unabated. Flaring of gas may be a very efficient combustion process, especially in steam/air-assisted flares, and more economical than utilization in some oil fields. However, it has serious implications for the environment. This study considered different reaction types and operating conditions for gas flaring. Six combustion equations were generated using the mass balance concept with varying air and combustion efficiency. These equations were coded in a computer program using 12 natural gas samples of different chemical composition and origin to predict the pattern of emission species from gas flaring. The effect of key parameters on the emission output is also shown. CO2, CO, NO, NO2 and SO2 are the anticipated non-hydrocarbon emissions of environmental concern. Results show that the quantity and pattern of these chemical species depended on the percentage excess/deficiency of stoichiometric air, natural gas type, reaction type, carbon mass content, impurities, combustion efficiency of the flare system, etc. These emissions degrade the environment and human life, so knowing the emission types, patterns and flaring conditions that this study predicts is of paramount importance to governments, environmental agencies and the oil and gas industry.
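    A single instance of the mass-balance equations described above, complete combustion of pure methane with excess air and no CO/NOx/SO2 chemistry, can be written out directly. The study's program generalizes this to multi-component gas samples and six reaction types; the function below is only the limiting complete-combustion case:

```python
def methane_combustion(excess_air_frac):
    """Product moles per mole of CH4 for the complete-combustion balance
    CH4 + 2(1+e)(O2 + 3.76 N2) -> CO2 + 2 H2O + 2e O2 + 7.52(1+e) N2,
    where e is the fractional excess air. Illustrative single-fuel case,
    not the study's full six-equation model."""
    e = excess_air_frac
    return {
        "CO2": 1.0,
        "H2O": 2.0,
        "O2": 2.0 * e,                 # unreacted excess oxygen
        "N2": 7.52 * (1.0 + e),        # inert nitrogen carried with the air
    }

products = methane_combustion(0.1)     # 10% excess air
```

With air deficiency (e < 0) the balance instead produces CO and unburned hydrocarbons, which is how sub-stoichiometric flaring generates the emissions of concern.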

  14. Performative Environments

    DEFF Research Database (Denmark)

    Thomsen, Bo Stjerne

    2008-01-01

    The paper explores how performative architecture can act as a collective environment localizing urban flows and establishing public domains through the integration of pervasive computing and animation techniques. The NoRA project introduces the concept of 'performative environments,' focusing on ... of local interactions and network behaviour; building becomes social infrastructure and prompts an understanding of architectural structures as quasi-objects, which can retain both variation and recognisability in changing social constellations.

  15. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to poor standardization of patients' medical data and a lack of computable medical drug knowledge, the specificity of computerized decision support systems for early ADR detection is too low, and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summary of Product Characteristics (SmPCs) and link it with structured patient data to generate safety signals automatically, with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC) and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients), who underwent intensive ADR surveillance. The specificity increased from 7% without the ADR-KB up to 73% in internal patients, and from 19.6% up to 91% in paediatric inpatients, respectively. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
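    The contextual linkage described above (a LOINC-coded abnormal lab result joined to a WHO-ART-style ADR concept and the patient's ATC-coded medication) can be sketched as a small lookup. Every code, threshold and mapping below is invented for illustration and is not taken from the actual ADR-KB:

```python
# Hypothetical mini knowledge base: (LOINC code, out-of-range direction)
# -> (ADR concept, ATC codes of drugs listed as suspects in their SmPC).
ADR_KB = {
    ("1742-6", "high"): ("Hepatic enzymes increased", {"J01CA04"}),  # ALT
    ("718-7", "low"): ("Anaemia", {"L01BA01"}),                      # haemoglobin
}

def screen(med_atc_codes, lab_results, reference_ranges):
    """Return ADR signals where an abnormal lab value links, via the
    knowledge base, to a drug on the patient's medication list."""
    signals = []
    for loinc, value in lab_results.items():
        low, high = reference_ranges[loinc]
        direction = "high" if value > high else "low" if value < low else None
        if direction and (loinc, direction) in ADR_KB:
            concept, suspects = ADR_KB[(loinc, direction)]
            hits = suspects & med_atc_codes
            if hits:
                signals.append((concept, sorted(hits)))
    return signals

signals = screen({"J01CA04"}, {"1742-6": 120.0}, {"1742-6": (0.0, 50.0)})
```

Requiring both an abnormal lab and a matching suspect drug is what raises specificity over flagging every abnormal result, mirroring the gains reported in the study.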

  16. Flows and chemical reactions in heterogeneous mixtures

    CERN Document Server

    Prud'homme, Roger

    2014-01-01

    This book - a sequel to the previous publications 'Flows and Chemical Reactions' and 'Chemical Reactions in Flows and Homogeneous Mixtures' - is devoted to flows with chemical reactions in heterogeneous environments. Heterogeneous media in this volume include interfaces and lines, which may be the site of radiation. Each type of flow is the subject of a chapter in this volume. Chapter 1 considers the generation of two-phase media: dusty gases, mists, and bubble flows. Chapter 2 is devoted to the study, at the mesoscopic scale, of particle-fluid exchange of momentum…

  17. Implementing interactive computing in an object-oriented environment

    Directory of Open Access Journals (Sweden)

    Frederic Udina

    2000-04-01

    Full Text Available Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented that computes and displays frequency-based piecewise linear density estimators such as histograms or frequency polygons.
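    The directed-graph control of computational flow can be sketched in a few lines: each node caches its result, a change at a source propagates staleness to its dependents, and requesting a value recomputes only the stale nodes. The `Node` class below is a hypothetical illustration of the idea, not the paper's implementation (which targets an object-oriented language with symbolic manipulation).

```python
# Computational flow as a directed graph: recompute only what is stale.
class Node:
    def __init__(self, compute, deps=()):
        self.compute, self.deps = compute, list(deps)
        self.cache, self.stale = None, True
        self.dependents = []
        for d in self.deps:
            d.dependents.append(self)

    def invalidate(self):
        """Propagate staleness downstream (e.g. after a GUI input changes)."""
        if not self.stale:
            self.stale = True
            for n in self.dependents:
                n.invalidate()

    def value(self):
        """Recompute only if stale; otherwise reuse the cached result."""
        if self.stale:
            self.cache = self.compute(*[d.value() for d in self.deps])
            self.stale = False
        return self.cache

data = Node(lambda: [1, 2, 3, 4])
total = Node(lambda xs: sum(xs), deps=[data])
mean = Node(lambda s, xs: s / len(xs), deps=[total, data])
print(mean.value())   # 2.5
data.compute = lambda: [10, 20]   # the GUI supplies new input data
data.invalidate()
print(mean.value())   # 15.0 -- only the stale chain data -> total -> mean reran
```

    The same pattern extends to display nodes: a plot node depending on an estimator node is redrawn only when the estimator's inputs actually changed.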

  18. Hydrogen production from water gas shift reaction in a high gravity (Higee) environment using a rotating packed bed

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Wei-Hsin; Syu, Yu-Jhih [Department of Greenergy, National University of Tainan, Tainan 700 (China)

    2010-10-15

    Hydrogen production via the water gas shift reaction (WGSR) was investigated in a high gravity environment. A rotating packed bed (RPB) reactor containing a Cu-Zn catalyst and spinning in the range of 0-1800 rpm was used to create high centrifugal force. The reaction temperature and the steam/CO ratio ranged from 250 to 350 °C and 2 to 8, respectively. A dimensionless parameter, the G number, was derived to account for the effect of centrifugal force on the enhancement of the WGSR. At a rotor speed of 1800 rpm, the induced centrifugal force acting on the reactants was as high as 234 g on average in the RPB. As a result, the CO conversion from the WGSR was increased by up to 70% compared with that without rotation. This clearly revealed that the centrifugal force was conducive to hydrogen production, as it intensified mass transfer and elongated the path of the reactants in the catalyst bed. From Le Chatelier's principle, a higher reaction temperature or a lower steam/CO ratio disfavors CO conversion; however, under such conditions the enhancement of hydrogen production from the WGSR by the centrifugal force tended to become more significant. Accordingly, a correlation between the enhancement of CO conversion and the G number was established. As a whole, the higher the reaction temperature and the lower the steam/CO ratio, the higher the exponent of the G number function and the greater the enhancement of the WGSR by the centrifugal force. (author)
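    For orientation, the centrifugal level quoted above follows from a = ω²r. The sketch below computes the mean acceleration across an assumed bed annulus in multiples of g; the radii are guesses chosen to land near the ~234 g reported at 1800 rpm, since neither the reactor dimensions nor the paper's exact G-number definition are given in the abstract.

```python
import math

def mean_g_number(rpm, r_inner_m, r_outer_m, g=9.81):
    """Average centrifugal acceleration over the packed bed, in multiples of g.

    Uses a = omega^2 * r at the arithmetic-mean radius -- a simplifying
    assumption; the paper's G number is a derived dimensionless group that
    need not coincide with this estimate.
    """
    omega = 2 * math.pi * rpm / 60.0          # angular speed, rad/s
    r_mean = 0.5 * (r_inner_m + r_outer_m)    # mean bed radius, m
    return omega**2 * r_mean / g

# Illustrative bed radii (4-9 cm, an assumption): 1800 rpm gives roughly 230-240 g.
print(round(mean_g_number(1800, 0.04, 0.09), 1))
```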

  19. Pericyclic reactions in an aqueous molecular flask.

    Science.gov (United States)

    Murase, Takashi; Fujita, Makoto

    2010-10-01

    A self-assembled molecular flask with a nanometer-sized restricted cavity offers a new reaction environment that is quite different from the bulk solution. The self-assembled cage accommodates a pair of hydrophobic molecules to perform unusual Diels-Alder reactions and [2+2] photoadditions of otherwise unreactive aromatic molecules. In this cage, for example, the Diels-Alder reaction of naphthalene proceeds smoothly under mild conditions, and aceanthrylene shows reactivity for both [2+2] and [2+4] cycloadditions via the identical ternary host-guest complex. The observed greatly enhanced reactivity stems from the increased local concentration and pre-organization of the substrate pair within the cage, which reduces the entropic cost and switches the reaction profile from a bimolecular to a pseudo-intramolecular reaction pathway. The reinforced orientation and arrangement of substrate pairs specify regio- and stereo-selectivities of the subsequent reactions in the cavity. Chiral auxiliaries outside the cage create the inner chiral environment and induce asymmetric reactions inside the cage (up to 50% ee). © 2010 The Japan Chemical Journal Forum and Wiley Periodicals, Inc.

  20. Validating the Accuracy of Reaction Time Assessment on Computer-Based Tablet Devices.

    Science.gov (United States)

    Schatz, Philip; Ybarra, Vincent; Leitner, Donald

    2015-08-01

    Computer-based assessment has evolved to tablet-based devices. Despite the availability of tablets and "apps," there is limited research validating their use. We documented timing delays between stimulus presentation and (simulated) touch response on iOS devices (3rd- and 4th-generation Apple iPads) and Android devices (Kindle Fire, Google Nexus, Samsung Galaxy) at response intervals of 100, 250, 500, and 1,000 milliseconds (ms). Results showed significantly greater timing error on Google Nexus and Samsung tablets (81-97 ms), than Kindle Fire and Apple iPads (27-33 ms). Within Apple devices, iOS 7 obtained significantly lower timing error than iOS 6. Simple reaction time (RT) trials (250 ms) on tablet devices represent 12% to 40% error (30-100 ms), depending on the device, which decreases considerably for choice RT trials (3-5% error at 1,000 ms). Results raise implications for using the same device for serial clinical assessment of RT using tablets, as well as the need for calibration of software and hardware. © The Author(s) 2015.
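    The percentage figures quoted above are simply the device delay expressed as a fraction of the trial's response interval, which a quick check confirms:

```python
# Timing error as a percentage of the response interval.
def timing_error_pct(delay_ms, interval_ms):
    return 100.0 * delay_ms / interval_ms

# 30-100 ms of device delay on a 250 ms simple-RT trial:
print(timing_error_pct(30, 250), timing_error_pct(100, 250))    # 12.0 40.0
# 30-50 ms of delay on a 1,000 ms choice-RT trial:
print(timing_error_pct(30, 1000), timing_error_pct(50, 1000))   # 3.0 5.0
```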

  1. Sleeve reaction chamber system

    Science.gov (United States)

    Northrup, M Allen [Berkeley, CA; Beeman, Barton V [San Mateo, CA; Benett, William J [Livermore, CA; Hadley, Dean R [Manteca, CA; Landre, Phoebe [Livermore, CA; Lehew, Stacy L [Livermore, CA; Krulevitch, Peter A [Pleasanton, CA

    2009-08-25

    A chemical reaction chamber system that combines devices such as doped polysilicon for heating, bulk silicon for convective cooling, and thermoelectric (TE) coolers to augment the heating and cooling rates of the reaction chamber or chambers. In addition the system includes non-silicon-based reaction chambers such as any high thermal conductivity material used in combination with a thermoelectric cooling mechanism (i.e., Peltier device). The heat contained in the thermally conductive part of the system can be used/reused to heat the device, thereby conserving energy and expediting the heating/cooling rates. The system combines a micromachined silicon reaction chamber, for example, with an additional module/device for augmented heating/cooling using the Peltier effect. This additional module is particularly useful in extreme environments (very hot or extremely cold) where augmented heating/cooling would be useful to speed up the thermal cycling rates. The chemical reaction chamber system has various applications for synthesis or processing of organic, inorganic, or biochemical reactions, including the polymerase chain reaction (PCR) and/or other DNA reactions, such as the ligase chain reaction.

  2. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  3. The Use of Engineering Design Concept for Computer Programming Course: A Model of Blended Learning Environment

    Science.gov (United States)

    Tritrakan, Kasame; Kidrakarn, Pachoen; Asanok, Manit

    2016-01-01

    The aim of this research is to develop a learning model which blends factors from learning environment and engineering design concept for learning in computer programming course. The usage of the model was also analyzed. This study presents the design, implementation, and evaluation of the model. The research methodology is divided into three…

  4. Thermodynamic criteria for estimating the kinetic parameters of catalytic reactions

    Science.gov (United States)

    Mitrichev, I. I.; Zhensa, A. V.; Kol'tsova, E. M.

    2017-01-01

    Kinetic parameters are estimated using two criteria in addition to the traditional criterion that considers the consistency between experimental and modeled conversion data: thermodynamic consistency and the consistency with entropy production (i.e., the absolute rate of the change in entropy due to exchange with the environment is consistent with the rate of entropy production in the steady state). A special procedure is developed and executed on a computer to achieve the thermodynamic consistency of a set of kinetic parameters with respect to both the standard entropy of a reaction and the standard enthalpy of a reaction. A problem of multi-criterion optimization, reduced to a single-criterion problem by summing weighted values of the three criteria listed above, is solved. Using the reaction of NO reduction with CO on a platinum catalyst as an example, it is shown that the set of parameters proposed by D.B. Mantri and P. Aghalayam gives much worse agreement with experimental values than the set obtained on the basis of three criteria: the sum of the squares of deviations for conversion, the thermodynamic consistency, and the consistency with entropy production.
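    The reduction of the multi-criterion problem to a single criterion by summing weighted values, as described above, can be sketched as follows. The criterion values and weights are illustrative assumptions, not the paper's.

```python
# Weighted-sum scalarization of the three fitting criteria (all "lower is better").
def combined_objective(criteria, weights):
    """criteria: dict of criterion name -> value; weights: dict of name -> weight."""
    return sum(weights[name] * value for name, value in criteria.items())

criteria = {
    "conversion_sse": 0.042,       # sum of squared deviations, model vs. experiment
    "thermo_consistency": 0.010,   # deviation from standard reaction enthalpy/entropy
    "entropy_production": 0.005,   # mismatch with steady-state entropy production
}
weights = {"conversion_sse": 1.0, "thermo_consistency": 10.0, "entropy_production": 10.0}
print(round(combined_objective(criteria, weights), 3))  # 0.192
```

    An optimizer then minimizes this single scalar over the kinetic parameters; the weights control how strongly thermodynamic consistency is enforced relative to the conversion fit.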

  5. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB, for scheduling scientific workflows with the aim to minimize financial cost of leasing Virtual Machines (VMs under a user-defined deadline constraint. The proposed model groups the workflow into Bag of Tasks (BoTs based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method’s objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real world applications, demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates to meet deadlines and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.
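    The "Bag of Tasks" grouping step described above can be sketched by bagging tasks at the same dependency depth, since such tasks have no edges among them and can be scheduled together. The depth criterion is a simplification of the paper's data-dependency and priority constraints.

```python
# Group workflow tasks into Bags of Tasks (BoTs) by dependency level.
from collections import defaultdict

def group_into_bots(tasks, deps):
    """tasks: iterable of task ids; deps: dict task -> set of predecessor tasks.
    Returns bags ordered by level (level = longest path from an entry task)."""
    level = {}
    def depth(t):
        if t not in level:
            level[t] = 1 + max((depth(p) for p in deps.get(t, ())), default=-1)
        return level[t]
    bags = defaultdict(list)
    for t in tasks:
        bags[depth(t)].append(t)
    return [sorted(bags[l]) for l in sorted(bags)]

# A small diamond-shaped workflow: A feeds B and C, which both feed D.
print(group_into_bots("ABCD", {"B": {"A"}, "C": {"A"}, "D": {"B", "C"}}))
# [['A'], ['B', 'C'], ['D']]
```

    The scheduler would then provision VMs bag by bag, sizing each allocation so that the whole workflow still finishes within the user-defined deadline.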

  6. Distributed computing environments for future space control systems

    Science.gov (United States)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  7. Detecting Sybil Attacks in Cloud Computing  Environments Based on Fail‐Stop Signature

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2017-03-01

    Full Text Available Due to the loosely coupled property of cloud computing environments, no node has complete knowledge of the system. For this reason, detecting a Sybil attack in cloud computing environments is a non‐trivial task. In such a dynamic system, the use of algorithms based on tree or ring structures for collecting the global state of the system has unfortunate downsides, that is, the structure should be re‐constructed in the presence of node joining and leaving. In this paper, we propose an unstructured Sybil attack detection algorithm in cloud computing environments. Our proposed algorithm uses one‐to‐one communication primitives rather than broadcast primitives and, therefore, the message complexity can be reduced. In our algorithmic design, attacker nodes forging multiple identities are effectively detected by normal nodes with the fail‐stop signature scheme. We show that, regardless of the number of attacker nodes, our Sybil attack detection algorithm is able to reach consensus.

  8. Thermonuclear reaction listing

    International Nuclear Information System (INIS)

    Fukai, Yuzo

    1993-01-01

    The following 10 elements, including T, are well known as nuclear fusion fuels: p, D, T, 3 He, 4 He, 6 Li, 7 Li, 9 Be, 10 B, 11 B, ( 12 C, 13 C), where 12 C and 13 C are considered only in the calculation of Q value. Accordingly the number of the thermonuclear reactions is 55, and 78, if including carbon elements. The reactions have some branches. For the branches having two and three reaction products, the reaction products, Q value and threshold energy are calculated by using a computer. We have investigated those of the branches having more than three products from the papers of Ajzenberg-Selove and so on. And also, by the same papers, we check whether the above mentioned branch has been observed or not. The results are as follows: (I) the number of reactions which have Q 0 branches only with γ ray production, and Q 0 and neutron production is 36(17), and (IV) that of reactions whose branch with Q > 0 does not produce neutrons is 9(3). The value in the parentheses shows the number of the case of the carbon elements. For 55 thermonuclear reactions induced by lighter nuclides than 11 B, the reaction products, the values of Q and threshold energy, and the papers with reaction cross section data are presented in the tables. (author)

  9. Virtual reality exposure treatment of agoraphobia: a comparison of computer automatic virtual environment and head-mounted display

    NARCIS (Netherlands)

    Meyerbröker, K.; Morina, N.; Kerkhof, G.; Emmelkamp, P.M.G.; Wiederhold, B.K.; Bouchard, S.; Riva, G.

    2011-01-01

    In this study the effects of virtual reality exposure therapy (VRET) were investigated in patients with panic disorder and agoraphobia. The level of presence in VRET was compared between using either a head-mounted display (HMD) or a computer automatic virtual environment (CAVE). Results indicate

  10. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    International Nuclear Information System (INIS)

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • The code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation only takes 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented

  11. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    Science.gov (United States)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business process (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. Performance prediction and optimization for a CBP-based SOEN is very complex. To meet these challenges, one of the key points is to reduce an abstract service's waiting number of physical services. This paper introduces a probability-based determination method (PBDM) for an abstract service's waiting number, M i , and time span, τ i , for its physical services. The determination of M i and τ i is based on the physical services' arrival rule and the distribution functions of their overall performance. In PBDM, the arrival probability of the physical services with the best overall performance value is a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arrival rule and performance distribution functions, which improves the computational efficiency of scheme design and performance optimization for collaborative business processes in service-oriented computing environments.
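    Under one simplified reading of the abstract (an assumption, since the paper's distribution functions are not given here): if each arriving physical service independently belongs to the best-performance class with probability p, the waiting number M is the smallest count for which at least one such service arrives with the pre-defined reliability R.

```python
import math

def waiting_number(p, reliability):
    """Smallest M with 1 - (1 - p)**M >= reliability, i.e. at least one
    best-performance service among M arrivals with the required probability.
    Illustrative model only; PBDM also determines a time span from arrival rules."""
    # 1 - (1 - p)**M >= R  <=>  M >= log(1 - R) / log(1 - p)
    return math.ceil(math.log(1 - reliability) / math.log(1 - p))

# With a 20% chance per arrival and a 95% reliability target:
print(waiting_number(0.2, 0.95))  # 14
```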

  12. Hybrid quantum and classical methods for computing kinetic isotope effects of chemical reactions in solutions and in enzymes.

    Science.gov (United States)

    Gao, Jiali; Major, Dan T; Fan, Yao; Lin, Yen-Lin; Ma, Shuhua; Wong, Kin-Yiu

    2008-01-01

    A method for incorporating quantum mechanics into enzyme kinetics modeling is presented. Three aspects are emphasized: 1) combined quantum mechanical and molecular mechanical methods are used to represent the potential energy surface for modeling bond forming and breaking processes, 2) instantaneous normal mode analyses are used to incorporate quantum vibrational free energies to the classical potential of mean force, and 3) multidimensional tunneling methods are used to estimate quantum effects on the reaction coordinate motion. Centroid path integral simulations are described to make quantum corrections to the classical potential of mean force. In this method, the nuclear quantum vibrational and tunneling contributions are not separable. An integrated centroid path integral-free energy perturbation and umbrella sampling (PI-FEP/UM) method along with a bisection sampling procedure was summarized, which provides an accurate, easily convergent method for computing kinetic isotope effects for chemical reactions in solution and in enzymes. In the ensemble-averaged variational transition state theory with multidimensional tunneling (EA-VTST/MT), these three aspects of quantum mechanical effects can be individually treated, providing useful insights into the mechanism of enzymatic reactions. These methods are illustrated by applications to a model process in the gas phase, the decarboxylation reaction of N-methyl picolinate in water, and the proton abstraction and reprotonation process catalyzed by alanine racemase. These examples show that the incorporation of quantum mechanical effects is essential for enzyme kinetics simulations.
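    As an order-of-magnitude illustration of why the quantum vibrational contribution matters (this is the textbook semiclassical picture, not the paper's PI-FEP/UM or EA-VTST/MT machinery): a primary H/D kinetic isotope effect arises largely from the zero-point-energy difference of the transferred-atom stretch.

```python
import math

def semiclassical_kie(freq_H_cm, freq_D_cm, T=298.15):
    """KIE ~ exp(dZPE / kT), with ZPE = h*nu/2 for the reactant C-H vs. C-D stretch
    and the stretch assumed fully lost at the transition state (a crude model)."""
    h, k, c = 6.62607e-34, 1.380649e-23, 2.9979e10  # J*s, J/K, cm/s
    dzpe = 0.5 * h * c * (freq_H_cm - freq_D_cm)    # zero-point energy difference, J
    return math.exp(dzpe / (k * T))

# Typical C-H stretch ~2900 cm^-1; C-D ~2900/sqrt(2) ~ 2050 cm^-1:
print(round(semiclassical_kie(2900.0, 2050.0), 1))  # ~7.8 at room temperature
```

    Measured KIEs that exceed this semiclassical ceiling are one signature of the multidimensional tunneling contributions the paper's methods are designed to capture.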

  13. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and must make proper judgments about them, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum, so information education for nurse administrators has become a pressing issue. Consequently, in this study, we surveyed participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain their actual conditions, such as the information environments they work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in the attributes of participants taking the course, such as computer anxiety.

  14. RSSI-Based Distance Estimation Framework Using a Kalman Filter for Sustainable Indoor Computing Environments

    Directory of Open Access Journals (Sweden)

    Yunsick Sung

    2016-11-01

    Full Text Available Given that location information is the key to providing a variety of services in sustainable indoor computing environments, it is required to obtain accurate locations. Locations can be estimated by three distances from three fixed points. Therefore, if the distance between two points can be measured or estimated accurately, the location in indoor environments can be estimated. To increase the accuracy of the measured distance, noise filtering, signal revision, and distance estimation processes are generally performed. This paper proposes a novel framework for estimating the distance between a beacon and an access point (AP in a sustainable indoor computing environment. Diverse types of received strength signal indications (RSSIs are used for WiFi, Bluetooth, and radio signals, and the proposed distance estimation framework is unique in that it is independent of the specific wireless signal involved, being based on the Bluetooth signal of the beacon. Generally, RSSI measurement, noise filtering, and revision are required for distance estimation using RSSIs. The employed RSSIs are first measured from an AP, with multiple APs sometimes used to increase the accuracy of the distance estimation. Owing to the inevitable presence of noise in the measured RSSIs, the application of noise filtering is essential, and further revision is used to address the inaccuracy and instability that characterizes RSSIs measured in an indoor environment. The revised RSSIs are then used to estimate the distance. The proposed distance estimation framework uses one AP to measure the RSSIs, a Kalman filter to eliminate noise, and a log-distance path loss model to revise the measured RSSIs. In the experimental implementation of the framework, both a RSSI filter and a Kalman filter were respectively used for noise elimination to comparatively evaluate the performance of the latter for the specific application. 
The Kalman filter was found to reduce the accumulated errors by 8
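    The framework's two numeric steps can be sketched as follows: a scalar Kalman filter smooths the noisy RSSI stream, and a log-distance path-loss model, RSSI(d) = RSSI(d0) − 10·n·log10(d/d0), converts the smoothed value to a distance. The noise parameters, path-loss exponent and reference RSSI below are illustrative assumptions.

```python
# Scalar Kalman filter over RSSI readings, then log-distance path-loss inversion.
class ScalarKalman:
    def __init__(self, q=0.05, r=4.0, x0=0.0, p0=1.0):
        self.q, self.r = q, r          # process / measurement noise variances
        self.x, self.p = x0, p0        # state estimate and its variance

    def update(self, z):
        self.p += self.q                # predict: state assumed constant, noise grows
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= (1 - k)
        return self.x

def rssi_to_distance(rssi, rssi_d0=-59.0, n=2.0, d0=1.0):
    """Invert the log-distance model; rssi_d0 is the RSSI at reference distance d0 (m)."""
    return d0 * 10 ** ((rssi_d0 - rssi) / (10 * n))

kf = ScalarKalman(x0=-70.0)
for z in [-71.2, -68.5, -70.3, -69.8, -70.6]:  # noisy beacon RSSI readings (dBm)
    smoothed = kf.update(z)
print(round(rssi_to_distance(smoothed), 2))  # roughly 3.5-3.6 m for these parameters
```

    In the paper's framework, the revision step (the log-distance model fit) additionally compensates for the indoor-specific attenuation that makes raw RSSIs unstable.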

  15. The nuclear reaction model code MEDICUS

    International Nuclear Information System (INIS)

    Ibishia, A.I.

    2008-01-01

    The new computer code MEDICUS has been used to calculate cross sections of nuclear reactions. The code, implemented in the MATLAB 6.5, Mathematica 5, and Fortran 95 programming languages, can be run in graphical and command-line mode. A Graphical User Interface (GUI) has been built that allows the user to perform calculations and plot results just by mouse clicking. The MS Windows XP and Red Hat Linux platforms are supported. MEDICUS is a modern nuclear reaction code that can compute charged-particle-, photon-, and neutron-induced reactions in the energy range from thresholds to about 200 MeV. The calculation of the cross sections of nuclear reactions is done in the framework of the Exact Many-Body Nuclear Cluster Model (EMBNCM), direct nuclear reactions, pre-equilibrium reactions, the optical model, DWBA, and the exciton model with cluster emission. The code can also be used for the calculation of the nuclear cluster structure of nuclei. We have calculated nuclear cluster models for nuclei such as 177 Lu, 90 Y, and 27 Al. It has been found that the nucleus 27 Al can be represented by two different nuclear cluster models: 25 Mg + d and 24 Na + 3 He. Cross sections as a function of energy for the reaction 27 Al( 3 He,x) 22 Na, established as a production method for 22 Na, are calculated by the code MEDICUS. Theoretical calculations of cross sections are in good agreement with experimental results. Reaction mechanisms are taken into account. (author)

  16. Does Social Computing Make You Happy? A Case Study of Nomadic Children in Mixed Environments

    DEFF Research Database (Denmark)

    Christensen, Bent Guldbjerg

    2005-01-01

    In this paper I describe a perspective on ambient, ubiquitous, and pervasive computing called the happiness perspective. Under the happiness perspective, the application domain, and how the technology is used and experienced, become a central and integral part of perceiving ambient technology. The perspective is then used in a case study of field-test experiments with nomadic children in mixed environments using the eBag system.

  17. Modeling and Analysis Compute Environments, Utilizing Virtualization Technology in the Climate and Earth Systems Science domain

    Science.gov (United States)

    Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.

    2010-12-01

    Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are "archivable", transferable and easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for those who are interested in performing an academic exercise but do not yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess feasibility, performance overhead, reusability, and transferability. We present a list of the pros and cons, as well as lessons learned, from utilizing virtualization technology in the climate and earth systems modeling domain.

  18. Educational Game Design. Bridging the gap between computer-based learning and experimental learning environments

    DEFF Research Database (Denmark)

    Andersen, Kristine

    2007-01-01

    Considering the rapidly growing amount of digital educational materials, only a few bridge the gap between experimental learning environments and computer-based learning environments (Gardner, 1991). Observations from two cases in primary school and lower secondary school in the subject... with a prototype of a MOO storyline. The aim of the MOO storyline is to challenge the potential of dialogue, user involvement, and learning responsibility, and to use children's natural curiosity and motivation for game playing, especially when digital games involve other children. The paper proposes a model, based on the narrative approach for experimental learning subjects, relying on ideas from Csikszentmihalyi's notion of flow (Csikszentmihalyi, 1991), storyline pedagogy (Meldgaard, 1994) and ideas from Howard Gardner (Gardner, 1991). The model forms the basis for educational games to be used in home...

  19. Young children reorient by computing layout geometry, not by matching images of the environment.

    Science.gov (United States)

    Lee, Sang Ah; Spelke, Elizabeth S

    2011-02-01

    Disoriented animals from ants to humans reorient in accord with the shape of the surrounding surface layout: a behavioral pattern long taken as evidence for sensitivity to layout geometry. Recent computational models suggest, however, that the reorientation process may not depend on geometrical analyses but instead on the matching of brightness contours in 2D images of the environment. Here we test this suggestion by investigating young children's reorientation in enclosed environments. Children reoriented by extremely subtle geometric properties of the 3D layout: bumps and ridges that protruded only slightly off the floor, producing edges with low contrast. Moreover, children failed to reorient by prominent brightness contours in continuous layouts with no distinctive 3D structure. The findings provide evidence that geometric layout representations support children's reorientation.

  20. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    2017-10-01

    Full Text Available The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility of computing the health application using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study to validate our proposal, which consists of monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices presents a clear social objective of being able to predict not only situations of sudden death but also possible injuries.

  1. Parallel Computing Characteristics of CUPID code under MPI and Hybrid environment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Ryong; Yoon, Han Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jeon, Byoung Jin; Choi, Hyoung Gwon [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)

    2014-05-15

    simulation with diagonal preconditioning shows the better speedup. The MPI library was used for node-to-node communication among partitioned subdomains, and OpenMP threads were activated within each node in a multi-core computing environment. The results of hybrid computing show good performance compared with pure MPI parallel computing.
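The speedup and parallel-efficiency bookkeeping behind such scaling comparisons is simple to state explicitly. The sketch below uses invented wall times, not CUPID measurements:

```python
# Speedup S = T_serial / T_parallel; efficiency E = S / n_procs.
# The timings here are illustrative placeholders, not measured data.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

t1 = 1200.0                      # assumed serial wall time (s)
timings = {4: 330.0, 16: 95.0}   # assumed wall times on 4 and 16 processes
for n, tn in timings.items():
    print(n, round(speedup(t1, tn), 2), round(efficiency(t1, tn, n), 2))
```

Comparing these numbers for pure-MPI runs versus MPI+OpenMP hybrid runs at equal core counts is exactly the comparison the record describes.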

  2. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

    In the nuclear science and engineering field, radiation transport calculations play a key role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing-time requirements call for the use of supercomputers. This research is devoted to the development of new formulations for the SN method, especially for highly angular-dependent problems, in parallel environments. The present research work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques which allow for large numbers of angles with a capability for local angular refinement have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angular-dependent problems, such as CT-scan devices, which are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain

  3. Connecting localized DNA strand displacement reactions

    Science.gov (United States)

    Mullor Ruiz, Ismael; Arbona, Jean-Michel; Lad, Amitkumar; Mendoza, Oscar; Aimé, Jean-Pierre; Elezgaray, Juan

    2015-07-01

    Logic circuits based on DNA strand displacement reactions have been shown to be versatile enough to compute the square root of four-bit numbers. The implementation of these circuits as a set of bulk reactions faces difficulties which include leaky reactions and intrinsically slow, diffusion-limited reaction rates. In this paper, we consider simple examples of these circuits when they are attached to platforms (DNA origamis). As expected, constraining distances between DNA strands leads to faster reaction rates. However, it also induces side-effects that are not detectable in the solution-phase version of this circuitry. Appropriate design of the system, including protection and asymmetry between input and fuel strands, leads to a reproducible behaviour, at least one order of magnitude faster than the one observed under bulk conditions. Electronic supplementary information (ESI) available. See DOI: 10.1039/C5NR02434J

  4. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients and frequently assume unrealistic hypotheses. This paper adopts the view that the appropriate development of reliable predictors requires long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab (®), is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.
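The simplest of the prediction strategies mentioned, an optimized threshold over an extracted feature, can be sketched in a few lines. The feature values, threshold, and refractory logic below are illustrative assumptions, not details from the actual environment:

```python
# Hypothetical threshold-based alarm generator over a feature time series.
# A refractory window suppresses duplicate alarms from one sustained crossing.

def predict_alarms(feature_values, threshold, refractory=3):
    """Return sample indices where an alarm is raised: the feature crosses
    the threshold and no alarm fired within the last `refractory` samples."""
    alarms, silent_until = [], -1
    for i, v in enumerate(feature_values):
        if i > silent_until and v >= threshold:
            alarms.append(i)
            silent_until = i + refractory
    return alarms

energy = [0.1, 0.2, 0.9, 0.95, 0.92, 0.3, 0.8]  # invented feature values
print(predict_alarms(energy, threshold=0.8))
```

Evaluating such predictors then amounts to scoring the alarm indices against seizure onset times, which is the "integrated evaluation" the record refers to.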

  5. Aqueous complexation, precipitation, and adsorption reactions of cadmium in the geologic environment

    International Nuclear Information System (INIS)

    Zachara, J.M.; Rai, D.; Felmy, A.R.; Cowan, C.E.; Smith, S.C.; Moore, D.A.; Resch, C.T.

    1992-06-01

    This report contains new laboratory data and equilibrium constants for important solubility and adsorption reactions of Cd that occur in soil and groundwater and attenuate Cd migration. In addition, extensive interaction experiments with Cd and soils from electric utility sites are described. These experiments show the importance of precipitation and adsorption reactions in soil and demonstrate how such reactions can be modeled to predict Cd attenuation near utility sites.

  6. A Novel Computational Method to Reduce Leaky Reaction in DNA Strand Displacement

    Directory of Open Access Journals (Sweden)

    Xin Li

    2015-01-01

    Full Text Available The DNA strand displacement technique is widely used in DNA programming, DNA biosensors, and gene analysis. In DNA strand displacement, leaky reactions can cause DNA signals to decay and DNA signal detection to fail. The most commonly used method to avoid leakage is cleaning up after upstream leaky reactions, and it remains a challenge to develop reliable DNA strand displacement techniques with low leakage. In this work, we address the challenge by experimentally evaluating how basic factors, including reaction time, ratio of reactants, and ion concentration, contribute to leakage in DNA strand displacement. Specifically, fluorescent probes and a hairpin-structure reporting DNA strand are designed to detect the output of DNA strand displacement, and thus to evaluate the leakage of DNA strand displacement reactions under different reaction times, ratios of reactants, and ion concentrations. From the obtained data, mathematical models for evaluating leakage are derived by curve fitting. The results show that long-time incubation, a high concentration of fuel strand, and an inappropriate ion concentration can weaken leaky reactions. This contributes a method for setting proper reaction conditions to reduce leakage in DNA strand displacement.
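Deriving a leakage model from assay data by curve fitting, as the record describes, is in the simplest case an ordinary least-squares fit of leak signal against a condition such as incubation time. The sketch below fits a straight line with the closed-form formulas; the data points and the linear model are invented for illustration, not the paper's measurements or model:

```python
# Hypothetical sketch: least-squares fit of leak signal vs. incubation time.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

hours = [1, 2, 4, 8, 16]               # incubation time (h), invented
leak = [0.05, 0.09, 0.17, 0.33, 0.65]  # leak fluorescence signal, invented
a, b = fit_line(hours, leak)
print(round(a, 3), round(b, 3))        # slope: leak signal accumulated per hour
```

The same fitting step, with a different functional form, would give the condition-dependent leakage models the authors use to choose reaction conditions.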

  7. An experimental and theoretical study of reaction steps relevant to the methanol-to-hydrocarbons reaction

    Energy Technology Data Exchange (ETDEWEB)

    Svelle, Stian

    2004-07-01

    The primary objective of the present work is to obtain new insight into the reaction mechanism of the zeolite-catalyzed methanol-to-hydrocarbons (MTH) reaction. It was decided to use both experimental and computational techniques to reach this goal. An investigation of the n-butene + methanol system was therefore initiated. Over time, it became apparent that it was possible to determine the rate for the methylation of n-butene by methanol. The ethene and propene systems were therefore reexamined in order to collect kinetic information also for those cases. With the development of user-friendly quantum chemistry programs such as the Gaussian suite of programs, the possibility of applying quantum chemical methods to many types of problems has become readily available even to non-experts. When performing mechanistic studies, there is quite often a considerable synergy effect in combining experimental and computational approaches. The methylation reactions mentioned above turned out to be an issue well suited for quantum chemical investigations. The incentive for examining the halomethane reactivity was the clear analogy to the MTH reaction system. Alkene dimerization was also a reaction readily examined with quantum chemistry. As discussed in the introduction of this thesis, polymethylbenzenes, or their cationic counterparts, are suspected to be key intermediates in the MTH reaction. It was therefore decided to investigate the intrinsic gas-phase reactivity of these species by employing sophisticated mass spectrometric (MS) techniques in collaboration with the MS group at the Department of Chemistry, University of Oslo. The data thus obtained will also be compared with results from an ongoing computational study on gas-phase polymethylbenzenium reactivity. 6 papers presenting various studies are included. The titles are: 1) A Theoretical Investigation of the Methylation of Alkenes with Methanol over Acidic Zeolites. 2) A Theoretical Investigation of the

  8. SkyNet: Modular nuclear reaction network library

    Science.gov (United States)

    Lippuner, Jonas; Roberts, Luke F.

    2017-10-01

    The general-purpose nuclear reaction network SkyNet evolves the abundances of nuclear species under the influence of nuclear reactions. SkyNet can be used to compute the nucleosynthesis evolution in all astrophysical scenarios where nucleosynthesis occurs. Any list of isotopes can be evolved, and SkyNet supports various types of nuclear reactions. SkyNet is modular, permitting new or existing physics, such as nuclear reactions or equations of state, to be easily added or modified.
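At its core, a reaction network integrator evolves abundances dY/dt from reaction rates. The toy sketch below evolves a two-species network A → B with a single rate; nothing here is SkyNet-specific (real networks are stiff and use implicit solvers rather than the explicit Euler step shown):

```python
# Toy abundance evolution for A -> B at rate k, explicit Euler (illustrative
# only; production reaction networks use implicit integration for stiffness).

def evolve(y_a, y_b, k, dt, steps):
    """March the abundances forward; each step moves flux = k*y_a*dt from A to B."""
    for _ in range(steps):
        flux = k * y_a * dt
        y_a -= flux
        y_b += flux
    return y_a, y_b

ya, yb = evolve(1.0, 0.0, k=0.1, dt=0.01, steps=1000)
print(round(ya + yb, 6))   # total abundance is conserved by construction
```

Generalizing to an arbitrary isotope list and reaction types amounts to assembling many such flux terms into one coupled system, which is the modularity the record describes.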

  9. InSAR Scientific Computing Environment

    Science.gov (United States)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and

  10. BISEN: Biochemical simulation environment

    NARCIS (Netherlands)

    Vanlier, J.; Wu, F.; Qi, F.; Vinnakota, K.C.; Han, Y.; Dash, R.K.; Yang, F.; Beard, D.A.

    2009-01-01

    The Biochemical Simulation Environment (BISEN) is a suite of tools for generating equations and associated computer programs for simulating biochemical systems in the MATLAB® computing environment. This is the first package that can generate appropriate systems of differential equations for

  11. Energy Consumption and Indoor Environment Predicted by a Combination of Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm

    2003-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics (CFD) program for room air distribution is introduced to improve the predictions of both the energy consumption and the indoor environment. The article describes a calculation...

  12. Rate constant computation on some elementary reactions of Hg during combustion

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qing; Yang, Bo-wen; Bai, Jing-ru [Northeast Dianli Univ., Jilin (China). Inst. of Energy and Power Engineering

    2013-07-01

    The geometry optimizations of reactants, products and transition states were made by the quantum chemistry MP2 method at the SDD basis function level for Hg, and 6-311++G(3df, 3pd) for others. The properties of stable minima were validated by vibrational frequency analysis. Furthermore, the microscopic chemical reaction mechanisms were investigated by ab initio quantum chemistry calculations. On the basis of the geometry optimizations, reaction rate constants over 298-2,000 K are calculated neither from experimental data nor by estimation, but directly with the quantum chemistry software Khimera.

  13. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

    Cloud Computing offers services over the Internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet large data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...

  14. Supporting Student Learning in Computer Science Education via the Adaptive Learning Environment ALMA

    Directory of Open Access Journals (Sweden)

    Alexandra Gasparinatou

    2015-10-01

    Full Text Available This study presents the ALMA environment (Adaptive Learning Models from texts and Activities). ALMA supports the processes of learning and assessment via: (1) texts differing in local and global cohesion for students with low, medium, and high background knowledge; (2) activities corresponding to different levels of comprehension, which prompt the student to practically implement different text-reading strategies, with the recommended activity sequence adapted to the student's learning style; (3) an overall framework for informing, guiding, and supporting students in performing the activities; and (4) individualized support and guidance according to student-specific characteristics. ALMA also supports students in distance learning or in blended learning, in which face-to-face learning is supported by computer technology. The adaptive techniques provided via ALMA are: (a) adaptive presentation and (b) adaptive navigation. Digital learning material, in accordance with the text comprehension model described by Kintsch, was introduced into the ALMA environment. This material can be exploited in either distance or blended learning.

  15. Simulation of biochemical reactions with time-dependent rates by the rejection-based algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy)

    2015-08-07

    We address the problem of simulating biochemical reaction networks with time-dependent rates and propose a new algorithm based on our rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)]. The selection of next reaction firings by our time-dependent RSSA (tRSSA) is computationally efficient. Furthermore, the generated trajectory is exact by virtue of the rejection-based mechanism. We benchmark tRSSA on different biological systems with varying forms of reaction rates to demonstrate its applicability and efficiency. We reveal that for nontrivial cases, the selection of reaction firings in existing algorithms introduces approximations, because the integration of reaction rates is computationally demanding and simplifying assumptions are introduced. The selection of the next reaction firing by our approach is easier while preserving exactness.
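The rejection idea avoids integrating a time-dependent propensity a(t) by proposing candidate firing times from a constant upper bound and accepting each with probability a(t)/a_max. The sketch below is a generic thinning sampler for a single reaction in that spirit; it is not the authors' tRSSA implementation, and the rate function and bound are invented:

```python
# Generic thinning/rejection sampler for the next firing time of one reaction
# with time-dependent propensity a(t), given a bound a_max >= a(t) for all t.
import math
import random

def next_firing(a, a_max, t0, rng):
    """Propose exponential candidate jumps at rate a_max; accept each
    candidate with probability a(t)/a_max. No integration of a(t) needed."""
    t = t0
    while True:
        t += rng.expovariate(a_max)       # candidate firing time
        if rng.random() < a(t) / a_max:   # rejection test
            return t

rng = random.Random(42)
a = lambda t: 2.0 + math.sin(t)           # invented rate, bounded above by 3
times = [next_firing(a, 3.0, 0.0, rng) for _ in range(1000)]
print(round(sum(times) / len(times), 3))  # mean first-firing time
```

Accepted times are exact samples from the inhomogeneous process, which mirrors how the rejection mechanism preserves exactness while keeping per-step work cheap.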

  16. Spectroscopic and computational studies of ionic clusters as models of solvation and atmospheric reactions

    Science.gov (United States)

    Kuwata, Keith T.

    Ionic clusters are useful as model systems for the study of fundamental processes in solution and in the atmosphere. Their structure and reactivity can be studied in detail using vibrational predissociation spectroscopy, in conjunction with high-level ab initio calculations. This thesis presents the applications of infrared spectroscopy and computation to a variety of gas-phase cluster systems. A crucial component of the process of stratospheric ozone depletion is the action of polar stratospheric clouds (PSCs) to convert the reservoir species HCl and chlorine nitrate (ClONO2) to photochemically labile compounds. Quantum chemistry was used to explore one possible mechanism by which this activation is effected: Cl- + ClONO2 → Cl2 + NO3- (1). Correlated ab initio calculations predicted that the direct reaction of chloride ion with ClONO2 is facile, which was confirmed in an experimental kinetics study. In the reaction a weakly bound intermediate Cl2-NO3- is formed, with ~70% of the charge localized on the nitrate moiety. This enables the Cl2-NO3- cluster to be well solvated even in bulk solution, allowing (1) to be facile on PSCs. Quantum chemistry was also applied to the hydration of nitrosonium ion (NO+), an important process in the ionosphere. The calculations, in conjunction with an infrared spectroscopy experiment, revealed the structure of the gas-phase clusters NO+(H2O)n. The large degree of covalent interaction between NO+ and the lone pairs of the H2O ligands is contrasted with the weak electrostatic bonding between iodide ion and H2O. Finally, the competition between ion solvation and solvent self-association is explored for the gas-phase clusters Cl-(H2O)n and Cl-(NH3)n. For the case of water, vibrational predissociation spectroscopy reveals less hydrogen bonding among H2O ligands than predicted by ab initio calculations. Nevertheless, for n ≥ 5, cluster structure is dominated by water-water interactions, with Cl- only partially solvated by the

  17. Computing in nonlinear media and automata collectives

    CERN Document Server

    Adamatzky, Andrew

    2001-01-01

    Reaction-diffusion, excitation, and computation. Subdivision of space. Computation on and with graphs. Computational universality of excitable media. Phenomenology of lattice excitation and emergence of computation.

  18. Constructing New Bioorthogonal Reagents and Reactions.

    Science.gov (United States)

    Row, R David; Prescher, Jennifer A

    2018-05-15

    Chemical tools are transforming our understanding of biomolecules and living systems. Included in this group are bioorthogonal reagents-functional groups that are inert to most biological species, but can be selectively ligated with complementary probes, even in live cells and whole organisms. Applications of these tools have revealed fundamental new insights into biomolecule structure and function-information often beyond the reach of genetic approaches. In many cases, the knowledge gained from bioorthogonal probes has enabled new questions to be asked and innovative research to be pursued. Thus, the continued development and application of these tools promises to both refine our view of biological systems and facilitate new discoveries. Despite decades of achievements in bioorthogonal chemistry, limitations remain. Several reagents are too large or insufficiently stable for use in cellular environments. Many bioorthogonal groups also cross-react with one another, restricting them to singular tasks. In this Account, we describe our work to address some of the voids in the bioorthogonal toolbox. Our efforts to date have focused on small reagents with a high degree of tunability: cyclopropenes, triazines, and cyclopropenones. These motifs react selectively with complementary reagents, and their unique features are enabling new pursuits in biology. The Account is organized by common themes that emerged in our development of novel bioorthogonal reagents and reactions. First, natural product structures can serve as valuable starting points for probe design. Cyclopropene, triazine, and cyclopropenone motifs are all found in natural products, suggesting that they would be metabolically stable and compatible with a variety of living systems. Second, fine-tuning bioorthogonal reagents is essential for their successful translation to biological systems. Different applications demand different types of probes; thus, generating a collection of tools that span a continuum of

  19. Evolutionary change in continuous reaction norms

    DEFF Research Database (Denmark)

    Murren, Courtney J; Maclean, Heidi J; Diamond, Sarah E

    2014-01-01

    Understanding the evolution of reaction norms remains a major challenge in ecology and evolution. Investigating evolutionary divergence in reaction norm shapes between populations and closely related species is one approach to providing insights. Here we use a meta-analytic approach to compare ... divergence in reaction norms of closely related species or populations of animals and plants across types of traits and environments. We quantified mean-standardized differences in overall trait means (Offset) and reaction norm shape (including both Slope and Curvature). These analyses revealed ... contributed to the best-fitting models, especially for Offset, Curvature, and the total differences (Total) between reaction norms. Congeneric species had greater differences in reaction norms than populations, and novel environmental conditions increased the differences in reaction norms between populations...

  20. GAS-PHASE SYNTHESIS OF PRECURSORS OF INTERSTELLAR GLYCINE: A COMPUTATIONAL STUDY OF THE REACTIONS OF ACETIC ACID WITH HYDROXYLAMINE AND ITS IONIZED AND PROTONATED DERIVATIVES

    Energy Technology Data Exchange (ETDEWEB)

    Barrientos, Carmen; Redondo, Pilar; Largo, Laura; Rayon, Victor M.; Largo, Antonio, E-mail: alargo@qf.uva.es [Departamento de Quimica Fisica y Quimica Inorganica, Facultad de Ciencias, Universidad de Valladolid, 47005 Valladolid (Spain)

    2012-04-01

    A computational study of the reactions of hydroxylamine and its ionized and protonated derivatives with acetic acid is provided. The reaction of neutral hydroxylamine with acetic acid, despite being clearly exothermic, involves a very large energy barrier. The reaction of ionized hydroxylamine with acetic acid is also clearly exothermic, but again a significant energy barrier is found (around 24 kcal mol{sup -1} at the CCSD(T) level). The reaction of the most stable protonated isomer of hydroxylamine, NH{sub 3}OH{sup +}, with acetic acid also involves a high barrier (more than 27 kcal mol{sup -1} at the CCSD(T) level). Only the higher energy isomer, NH{sub 2}OH{sup +}{sub 2}, leads to a sensibly lower energy barrier (about 2.3 kcal mol{sup -1} at the CCSD(T) level). Nevertheless, an estimate of the reaction coefficient at low temperatures such as those reigning in the interstellar medium gives very low values. Therefore, it seems that precursors of interstellar glycine could not be efficiently produced from the reactions of hydroxylamine-derived ions with acetic acid.
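A back-of-the-envelope Arrhenius estimate, k(T) = A·exp(-Ea/(R·T)), shows why barriers of this size shut the reactions down at interstellar temperatures. The prefactor below is a generic assumed value, not taken from the paper:

```python
# Arrhenius estimate of a rate coefficient at ~10 K for the barrier heights
# quoted in the record. The prefactor A is an assumed illustrative value.
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)

def arrhenius(A, Ea, T):
    """Rate coefficient for barrier Ea (kcal/mol) at temperature T (K)."""
    return A * math.exp(-Ea / (R * T))

A = 1e-10  # cm^3/s, assumed order-of-magnitude prefactor
print(arrhenius(A, 24.0, 10.0))  # 24 kcal/mol barrier: underflows to 0.0
print(arrhenius(A, 2.3, 10.0))   # even 2.3 kcal/mol is vanishingly slow at 10 K
```

The exponential suppression, tens to hundreds of orders of magnitude at ~10 K, is the quantitative content of the record's conclusion that these routes cannot efficiently produce glycine precursors in the interstellar medium.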

  1. GAS-PHASE SYNTHESIS OF PRECURSORS OF INTERSTELLAR GLYCINE: A COMPUTATIONAL STUDY OF THE REACTIONS OF ACETIC ACID WITH HYDROXYLAMINE AND ITS IONIZED AND PROTONATED DERIVATIVES

    International Nuclear Information System (INIS)

    Barrientos, Carmen; Redondo, Pilar; Largo, Laura; Rayón, Víctor M.; Largo, Antonio

    2012-01-01

    A computational study of the reactions of hydroxylamine and its ionized and protonated derivatives with acetic acid is provided. The reaction of neutral hydroxylamine with acetic acid, despite being clearly exothermic, involves a very large energy barrier. The reaction of ionized hydroxylamine with acetic acid is also clearly exothermic, but again a significant energy barrier is found (around 24 kcal mol-1 at the CCSD(T) level). The reaction of the most stable protonated isomer of hydroxylamine, NH3OH+, with acetic acid also involves a high barrier (more than 27 kcal mol-1 at the CCSD(T) level). Only the higher-energy isomer, NH2OH2+, leads to a sensibly lower energy barrier (about 2.3 kcal mol-1 at the CCSD(T) level). Nevertheless, an estimate of the reaction coefficient at low temperatures such as those reigning in the interstellar medium gives very low values. Therefore, it seems that precursors of interstellar glycine could not be efficiently produced from the reactions of hydroxylamine-derived ions with acetic acid.

  2. A multilevel adaptive reaction-splitting method for SRNs

    KAUST Repository

    Moraes, Alvaro; Tempone, Raul; Vilanova, Pedro

    2016-01-01

    In [5], we present a novel multilevel Monte Carlo method for kinetic simulation of stochastic reaction networks (SRNs) specifically designed for systems in which the set of reaction channels can be adaptively partitioned into two subsets characterized by either high or low activity. To estimate expected values of observables of the system, our method bounds the global computational error to be below a prescribed tolerance, TOL, within a given confidence level. This is achieved with a computational complexity of order O(TOL^-2). We also present a novel control variate technique which may dramatically reduce the variance of the coarsest level at a negligible computational cost.
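The telescoping idea underlying any multilevel Monte Carlo estimator can be shown with two levels: E[g_fine] = E[g_coarse] + E[g_fine - g_coarse], where the cheap coarse term gets many samples and the small coupled correction gets few. The sketch below is generic; nothing in it is specific to the paper's reaction-splitting scheme, and the observables are invented:

```python
# Generic two-level Monte Carlo estimator (telescoping decomposition).
import random

def mlmc_two_level(g_coarse, g_fine, n_coarse, n_corr, rng):
    # Level 0: many cheap coarse samples.
    coarse = sum(g_coarse(rng.random()) for _ in range(n_coarse)) / n_coarse
    # Level 1: few coupled samples of the fine-minus-coarse correction;
    # the SAME underlying random number drives both evaluations, so the
    # variance of the difference is small.
    corr = 0.0
    for _ in range(n_corr):
        u = rng.random()
        corr += g_fine(u) - g_coarse(u)
    return coarse + corr / n_corr

g_fine = lambda u: u * u               # "fine" observable (invented)
g_coarse = lambda u: u * u - 0.01 * u  # slightly biased cheap surrogate
rng = random.Random(0)
est = mlmc_two_level(g_coarse, g_fine, n_coarse=100000, n_corr=100, rng=rng)
print(round(est, 3))                   # close to E[U^2] = 1/3
```

The adaptive high/low-activity partition in the paper plays the role of choosing which dynamics go to the cheap level; the O(TOL^-2) complexity comes from balancing sample counts across levels in exactly this telescoped form.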

  3. A multilevel adaptive reaction-splitting method for SRNs

    KAUST Repository

    Moraes, Alvaro

    2016-01-06

    In [5], we present a novel multilevel Monte Carlo method for kinetic simulation of stochastic reaction networks (SRNs) specifically designed for systems in which the set of reaction channels can be adaptively partitioned into two subsets characterized by either high or low activity. To estimate expected values of observables of the system, our method bounds the global computational error to be below a prescribed tolerance, TOL, within a given confidence level. This is achieved with a computational complexity of order O(TOL^-2). We also present a novel control variate technique which may dramatically reduce the variance of the coarsest level at a negligible computational cost.

  4. Metal Catalyzed Fusion: Nuclear Active Environment vs. Process

    Science.gov (United States)

    Chubb, Talbot

    2009-03-01

    To achieve radiationless dd fusion and/or other LENR reactions via chemistry: some focus on environment of interior or altered near-surface volume of bulk metal; some on environment inside metal nanocrystals or on their surface; some on the interface between nanometal crystals and ionic crystals; some on a momentum shock-stimulation reaction process. Experiment says there is also a spontaneous reaction process.

  5. A room acoustical computer model for industrial environments - the model and its verification

    DEFF Research Database (Denmark)

    Christensen, Claus Lynge; Foged, Hans Torben

    1998-01-01

    This paper presents an extension to the traditional room acoustic modelling methods allowing computer modelling of huge machinery in industrial spaces. The program in question is Odeon 3.0 Industrial and Odeon 3.0 Combined, which allows the modelling of point sources, surface sources and line ... of an omnidirectional sound source and a microphone. This allows the comparison of simulated results with those measured in real rooms. However, when simulating the acoustic environment in industrial rooms, the sound sources are often far from being point-like, as they can be distributed over a large space...

  6. Complex Reaction Environments and Competing Reaction Mechanisms in Zeolite Catalysis: Insights from Advanced Molecular Dynamics

    NARCIS (Netherlands)

    De Wispelaere, K.; Ensing, B.; Ghysels, A.; Meijer, E.J.; Van Speybroeck, V.

    2015-01-01

    The methanol-to-olefin process is a showcase example of complex zeolite-catalyzed chemistry. At real operating conditions, many factors affect the reactivity, such as framework flexibility, adsorption of various guest molecules, and competitive reaction pathways. In this study, the strength of first

  7. Multiscale Investigation on Biofilm Distribution and Its Impact on Macroscopic Biogeochemical Reaction Rates

    Science.gov (United States)

    Yan, Zhifeng; Liu, Chongxuan; Liu, Yuanyuan; Bailey, Vanessa L.

    2017-11-01

    Biofilms are critical locations for biogeochemical reactions in the subsurface environment. The occurrence and distribution of biofilms at microscale as well as their impacts on macroscopic biogeochemical reaction rates are still poorly understood. This paper investigated the formation and distributions of biofilms in heterogeneous sediments using multiscale models and evaluated the effects of biofilm heterogeneity on local and macroscopic biogeochemical reaction rates. Sediment pore structures derived from X-ray computed tomography were used to simulate the microscale flow dynamics and biofilm distribution in the sediment column. The response of biofilm formation and distribution to the variations in hydraulic and chemical properties was first examined. One representative biofilm distribution was then utilized to evaluate its effects on macroscopic reaction rates using nitrate reduction as an example. The results revealed that microorganisms primarily grew on the surfaces of grains and aggregates near preferential flow paths where both electron donor and acceptor were readily accessible, leading to the heterogeneous distribution of biofilms in the sediments. The heterogeneous biofilm distribution decreased the macroscopic rate of biogeochemical reactions as compared with those in homogeneous cases. Operationally considering the heterogeneous biofilm distribution in macroscopic reactive transport models such as using dual porosity domain concept can significantly improve the prediction of biogeochemical reaction rates. Overall, this study provided important insights into the biofilm formation and distribution in soils and sediments as well as their impacts on the macroscopic manifestation of reaction rates.

  8. Reaction of Aldehydes/Ketones with Electron-Deficient 1,3,5-Triazines Leading to Functionalized Pyrimidines as Diels-Alder/Retro-Diels-Alder Reaction Products: Reaction Development and Mechanistic Studies.

    Science.gov (United States)

    Yang, Kai; Dang, Qun; Cai, Pei-Jun; Gao, Yang; Yu, Zhi-Xiang; Bai, Xu

    2017-03-03

    Catalytic inverse electron demand Diels-Alder (IEDDA) reactions of heterocyclic aza-dienes are rarely reported, since the highly reactive and electron-rich dienophiles involved are often found to be incompatible with strong acids such as Lewis acids. Herein, we disclose that TFA-catalyzed reactions of electron-deficient 1,3,5-triazines and electron-deficient aldehydes/ketones can take place. These reactions led to highly functionalized pyrimidines as products in fair to good yields. The reaction mechanism was carefully studied by a combination of experimental and computational studies. The reactions involve a cascade of stepwise inverse electron demand hetero-Diels-Alder (ihDA) reactions, followed by retro-Diels-Alder (rDA) reactions and elimination of water. An acid was required for both the ihDA and rDA reactions. This mechanism was further verified by comparing the relative reactivity of aldehydes/ketones and their corresponding vinyl ethers in the current reaction system.

  9. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  10. Wang-Landau Reaction Ensemble Method: Simulation of Weak Polyelectrolytes and General Acid-Base Reactions.

    Science.gov (United States)

    Landsgesell, Jonas; Holm, Christian; Smiatek, Jens

    2017-02-14

    We present a novel method for the study of weak polyelectrolytes and general acid-base reactions in molecular dynamics and Monte Carlo simulations. The approach combines the advantages of the reaction ensemble and the Wang-Landau sampling method. Deprotonation and protonation reactions are simulated explicitly with the help of the reaction ensemble method, while the accurate sampling of the corresponding phase space is achieved by the Wang-Landau approach. The combination of both techniques provides a sufficient statistical accuracy such that meaningful estimates for the density of states and the partition sum can be obtained. With regard to these estimates, several thermodynamic observables like the heat capacity or reaction free energies can be calculated. We demonstrate that the computation times for the calculation of titration curves with a high statistical accuracy can be significantly decreased when compared to the original reaction ensemble method. The applicability of our approach is validated by the study of weak polyelectrolytes and their thermodynamic properties.
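
    The flat-histogram idea behind the combined method can be illustrated with a minimal Wang-Landau sketch: it estimates the density of states g(k) over the number of protonated sites in a toy model of independent two-state sites, where the exact answer is the binomial coefficient C(n, k). The update schedule and flatness criterion below are generic textbook choices, not the parameters used in the paper.

```python
import math
import random

def wang_landau_dos(n_sites=8, f_final=1e-4, flatness=0.8, seed=1):
    """Wang-Landau estimate of ln g(k), the number of microstates with k
    protonated sites out of n_sites (toy stand-in for a weak polyelectrolyte).
    The exact answer is ln C(n_sites, k), which lets us check convergence."""
    rng = random.Random(seed)
    state = [0] * n_sites           # protonation state of each site
    k = 0                           # current number of protonated sites
    ln_g = [0.0] * (n_sites + 1)    # running estimate of ln g(k)
    hist = [0] * (n_sites + 1)      # visit histogram for the flatness check
    ln_f = 1.0                      # modification factor, halved each stage
    while ln_f > f_final:
        for _ in range(10_000):
            i = rng.randrange(n_sites)       # propose flipping one site
            k_new = k + (1 - 2 * state[i])
            # accept with probability min(1, g(k)/g(k_new)) to flatten visits
            if math.log(rng.random()) < ln_g[k] - ln_g[k_new]:
                state[i] ^= 1
                k = k_new
            ln_g[k] += ln_f
            hist[k] += 1
        # once the histogram is flat enough, refine the modification factor
        if min(hist) > flatness * sum(hist) / len(hist):
            hist = [0] * (n_sites + 1)
            ln_f /= 2.0
    base = ln_g[0]                  # normalise so that g(0) = 1
    return [x - base for x in ln_g]
```

    For n_sites = 6 the estimate of ln g(3) converges toward ln C(6, 3) = ln 20 ≈ 3.0; thermodynamic observables such as titration curves then follow by reweighting the estimated density of states.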

  11. Examining the Roles of Blended Learning Approaches in Computer-Supported Collaborative Learning (CSCL) Environments: A Delphi Study

    Science.gov (United States)

    So, Hyo-Jeong; Bonk, Curtis J.

    2010-01-01

    In this study, a Delphi method was used to identify and predict the roles of blended learning approaches in computer-supported collaborative learning (CSCL) environments. The Delphi panel consisted of experts in online learning from different geographic regions of the world. This study discusses findings related to (a) pros and cons of blended…

  12. Characterising Complex Enzyme Reaction Data.

    Directory of Open Access Journals (Sweden)

    Handan Melike Dönertaş

    Full Text Available The relationship between enzyme-catalysed reactions and the Enzyme Commission (EC) number, the widely accepted classification scheme used to characterise enzyme activity, is complex and, with the rapid increase in our knowledge of the reactions catalysed by enzymes, needs revisiting. We present a manual and computational analysis to investigate this complexity and found that almost one-third of all known EC numbers are linked to more than one reaction in the secondary reaction databases (e.g., KEGG). Although this complexity is often resolved by defining generic, alternative, and partial reactions, we have also found individual EC numbers with more than one reaction catalysing different types of bond changes. This analysis adds a new dimension to our understanding of enzyme function and might be useful for the accurate annotation of the function of enzymes and for studying the changes in enzyme function during evolution.

  13. Gas-to-particle conversion in the atmospheric environment by radiation-induced and photochemical reactions

    International Nuclear Information System (INIS)

    Vohra, K.G.

    1975-01-01

    During the last few years a fascinating new area of research involving ionizing radiations and photochemistry in gas-to-particle conversion in the atmosphere has been developing at a rapid pace. Two problems of major interest and concern in which this is of paramount importance are: (1) radiation-induced and photochemical aerosol formation in the stratosphere and (2) the role of radiations and photochemistry in smog formation. The peak in cosmic ray intensity and the significant solar UV flux in the stratosphere lead to a complex variety of reactions involving major and trace constituents in this region of the atmosphere, and some of these reactions are of vital importance in aerosol formation. The problem is of great current interest because the pollutant gases from industrial sources and future SST operations entering the stratosphere could increase the aerosol burden in the stratosphere and affect the solar energy input of the troposphere, with consequent ecological and climatic changes. On the other hand, in the nuclear era, the atmospheric releases from reactors and processing plants could lead to changes in the cloud nucleation behaviour of the environment and a possible increase in smog formation in areas with significant levels of radiations and conventional pollutants. A review of the earlier work, the current status of the problem, and some recent results of the experiments conducted in the author's laboratory are presented. The possible mechanisms of gas-to-particle conversion in the atmosphere have been explained

  14. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States as an Internet-centred approach to providing standard, open network sharing services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide sharing methods, has therefore become an important means of sharing digital education applications in current higher education. Based on the cloud computing environment, this paper analyses the existing problems in the sharing of digital educational resources among the independent colleges of Jiangxi Province. Drawing on the mass storage, efficient operation, and low cost characteristics of cloud computing, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the shared model design is put into practical application.

  15. Enabling Computational Dynamics in Distributed Computing Environments Using a Heterogeneous Computing Template

    Science.gov (United States)

    2011-08-09

    heterogeneous computing concept advertised recently as the paradigm capable of delivering exascale flop rates by the end of the decade.

  16. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    Science.gov (United States)

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.

    2016-06-01

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
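
    The rate-based species selection described above can be sketched compactly. In the sketch below, an edge species is promoted into the core model when its formation flux exceeds a fraction epsilon of a characteristic flux taken as the root-sum-square of the core species rates; the function names and data layout are illustrative simplifications, not RMG's actual API.

```python
import math

def rate_based_enlarge(core_rates, edge_fluxes, epsilon=0.1):
    """Toy sketch of rate-based model enlargement (not RMG's real interface).
    core_rates:  {species: net production rate in the current core model}
    edge_fluxes: {species: formation flux of a candidate edge species}
    Returns the edge species whose flux exceeds epsilon * characteristic flux."""
    r_char = math.sqrt(sum(r * r for r in core_rates.values()))
    return sorted(s for s, flux in edge_fluxes.items() if flux > epsilon * r_char)
```

    For example, with core rates {'CH4': 1.0, 'OH': 2.0} the characteristic flux is sqrt(5) ≈ 2.24, so at epsilon = 0.1 an edge species with flux 0.5 is promoted while one with flux 0.01 is excluded from the model.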

  17. Environment Modules on the Peregrine System | High-Performance Computing |

    Science.gov (United States)

    Peregrine uses environment modules to easily manage software environments. The module commands set up a basic environment for the default compilers, tools, and libraries.

  18. The Effect of Simulated Microgravity Environment of RWV Bioreactors on Surface Reactions and Adsorption of Serum Proteins on Bone-bioactive Microcarriers

    Science.gov (United States)

    Radin, Shula; Ducheyne, P.; Ayyaswamy, P. S.

    2003-01-01

    Biomimetically modified bioactive materials with bone-like surface properties are attractive candidates for use as microcarriers for 3-D bone-like tissue engineering under the simulated microgravity conditions of NASA-designed rotating wall vessel (RWV) bioreactors. The simulated microgravity environment is attainable under suitable parametric conditions of the RWV bioreactors. Ca-P containing bioactive glass (BG), whose stimulatory effect on bone cell function had been previously demonstrated, was used in the present study. BG surface modification via reactions in solution, the resulting formation of bone-like minerals at the surface, and the adsorption of serum proteins are critical for obtaining the stimulatory effect. In this paper, we report on the major effects of the simulated microgravity conditions of the RWV on BG surface reactions and protein adsorption in physiological solutions. Control tests at normal gravity were conducted under static and dynamic conditions. The study revealed that simulated microgravity remarkably enhanced the reactions involved in BG surface modification, including BG dissolution, formation of bone-like minerals at the surface, and adsorption of serum proteins. Simultaneously, numerical models were developed to simulate the mass transport of chemical species to and from the BG surface under normal gravity and simulated microgravity conditions. The numerical results showed excellent agreement with the experimental data at both testing conditions.

  19. Lagrangian descriptors of driven chemical reaction manifolds.

    Science.gov (United States)

    Craven, Galen T; Junginger, Andrej; Hernandez, Rigoberto

    2017-08-01

    The persistence of a transition state structure in systems driven by time-dependent environments allows the application of modern reaction rate theories to solution-phase and nonequilibrium chemical reactions. However, identifying this structure is problematic in driven systems and has been limited by theories built on series expansion about a saddle point. Recently, it has been shown that to obtain formally exact rates for reactions in thermal environments, a transition state trajectory must be constructed. Here, using optimized Lagrangian descriptors [G. T. Craven and R. Hernandez, Phys. Rev. Lett. 115, 148301 (2015)PRLTAO0031-900710.1103/PhysRevLett.115.148301], we obtain this so-called distinguished trajectory and the associated moving reaction manifolds on model energy surfaces subject to various driving and dissipative conditions. In particular, we demonstrate that this is exact for harmonic barriers in one dimension and this verification gives impetus to the application of Lagrangian descriptor-based methods in diverse classes of chemical reactions. The development of these objects is paramount in the theory of reaction dynamics as the transition state structure and its underlying network of manifolds directly dictate reactivity and selectivity.

  20. Numerical thermal-hydraulics study on sodium-water reaction phenomena

    International Nuclear Information System (INIS)

    Takashi, Takata; Akira, Yamaguchi

    2003-01-01

    A new computational program SERAPHIM (Sodium-watEr Reaction Analysis: PHysics of Interdisciplinary Multi-phase flow) is developed to investigate the Sodium-Water Reaction (SWR) phenomena based on parallel computation technology. A compressible three-fluid (liquid water, liquid sodium and mixture gas) and one-pressure model is adopted for the multi-phase calculation. The Highly Simplified Marker And Cell (HSMAC) method, extended to account for compressibility, is implemented as the numerical solution. The Message-Passing Interface (MPI) is used for the parallel computation. Two types of reactions are considered for the SWR modeling; one is a surface reaction and the other is a gas phase reaction. The surface reaction model assumes that liquid sodium reacts with water vapor on the surface of liquid sodium. An analogy between heat transfer and mass transfer is applied in this model. Reaction heating vaporizes liquid sodium, resulting in the gas phase reaction. The ab initio molecular orbital method is applied to investigate the reaction mechanism and evaluate the reaction rate described by the Arrhenius law. The performance of parallel computation was tested on a cluster-PC (16 CPUs) system. The execution time becomes 17.1 times faster in the case of 16 CPUs. It seems promising that the SERAPHIM code is practicable for large-scale analysis of the SWR phenomena. Three-dimensional SWR analyses are also carried out to investigate the characteristics of the thermal-hydraulics with the SWR and the influence of initial pressure (0.2 MPa and 0.6 MPa) on an early stage of the SWR phenomenon. As a result, the distribution of the gas region, in which water vapor or products of the SWR such as hydrogen and sodium hydroxide exist, the velocity, and the high temperature region differ between the 0.2 MPa and 0.6 MPa conditions. However, the maximum gas temperature has an upper bound and is almost constant in both analyses. The upper bound is attributed to the fact that a hydrogen gas covers up a liquid
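
    The Arrhenius form used for the evaluated reaction rate is k(T) = A exp(-Ea / RT). A minimal sketch, with placeholder values for the pre-exponential factor and activation energy rather than the ones computed from the ab initio calculations in the paper:

```python
import math

R_GAS = 8.314462618  # universal gas constant, J/(mol*K)

def arrhenius_rate(a_factor, e_activation, temperature):
    """Arrhenius rate coefficient k(T) = A * exp(-Ea / (R*T)).
    a_factor and e_activation (J/mol) are placeholders, not the values
    obtained from the molecular orbital calculations in the paper."""
    return a_factor * math.exp(-e_activation / (R_GAS * temperature))

# Illustrative only: the rate coefficient rises steeply with temperature.
k_700 = arrhenius_rate(1.0e13, 5.0e4, 700.0)
k_900 = arrhenius_rate(1.0e13, 5.0e4, 900.0)
```

    Fitting A and Ea from computed barrier heights and then evaluating k(T) this way is the standard route from electronic-structure results to the source terms of a thermal-hydraulics code.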

  1. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-01-01

    In the fog computing environment, the encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server. Since the fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource constrained end users. Compared to existing schemes only supporting either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute based encryption (CP-ABE) primitive and share the same key pair, thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of decryption task to fog nodes, and mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment. PMID:28629131

  2. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment.

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-06-17

    In the fog computing environment, the encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server. Since the fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource constrained end users. Compared to existing schemes only supporting either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute based encryption (CP-ABE) primitive and share the same key pair, thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of decryption task to fog nodes, and mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment.

  3. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle.

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs) . Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.
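
    The two filtering criteria applied to the mined rules translate directly into code. A minimal sketch, where rules are keyed by (antecedent, consequent) pairs and the dictionary layout is an assumption for illustration, not the study's actual data format:

```python
def filter_rules(rules_a, rules_b, min_conf=0.8):
    """Keep association rules that satisfy both selection criteria used in
    the study: confidence (accuracy) over min_conf, and presence in both
    sub-samples. Each rules dict maps (antecedent, consequent) -> confidence."""
    kept = {}
    for rule, conf in rules_a.items():
        conf_b = rules_b.get(rule, 0.0)
        if conf > min_conf and conf_b > min_conf:
            kept[rule] = min(conf, conf_b)   # report the weaker confidence
    return kept
```

    Requiring a rule to clear the threshold in both sub-samples guards against rules that merely overfit one half of the data.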

  4. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Science.gov (United States)

    Cerezo, Rebeca; Esteban, María; Sánchez-Santillán, Miguel; Núñez, José C.

    2017-01-01

    Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages. PMID:28883801

  5. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle

    Directory of Open Access Journals (Sweden)

    Rebeca Cerezo

    2017-08-01

    Full Text Available Introduction: Research about student performance has traditionally considered academic procrastination as a behavior that has negative effects on academic achievement. Although there is much evidence for this in class-based environments, there is a lack of research on Computer-Based Learning Environments (CBLEs). Therefore, the purpose of this study is to evaluate student behavior in a blended learning program and specifically procrastination behavior in relation to performance through Data Mining techniques. Materials and Methods: A sample of 140 undergraduate students participated in a blended learning experience implemented in a Moodle (Modular Object Oriented Developmental Learning Environment) Management System. Relevant interaction variables were selected for the study, taking into account student achievement and analyzing data by means of association rules, a mining technique. The association rules were arrived at and filtered through two selection criteria: 1, rules must have an accuracy over 0.8 and 2, they must be present in both sub-samples. Results: The findings of our study highlight the influence of time management in online learning environments, particularly on academic achievement, as there is an association between procrastination variables and student performance. Conclusion: Negative impact of procrastination in learning outcomes has been observed again but in virtual learning environments where practical implications, prevention of, and intervention in, are different from class-based learning. These aspects are discussed to help resolve student difficulties at various ages.

  6. The Case for Higher Computational Density in the Memory-Bound FDTD Method within Multicore Environments

    Directory of Open Access Journals (Sweden)

    Mohammed F. Hadi

    2012-01-01

    Full Text Available It is argued here that more accurate though more compute-intensive alternate algorithms to certain computational methods which are deemed too inefficient and wasteful when implemented within serial codes can be more efficient and cost-effective when implemented in parallel codes designed to run on today's multicore and many-core environments. This argument is most germane to methods that involve large data sets with relatively limited computational density—in other words, algorithms with small ratios of floating point operations to memory accesses. The examples chosen here to support this argument represent a variety of high-order finite-difference time-domain algorithms. It will be demonstrated that a three- to eightfold increase in floating-point operations due to higher-order finite-differences will translate to only two- to threefold increases in actual run times using either graphical or central processing units of today. It is hoped that this argument will convince researchers to revisit certain numerical techniques that have long been shelved and reevaluate them for multicore usability.
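
    The trade-off argued above, a few extra floating-point operations per grid point in exchange for much higher accuracy, is visible even in a one-dimensional sketch. Below, a standard second-order central difference is compared with its fourth-order counterpart; these are textbook stencils, not the specific FDTD update equations from the article.

```python
def d1_order2(f, i, h):
    """Second-order central difference: 2 memory loads, ~3 flops per point."""
    return (f[i + 1] - f[i - 1]) / (2.0 * h)

def d1_order4(f, i, h):
    """Fourth-order central difference: 4 memory loads and roughly twice the
    flops per point, but the truncation error drops from O(h^2) to O(h^4)."""
    return (-f[i + 2] + 8.0 * f[i + 1] - 8.0 * f[i - 1] + f[i - 2]) / (12.0 * h)
```

    Sampling sin(x) with h = 0.01 and differentiating at x = 1, the second-order error is around 1e-5 while the fourth-order error is orders of magnitude smaller; the extra arithmetic per memory access is exactly the higher computational density the article argues multicore hardware rewards.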

  7. Effects of network dissolution changes on pore-to-core upscaled reaction rates for kaolinite and anorthite reactions under acidic conditions

    KAUST Repository

    Kim, Daesang

    2013-11-01

    We have extended reactive flow simulation in pore-network models to include geometric changes in the medium from dissolution effects. These effects include changes in pore volume and reactive surface area, as well as topological changes that open new connections. The computed changes were based upon a mineral map from an X-ray computed tomography image of a sandstone core. We studied the effect of these changes on upscaled (pore-scale to core-scale) reaction rates and compared against the predictions of a continuum model. Specifically, we modeled anorthite and kaolinite reactions under acidic flow conditions during which the anorthite reactions remain far from equilibrium (dissolution only), while the kaolinite reactions can be near-equilibrium. Under dissolution changes, core-scale reaction rates continuously and nonlinearly evolved in time. At higher injection rates, agreement with predictions of the continuum model degraded significantly. For the far-from-equilibrium reaction, our results indicate that the ability to correctly capture the heterogeneity in dissolution changes in the reactive mineral surface area is critical to accurately predict upscaled reaction rates. For the near-equilibrium reaction, the ability to correctly capture the heterogeneity in the saturation state remains critical. Inclusion of a Nernst-Planck term to ensure neutral ionic currents under differential diffusion resulted in at most a 9% correction in upscaled rates.
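
    The distinction drawn above between far-from-equilibrium and near-equilibrium behavior is captured by a standard transition-state-theory rate law of the form r = k A (1 - Q/K), where A is the reactive surface area, Q the ion activity product, and K the equilibrium constant. A minimal sketch (the generic rate law, not the specific kinetic parameters used in the study):

```python
def dissolution_rate(k_rate, area, q_iap, k_eq):
    """TST-style mineral dissolution rate, r = k * A * (1 - Q/K).
    Far from equilibrium (Q << K, like anorthite under acidic flow) the rate
    is controlled by the reactive surface area A; near equilibrium (Q -> K,
    like kaolinite) it is controlled by the saturation state Q/K."""
    return k_rate * area * (1.0 - q_iap / k_eq)
```

    Doubling the surface area doubles the far-from-equilibrium rate, whereas a saturation state of 0.95 suppresses the rate twentyfold regardless of area, which is why upscaling the two reactions requires capturing different heterogeneities.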

  8. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R and Ds of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In these R and Ds, we have developed visualization technology suitable for the distributed computing environment. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in these R and Ds, is an MPI library executable in a heterogeneous computing environment. This improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  9. Development and Study the Usage of Blended Learning Environment Model Using Engineering Design Concept Learning Activities to Computer Programming Courses for Undergraduate Students of Rajabhat Universities

    Directory of Open Access Journals (Sweden)

    Kasame Tritrakan

    2017-06-01

    Full Text Available The objectives of this research were to study and synthesise the components of, to develop, and to study the usage of a blended learning environment model using engineering design concept learning activities for computer programming courses for undergraduate students of Rajabhat universities. The research methodology was divided into 3 phases. Phase I: surveying the present state, needs, and problems in teaching computer programming among 52 lecturers, with in-depth interviews of 5 experienced lecturers; the model's elements were evaluated by 5 experts. The tools were a questionnaire, an interview form, and a model's-elements assessment form. Phase II: developing the model of the blended learning environment and learning activities based on engineering design processes and confirming the model with 8 experts. The tools were the draft of the learning environment, courseware, and assessment forms. Phase III: evaluating the effects of using the implemented environment. The samples were students divided into 2 groups by cluster random sampling: 25 people in the experiment group and 27 people in the control group. The tools were the learning environment, courseware, and assessment tools. The statistics used in this research were means, standard deviation, t-test dependent, and one-way MANOVA. The results found that: 1) lecturers broadly agreed with the physical, mental, social, and information learning environments, the learning processes, and the assessments; all needs were at a high level, and problems were high for the physical environment but quite low in other aspects; 2) the developed learning environment had 4 components, which were a) the 4 types of environments, b) the inputs, which included the blended learning environment, learning motivation factors, and computer programming content, c) the processes, which were to analyse and state objectives, design the learning environment and activities, develop the learning environment and test materials, implement, and evaluate, and d) the outputs

  10. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
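
    The idea behind load-prediction dynamic scheduling, splitting each batch of work in proportion to the predicted throughput of each device so that the CPU cores and the GPU finish together, can be sketched as follows. The proportional split is a simplification of the paper's algorithm, and the throughput figures in the example are made up.

```python
def predict_split(total_work, throughput):
    """Divide total_work items among devices in proportion to their
    predicted throughput, so all devices finish at roughly the same time.
    throughput: {device_name: predicted items per second}."""
    total_tp = sum(throughput.values())
    devices = sorted(throughput.items())
    shares = {}
    assigned = 0
    for name, tp in devices[:-1]:
        shares[name] = int(total_work * tp / total_tp)
        assigned += shares[name]
    shares[devices[-1][0]] = total_work - assigned  # remainder to last device
    return shares
```

    With a GPU predicted four times faster than the four CPU cores combined, 1600 simulation time steps would split as 320 for the CPU and 1280 for the GPU; re-measuring throughput after each batch is what makes the schedule dynamic rather than static.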

  11. Using Python as a first programming environment for computational physics in developing countries

    Science.gov (United States)

    Akpojotor, Godfrey; Ehwerhemuepha, Louis; Echenim, Myron; Akpojotor, Famous

    2011-03-01

    Python's unique features, such as its interpreted, multiplatform, and object-oriented nature, as well as its being free and open source software, create the possibility that any user connected to the internet can download the entire package onto any platform, install it, and immediately begin to use it. Thus Python is gaining a reputation as a preferred environment for introducing students and new beginners to programming. Therefore, in Africa, the Python African Tour project has been launched, and we are coordinating its use in computational science. We examine here the challenges and prospects of using Python for computational physics (CP) education in developing countries (DC). Then we present our project on using Python to simulate and aid the learning of laboratory experiments, illustrated here by modeling of the simple pendulum, and also to visualize phenomena in physics, illustrated here by demonstrating the wave motion of a particle in a varying potential. This project, which is to train both the teachers and our students in CP using Python, can easily be adopted in other DC.
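
    As a flavor of the kind of laboratory simulation the project describes, here is a minimal Euler-Cromer integration of the simple pendulum in Python; the parameters and the zero-crossing period estimate are illustrative, not taken from the authors' teaching materials.

```python
import math

def simulate_pendulum(theta0, length=1.0, g=9.81, dt=1e-4, t_max=10.0):
    """Euler-Cromer integration of theta'' = -(g/L) sin(theta).
    Returns a list of (t, theta) samples."""
    theta, omega = theta0, 0.0
    samples = []
    for i in range(int(t_max / dt)):
        omega += -(g / length) * math.sin(theta) * dt
        theta += omega * dt
        samples.append((i * dt, theta))
    return samples

samples = simulate_pendulum(theta0=0.05)          # small-angle regime
# estimate the period from successive downward zero crossings
crossings = [t2 for (t1, th1), (t2, th2) in zip(samples, samples[1:])
             if th1 > 0 >= th2]
period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
```

    For theta0 = 0.05 rad, the estimated period matches the small-angle value 2*pi*sqrt(L/g) (about 2.006 s here) to well under 1%, which makes a nice classroom check of the integrator.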

  12. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    International Nuclear Information System (INIS)

    Rampino, S.; Monari, A.; Rossi, E.; Evangelisti, S.; Laganà, A.

    2012-01-01

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS assembled on the European Grid allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: ► The grid based GEMS simulator accurately models small chemical systems. ► Q5Cost and D5Cost file formats provide interoperability in the workflow. ► Benchmark runs on H + H2 highlight the Grid empowering. ► O + O2 and N + N2 calculated k(T)'s fall within the error bars of the experiment. - Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H2, N + N2 and O + O2 gas phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.

  13. Effects of network dissolution changes on pore-to-core upscaled reaction rates for kaolinite and anorthite reactions under acidic conditions

    KAUST Repository

    Kim, Daesang; Lindquist, W. Brent

    2013-01-01

    new connections. The computed changes were based upon a mineral map from an X-ray computed tomography image of a sandstone core. We studied the effect of these changes on upscaled (pore-scale to core-scale) reaction rates and compared against

  14. Computational investigation of kinetics of cross-linking reactions in proteins: importance in structure prediction.

    Science.gov (United States)

    Bandyopadhyay, Pradipta; Kuntz, Irwin D

    2009-01-01

    The determination of protein structure using distance constraints is a new and promising field of study. One implementation involves attaching residues of a protein using a cross-linking agent, followed by protease digestion, analysis of the resulting peptides by mass spectrometry, and finally sequence threading to detect the protein folds. In the present work, we carry out computational modeling of the kinetics of cross-linking reactions in proteins using the master equation approach. The rate constants of the cross-linking reactions are estimated using the pKa values and the solvent-accessible surface areas of the residues involved. This model is tested with fibroblast growth factor (FGF) and cytochrome C. It is consistent with the initial experimental rate data for individual lysine residues for cytochrome C. Our model captures all observed cross-links for FGF and almost 90% of the observed cross-links for cytochrome C, although it also predicts cross-links that were not observed experimentally (false positives). However, the analysis of the false positive results is complicated by the fact that experimental detection of cross-links can be difficult and may depend on specific experimental conditions such as pH and ionic strength. Receiver operating characteristic plots showed that our model does a good job of predicting the observed cross-links. Molecular dynamics simulations showed that for cytochrome C, in general, the two lysines come closer for the observed cross-links as compared to the false positive ones. For FGF, no such clear pattern exists. The kinetic model and MD simulation can be used to study proposed cross-linking protocols.
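
    The master-equation approach can be sketched on a toy two-step cross-linking scheme (free lysines -> one end attached -> cross-link closed). The rates below are invented for illustration; in the paper they are derived from pKa values and solvent-accessible surface areas.

```python
def propagate(rates, p0, dt=1e-3, t_max=5.0):
    """Forward-Euler integration of the master equation dp/dt = K p for a
    linear chain of states 0 -> 1 -> ... -> n with forward rate constants
    `rates`.  A toy stand-in for lysine cross-linking kinetics."""
    p = list(p0)
    for _ in range(int(t_max / dt)):
        flux = [rates[i] * p[i] for i in range(len(rates))]
        for i in range(len(rates)):
            p[i] -= flux[i] * dt       # probability leaving state i
            p[i + 1] += flux[i] * dt   # probability entering state i+1
    return p

# two sequential steps: free lysines -> one end attached -> cross-link closed
p_final = propagate(rates=[2.0, 1.0], p0=[1.0, 0.0, 0.0])
```

    Probability is conserved by construction (each step moves equal amounts out of one state and into the next), and with these rates nearly all of the population reaches the cross-linked state by t = 5.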

  15. Second language writing anxiety, computer anxiety, and performance in a classroom versus a web-based environment

    Directory of Open Access Journals (Sweden)

    Effie Dracopoulos

    2011-04-01

    Full Text Available This study examined the impact of writing anxiety and computer anxiety on language learning for 45 ESL adult learners enrolled in an English grammar and writing course. Two sections of the course were offered in a traditional classroom setting whereas two others were given in a hybrid form that involved distance learning. Contrary to previous research, writing anxiety showed no correlation with learning performance, whereas computer anxiety only yielded a positive correlation with performance in the case of classroom learners. There were no significant differences across learning environments on any measures. These observations are discussed in light of the role computer technologies now play in our society as well as the merging of socio-demographic profiles between classroom and distance learners. Our data suggest that comparisons of profiles between classroom and distance learners may not be an issue worth investigating anymore in language studies, at least in developed countries.

  16. Efficient rejection-based simulation of biochemical reactions with stochastic noise and delays

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento (Italy)

    2014-10-07

    We propose a new exact stochastic rejection-based simulation algorithm for biochemical reactions and extend it to systems with delays. Our algorithm accelerates the simulation by pre-computing reaction propensity bounds to select the next reaction to perform. Exploiting such bounds, we are able to avoid recomputing propensities every time a (delayed) reaction is initiated or finished, as is typically necessary in standard approaches. Propensity updates in our approach are still performed, but only infrequently and limited to a small number of reactions, saving computation time without sacrificing exactness. We evaluate the performance improvement of our algorithm by experimenting with concrete biological models.
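
    A minimal sketch of the rejection (thinning) idea in Python: candidates are drawn from precomputed propensity upper bounds and accepted with probability a_j / ub_j, so exact propensities are evaluated only for the candidate, and bounds are refreshed only when some propensity outgrows its bound. This is a simplification of the published algorithm (which maintains fluctuation intervals for species populations and handles delays); the reversible isomerization example is hypothetical.

```python
import random

def rejection_ssa(x0, reactions, bound_factor, t_max, seed=1):
    """Rejection-based SSA sketch.  `reactions` is a list of
    (propensity_fn, state_change_dict) pairs."""
    rng = random.Random(seed)
    x, t = dict(x0), 0.0
    def fresh_bounds():
        return [bound_factor * max(a(x), 1e-12) for a, _ in reactions]
    ub = fresh_bounds()
    while True:
        ub_total = sum(ub)
        t += rng.expovariate(ub_total)        # candidate firing time
        if t >= t_max:
            return x
        # pick candidate j with probability ub[j] / ub_total
        r, j, acc = rng.random() * ub_total, 0, ub[0]
        while acc < r:
            j += 1
            acc += ub[j]
        a_j = reactions[j][0](x)              # exact propensity, on demand
        if rng.random() * ub[j] <= a_j:       # accept with prob a_j / ub[j]
            for species, dv in reactions[j][1].items():
                x[species] += dv
            # refresh bounds lazily, only once a propensity exceeds one
            if any(a(x) > ub[i] for i, (a, _) in enumerate(reactions)):
                ub = fresh_bounds()

k1, k2 = 1.0, 0.5                             # hypothetical rate constants
rxns = [(lambda x: k1 * x["A"], {"A": -1, "B": +1}),
        (lambda x: k2 * x["B"], {"A": +1, "B": -1})]
final = rejection_ssa({"A": 100, "B": 0}, rxns, bound_factor=1.1, t_max=5.0)
```

    Because every propensity stays below its bound between refreshes, the thinning acceptance step reproduces the exact next-reaction distribution while touching the exact propensities only rarely.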

  17. Cloud Computing as Network Environment in Students Work

    OpenAIRE

    Piotrowski, Dominik Mirosław

    2013-01-01

    The purpose of the article was to show the need for literacy education covering the variety of services available in cloud computing, as a specialist field of information activity. Teaching cloud computing related to information management at university could provide tangible benefits in the form of useful learning outcomes. This allows students and future information professionals to begin enjoying the benefits of the cloud computing SaaS model at work, thereby freeing up of...

  18. Conservation Laws in Biochemical Reaction Networks

    DEFF Research Database (Denmark)

    Mahdi, Adam; Ferragut, Antoni; Valls, Claudia

    2017-01-01

    We study the existence of linear and nonlinear conservation laws in biochemical reaction networks with mass-action kinetics. It is straightforward to compute the linear conservation laws as they are related to the left null-space of the stoichiometry matrix. The nonlinear conservation laws … are difficult to identify and have rarely been considered in the context of mass-action reaction networks. Here, using the Darboux theory of integrability, we provide necessary structural (i.e., parameter-independent) conditions on a reaction network to guarantee the existence of nonlinear conservation laws…
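
    For the linear conservation laws the abstract mentions, the left null-space of the stoichiometry matrix can be computed directly. Below is a small exact (rational-arithmetic) sketch, using the reversible reaction A + B <-> C as a hypothetical example.

```python
from fractions import Fraction

def left_nullspace(S):
    """Linear conservation laws of a network: vectors v with v^T S = 0,
    found by row-reducing S^T (the null space of S^T equals the left
    null space of S).  S is species x reactions, integer entries."""
    n_sp, n_rx = len(S), len(S[0])
    M = [[Fraction(S[i][j]) for i in range(n_sp)] for j in range(n_rx)]
    pivots, r = [], 0
    for c in range(n_sp):
        pivot = next((i for i in range(r, n_rx) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        pv = M[r][c]
        M[r] = [x / pv for x in M[r]]             # normalize pivot row
        for i in range(n_rx):
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == n_rx:
            break
    basis = []
    for f in (c for c in range(n_sp) if c not in pivots):
        v = [Fraction(0)] * n_sp
        v[f] = Fraction(1)
        for i, c in enumerate(pivots):
            v[c] = -M[i][f]                        # back-substitute free var
        basis.append(v)
    return basis

# A + B <-> C : columns are the forward and reverse reactions
S = [[-1, 1],   # A
     [-1, 1],   # B
     [ 1, -1]]  # C
laws = left_nullspace(S)
```

    For this example the routine returns a basis of the two-dimensional left null-space, spanning the same space as the conserved totals [A] + [C] and [B] + [C].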

  19. Hydrodynamics and Water Quality forecasting over a Cloud Computing environment: INDIGO-DataCloud

    Science.gov (United States)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; García, Daniel; Monteoliva, Agustín

    2017-04-01

    Algae bloom due to eutrophication is a widespread problem for water reservoirs and lakes that impacts directly on water quality. It can create a dead zone that lacks enough oxygen to support life, and it can also be harmful to humans, so it must be controlled in water masses used for supply, bathing or other purposes. Hydrodynamic and water quality modelling can contribute to forecasting the status of the water system in order to alert authorities before an algae bloom event occurs. It can be used to predict scenarios and find solutions to reduce the harmful impact of the blooms. High resolution models need to process a big amount of data using a sufficiently robust computing infrastructure. INDIGO-DataCloud (https://www.indigo-datacloud.eu/) is a European Commission funded project that aims at developing a data and computing platform targeting scientific communities, deployable on multiple hardware and provisioned over hybrid (private or public) e-infrastructures. The project addresses the development of solutions for different Case Studies using different Cloud-based alternatives. In the first INDIGO software release, a set of components are ready to manage the deployment of services to perform N Delft3D simulations (for calibration or scenario definition) over a Cloud Computing environment, using the Docker technology: TOSCA requirement description, Docker repository, Orchestrator, AAI (Authorization, Authentication) and OneData (Distributed Storage System). Moreover, the Future Gateway portal, based on Liferay, provides a user-friendly interface where the user can configure the simulations. Due to the data approach of INDIGO, the developed solutions can contribute to managing the full data life cycle of a project, thanks to different tools to manage datasets or even metadata. Furthermore, the cloud environment contributes to providing a dynamic, scalable and easy-to-use framework for non-IT expert users.
    This framework is potentially capable of automating the processing of

  20. Discharge Fee Policy Analysis: A Computable General Equilibrium (CGE) Model of Water Resources and Water Environments

    OpenAIRE

    Guohua Fang; Ting Wang; Xinyi Si; Xin Wen; Yu Liu

    2016-01-01

    To alleviate increasingly serious water pollution and shortages in developing countries, various kinds of policies have been implemented by local governments. It is vital to quantify and evaluate the performance and potential economic impacts of these policies. This study develops a Computable General Equilibrium (CGE) model to simulate the regional economic and environmental effects of discharge fees. Firstly, water resources and water environment factors are separated from the input and out...

  1. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and high variability in the populations of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach, and the problem is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firings. The reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. Our new algorithm reduces the computational cost of selecting the next reaction firing and of updating the reaction propensities.

  2. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  3. A TRUSTWORTHY CLOUD FORENSICS ENVIRONMENT

    OpenAIRE

    Zawoad , Shams; Hasan , Ragib

    2015-01-01

    Part 5: CLOUD FORENSICS; International audience; The rapid migration from traditional computing and storage models to cloud computing environments has made it necessary to support reliable forensic investigations in the cloud. However, current cloud computing environments often lack support for forensic investigations and the trustworthiness of evidence is often questionable because of the possibility of collusion between dishonest cloud providers, users and forensic investigators. This chapt...

  4. Perceptions and experiences of, and outcomes for, university students in culturally diversified dyads in a computer-supported collaborative learning environment

    NARCIS (Netherlands)

    Popov, Vitaliy; Noroozi, Omid; Barrett, Jennifer B.; Biemans, Harm J A; Teasley, Stephanie D.; Slof, Bert; Mulder, Martin

    2014-01-01

    The introduction of computer-supported collaborative learning (CSCL), specifically into intercultural learning environments, mirrors the largely internet-based and intercultural workplace of many professionals. This paper utilized a mixed methods approach to examine differences between students'

  6. Uma perspectiva computacional sobre catálise enzimática / A computational perspective on enzymatic catalysis

    Directory of Open Access Journals (Sweden)

    Guilherme M. Arantes

    2008-01-01

    Full Text Available Enzymes are extremely efficient catalysts. Here, part of the mechanisms proposed to explain this catalytic power will be compared to quantitative experimental results and computer simulations. Influence of the enzymatic environment over species along the reaction coordinate will be analysed. Concepts of transition state stabilisation and reactant destabilisation will be confronted. Divided site model and near-attack conformation hypotheses will also be discussed. Molecular interactions such as covalent catalysis, general acid-base catalysis, electrostatics, entropic effects, steric hindrance, quantum and dynamical effects will also be analysed as sources of catalysis. Reaction mechanisms, in particular that catalysed by protein tyrosine phosphatases, illustrate the concepts.

  7. Iodine in the environment revisited

    International Nuclear Information System (INIS)

    Christiansen, J.V.; Carlsen, L.

    1989-05-01

    The report gives an overview of the environmental cycle of iodine, especially focusing on the possible reactions responsible for the retention of iodine in the terrestrial environment. During the last two decades, evidence for the presence of iodine in soil in organically bound form has been presented. The major part of inorganic iodine in the terrestrial environment will, under the physical and chemical conditions normally prevailing, exist as iodide. No evidence for a direct reaction between iodide and organic material has been presented, whereas strong support for the involvement of microbial activity in the formation of organic iodine compounds in soil has been obtained. Incorporation of iodine into humic substances as a result of enzymatic catalysis, involving an enzyme of the peroxidase group, appears reasonable. It is concluded that microbiological activity involving extracellular enzymes is most probably responsible for the possible retention of iodine in the terrestrial environment. It is suggested that these reactions should be studied experimentally in detail. (author) 3 tabs., 2 ills., 51 refs

  8. Distributed metadata in a high performance computing environment

    Science.gov (United States)

    Bent, John M.; Faibish, Sorin; Zhang, Zhenhua; Liu, Xuezhao; Tang, Haiying

    2017-07-11

    A computer-executable method, system, and computer program product for managing metadata in a distributed storage system, wherein the distributed storage system includes one or more burst buffers enabled to operate with a distributed key-value store, the computer-executable method, system, and computer program product comprising receiving a request for metadata associated with a block of data stored in a first burst buffer of the one or more burst buffers in the distributed storage system, wherein the metadata is associated with a key-value, determining which of the one or more burst buffers stores the requested metadata, and upon determination that a first burst buffer of the one or more burst buffers stores the requested metadata, locating the key-value in a portion of the distributed key-value store accessible from the first burst buffer.
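
    The "determine which burst buffer stores the metadata" step can be done without a central directory if placement is deterministic; the hash-mod scheme below is a hypothetical illustration, not the patent's actual placement policy.

```python
import hashlib

def owner_buffer(key, buffers):
    """Deterministic placement: hash the metadata key and map it onto the
    list of burst-buffer nodes, so any node can locate the owner without
    consulting a central directory."""
    h = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return buffers[h % len(buffers)]

buffers = ["bb0", "bb1", "bb2", "bb3"]
owner = owner_buffer("/flash/block/0001", buffers)   # hypothetical key
```

    In practice, rendezvous or consistent hashing would be preferred when buffers join or leave, since plain modulo placement remaps most keys on any membership change.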

  9. Detailed Reaction Kinetics for CFD Modeling of Nuclear Fuel Pellet Coating for High Temperature Gas-Cooled Reactors

    International Nuclear Information System (INIS)

    Battaglia, Francine

    2008-01-01

    The research project was related to the Advanced Fuel Cycle Initiative and was in direct alignment with advancing knowledge in the area of Nuclear Fuel Development related to the use of TRISO fuels for high-temperature reactors. The importance of properly coating nuclear fuel pellets received a renewed interest for the safe production of nuclear power to help meet the energy requirements of the United States. High-temperature gas-cooled nuclear reactors use fuel in the form of coated uranium particles, and it is the coating process that was of importance to this project. The coating process requires four coating layers to retain radioactive fission products from escaping into the environment. The first layer consists of porous carbon and serves as a buffer layer to attenuate the fission and accommodate the fuel kernel swelling. The second (inner) layer is of pyrocarbon and provides protection from fission products and supports the third layer, which is silicon carbide. The final (outer) layer is also pyrocarbon and provides a bonding surface and protective barrier for the entire pellet. The coating procedures for the silicon carbide and the outer pyrocarbon layers require knowledge of the detailed kinetics of the reaction processes in the gas phase and at the surfaces where the particles interact with the reactor walls. The intent of this project was to acquire detailed information on the reaction kinetics for the chemical vapor deposition (CVD) of carbon and silicon carbide on uranium fuel pellets, including the location of transition state structures, evaluation of the associated activation energies, and the use of these activation energies in the prediction of reaction rate constants. After the detailed reaction kinetics were determined, the reactions were implemented and tested in a computational fluid dynamics model, MFIX.
    The intention was to find a reduced mechanism set to reduce the computational time for a simulation, while still providing accurate results.

  10. Effective dynamics along given reaction coordinates, and reaction rate theory.

    Science.gov (United States)

    Zhang, Wei; Hartmann, Carsten; Schütte, Christof

    2016-12-22

    In molecular dynamics and related fields one considers dynamical descriptions of complex systems in full (atomic) detail. In order to reduce the overwhelming complexity of realistic systems (high dimension, large timescale spread, limited computational resources) the projection of the full dynamics onto some reaction coordinates is examined in order to extract statistical information like free energies or reaction rates. In this context, the effective dynamics that is induced by the full dynamics on the reaction coordinate space has attracted considerable attention in the literature. In this article, we contribute to this discussion: we first show that if we start with an ergodic diffusion process whose invariant measure is unique then these properties are inherited by the effective dynamics. Then, we give equations for the effective dynamics, discuss whether the dominant timescales and reaction rates inferred from the effective dynamics are accurate approximations of such quantities for the full dynamics, and compare our findings to results from approaches like Mori-Zwanzig, averaging, or homogenization. Finally, by discussing the algorithmic realization of the effective dynamics, we demonstrate that recent algorithmic techniques like the "equation-free" approach and the "heterogeneous multiscale method" can be seen as special cases of our approach.
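
    The projection onto a reaction coordinate can be illustrated with a toy overdamped Langevin simulation: sample the 2-D potential V(x, y) = (x^2 - 1)^2 + y^2 and histogram the coordinate xi(x, y) = x; up to -kT log, the histogram estimates the free energy along xi. The potential, parameters, and estimator below are illustrative, not taken from the article.

```python
import math, random

def sample_effective_density(n_steps=200_000, dt=0.01, kT=0.3, seed=7):
    """Euler-Maruyama sampling of overdamped Langevin dynamics in
    V(x, y) = (x^2 - 1)^2 + y^2, histogramming the reaction coordinate
    xi(x, y) = x on 40 bins over [-2, 2)."""
    rng = random.Random(seed)
    x, y = 1.0, 0.0
    sigma = math.sqrt(2.0 * kT * dt)      # noise amplitude per step
    hist = [0] * 40
    for _ in range(n_steps):
        fx = -4.0 * x * (x * x - 1.0)     # -dV/dx
        fy = -2.0 * y                     # -dV/dy
        x += fx * dt + sigma * rng.gauss(0.0, 1.0)
        y += fy * dt + sigma * rng.gauss(0.0, 1.0)
        if -2.0 <= x < 2.0:
            hist[int((x + 2.0) * 10)] += 1
    return hist

hist = sample_effective_density()
```

    The two wells at x = +/-1 dominate the histogram, and F(x) ~ -kT ln hist(x) recovers the double-well free-energy profile of the effective dynamics along xi.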

  11. Ubiquitous Computing in Physico-Spatial Environments

    DEFF Research Database (Denmark)

    Dalsgård, Peter; Eriksson, Eva

    2007-01-01

    Interaction design of pervasive and ubiquitous computing (UC) systems must take into account physico-spatial issues as technology is implemented into our physical surroundings. In this paper we discuss how one conceptual framework for understanding interaction in context, Activity Theory (AT...

  12. Calculation of reaction energies and adiabatic temperatures for waste tank reactions

    International Nuclear Information System (INIS)

    Burger, L.L.

    1995-10-01

    Continual concern has been expressed over potentially hazardous exothermic reactions that might occur in Hanford Site underground waste storage tanks. These tanks contain many different oxidizable compounds covering a wide range of concentrations. The chemical hazards are a function of several interrelated factors, including the amount of energy (heat) produced, how fast it is produced, and the thermal absorption and heat transfer properties of the system. The reaction path(s) will determine the amount of energy produced and kinetics will determine the rate that it is produced. The tanks also contain many inorganic compounds inert to oxidation. These compounds act as diluents and can inhibit exothermic reactions because of their heat capacity and thus, in contrast to the oxidizable compounds, provide mitigation of hazardous reactions. In this report the energy that may be released when various organic and inorganic compounds react is computed as a function of the reaction-mix composition and the temperature. The enthalpy, or integrated heat capacity, of these compounds and various reaction products is presented as a function of temperature; the enthalpy of a given mixture can then be equated to the energy release from various reactions to predict the maximum temperature which may be reached. This is estimated for several different compositions. Alternatively, the amounts of various diluents required to prevent the temperature from reaching a critical value can be estimated. Reactions taking different paths, forming different products such as N2O in place of N2, are also considered, as are reactions where an excess of caustic is present. Oxidants other than nitrate and nitrite are considered briefly.
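
    The report's energy balance (reaction heat equated to the integrated heat capacity of products and diluents) reduces, for constant heat capacities, to T_ad = T0 + Q / sum(n_i * Cp_i). The sketch below uses that simplification with invented numbers; the report itself integrates temperature-dependent enthalpies.

```python
def adiabatic_temperature(q_released, moles_cp, t0=298.15):
    """T_ad = T0 + Q / sum(n_i * Cp_i), assuming constant heat capacities.
    q_released is in J; moles_cp maps species -> (moles, Cp in J/(mol K))."""
    total_cp = sum(n * cp for n, cp in moles_cp.values())
    return t0 + q_released / total_cp

# invented mix: 1 mol N2 product, NaNO3 as an inert diluent
t_ad = adiabatic_temperature(400e3, {"N2": (1.0, 29.1), "NaNO3": (5.0, 93.0)})
# doubling the diluent soaks up the same heat over a larger heat capacity
t_less = adiabatic_temperature(400e3, {"N2": (1.0, 29.1), "NaNO3": (10.0, 93.0)})
```

    The second call illustrates the mitigation effect the abstract describes: more inert diluent lowers the maximum temperature the mixture can reach.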

  14. On Medium Chemical Reaction in Diffusion-Based Molecular Communication: a Two-Way Relaying Example

    OpenAIRE

    Farahnak-Ghazani, Maryam; Aminian, Gholamali; Mirmohseni, Mahtab; Gohari, Amin; Nasiri-Kenari, Masoumeh

    2016-01-01

    Chemical reactions are a prominent feature of molecular communication (MC) systems, with no direct parallels in wireless communications. While chemical reactions may be used inside the transmitter nodes, the receiver nodes or the communication medium, we focus in this paper on their utility in the medium. Such chemical reactions can be used to perform computation over the medium as molecules diffuse and react with each other (physical-layer computation). We propose the use of chemical reactions for...

  15. On-line computing in a classified environment

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.

    1982-01-01

    Westinghouse Hanford Company (WHC) recently developed a Department of Energy (DOE) approved real-time, on-line computer system to control nuclear material. The system simultaneously processes both classified and unclassified information. Implementation of this system required application of many security techniques. The system has a secure, but user friendly interface. Many software applications protect the integrity of the data base from malevolent or accidental errors. Programming practices ensure the integrity of the computer system software. The audit trail and the reports generation capability record user actions and status of the nuclear material inventory

  16. Advances in quantum and molecular mechanical (QM/MM) simulations for organic and enzymatic reactions.

    Science.gov (United States)

    Acevedo, Orlando; Jorgensen, William L

    2010-01-19

    Application of combined quantum and molecular mechanical (QM/MM) methods focuses on predicting activation barriers and the structures of stationary points for organic and enzymatic reactions. Characterization of the factors that stabilize transition structures in solution and in enzyme active sites provides a basis for design and optimization of catalysts. Continued technological advances allowed for expansion from prototypical cases to mechanistic studies featuring detailed enzyme and condensed-phase environments with full integration of the QM calculations and configurational sampling. This required improved algorithms featuring fast QM methods, advances in computing changes in free energies including free-energy perturbation (FEP) calculations, and enhanced configurational sampling. In particular, the present Account highlights development of the PDDG/PM3 semi-empirical QM method, computation of multi-dimensional potentials of mean force (PMF), incorporation of on-the-fly QM in Monte Carlo (MC) simulations, and a polynomial quadrature method for efficient modeling of proton-transfer reactions. The utility of this QM/MM/MC/FEP methodology is illustrated for a variety of organic reactions including substitution, decarboxylation, elimination, and pericyclic reactions. A comparison to experimental kinetic results on medium effects has verified the accuracy of the QM/MM approach in the full range of solvents from hydrocarbons to water to ionic liquids. Corresponding results from ab initio and density functional theory (DFT) methods with continuum-based treatments of solvation reveal deficiencies, particularly for protic solvents. Also summarized in this Account are three specific QM/MM applications to biomolecular systems: (1) a recent study that clarified the mechanism for the reaction of 2-pyrone derivatives catalyzed by macrophomate synthase as a tandem Michael-aldol sequence rather than a Diels-Alder reaction, (2) elucidation of the mechanism of action of fatty
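    The free-energy perturbation machinery the abstract credits can be illustrated with a minimal single-step FEP estimate via the Zwanzig formula, dA = -kT ln <exp(-(U1-U0)/kT)>_0. The sketch below is an illustrative assumption, not the Account's QM/MM setup: a 1-D harmonic "solute" in reduced units, sampled exactly instead of by Metropolis Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0                       # 1/kT in reduced units (assumed)

def U0(x):                       # reference potential: harmonic well at 0
    return 0.5 * x**2

def U1(x):                       # perturbed potential: same well shifted to 0.2
    return 0.5 * (x - 0.2)**2

# Sample the reference ensemble exactly (Gaussian for a harmonic well);
# a QM/MM study would instead use Monte Carlo configurations of the full system.
x = rng.normal(0.0, 1.0, size=200_000)

# Zwanzig formula: dA = -kT * ln < exp(-beta * (U1 - U0)) >_0
dA = -np.log(np.mean(np.exp(-beta * (U1(x) - U0(x))))) / beta

# Shifting a harmonic well does not change its partition function,
# so the exact free-energy difference here is zero.
print(f"dA = {dA:.4f}")
```

    In practice a path between states is broken into many such small perturbation windows, and the per-window estimates are summed; the estimator degrades rapidly when the two ensembles overlap poorly.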

  17. Aerothermodynamic Environments Definition for the Mars Science Laboratory Entry Capsule

    Science.gov (United States)

    Edquist, Karl T.; Dyakonov, Artem A.; Wright, Michael J.; Tang, Chun Y.

    2007-01-01

    An overview of the aerothermodynamic environments definition status is presented for the Mars Science Laboratory entry vehicle. The environments are based on Navier-Stokes flowfield simulations on a candidate aeroshell geometry and worst-case entry heating trajectories. Uncertainties for the flowfield predictions are based primarily on available ground data since Mars flight data are scarce. The forebody aerothermodynamics analysis focuses on boundary layer transition and turbulent heating augmentation. Turbulent transition is expected prior to peak heating, a first for Mars entry, resulting in augmented heat flux and shear stress at the same heatshield location. Afterbody computations are also shown with and without interference effects of reaction control system thruster plumes. Including uncertainties, analysis predicts that the heatshield may experience peaks of 225 W/sq cm for turbulent heat flux, 0.32 atm for stagnation pressure, and 400 Pa for turbulent shear stress. The afterbody heat flux without thruster plume interference is predicted to be 7 W/sq cm on the backshell and 10 W/sq cm on the parachute cover. If the reaction control jets are fired near peak dynamic pressure, the heat flux at localized areas could reach as high as 76 W/sq cm on the backshell and 38 W/sq cm on the parachute cover, including uncertainties. The final flight environments used for hardware design will be updated for any changes in the aeroshell configuration, heating design trajectories, or uncertainties.

  18. The Effectiveness of Self-Regulated Learning Scaffolds on Academic Performance in Computer-Based Learning Environments: A Meta-Analysis

    Science.gov (United States)

    Zheng, Lanqin

    2016-01-01

    This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…

  19. Measurements by activation foils and comparative computations by MCNP code

    International Nuclear Information System (INIS)

    Kyncl, J.

    2008-01-01

    Systematic study of the radioactive-waste minimisation problem is the subject of the SPHINX project. Its premise is that burning or transmutation of the problematic part of the waste inventory will be carried out in a nuclear reactor whose fuel is in the form of liquid fluorides. Within the project, several experiments were performed with a so-called inserted experimental channel. The channel was filled with the fluoride mixture, surrounded by six fuel assemblies with moderator, and placed into the LR-0 reactor vessel. This configuration was brought to a critical state, and measurements with activation-foil detectors were carried out at selected positions in the inserted channel. The main aim of the measurements was to determine reaction rates for these detectors. For the evaluation of the experiment, comparative computations were performed with the MCNP4a code. The results show that the computed reaction rates very often differ substantially from the measured values. This contribution analyses the reasons for these differences from the point of view of Monte Carlo computation. Analysis of concrete cases shows that the inaccuracy of the computed reaction rate is caused mostly by three circumstances: (1) the space region occupied by the detector is relatively very small; (2) the microscopic effective cross-section R(E) of the reaction changes strongly with energy precisely in the energy interval that contributes most to the reaction; and (3) in the energy interval that contributes most to the reaction rate, the error of the computed neutron flux is large. Together these circumstances mean that computing the reaction rate to reasonable accuracy places extreme demands on computing time. (Author)
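    The variance problem the abstract describes can be sketched with a toy Monte Carlo tally of a reaction rate R = ∫ φ(E) σ(E) dE. When σ(E) is a narrow resonance, only the rare sampled energies that land inside it contribute, so the relative statistical error shrinks very slowly with the number of histories. The flat unit flux and Lorentzian resonance below are illustrative assumptions, not the LR-0 configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

E0, gamma = 1.0, 1e-3                 # resonance energy and width (arbitrary units)

def sigma(E):
    """Narrow Lorentzian resonance: changes strongly over a tiny energy interval."""
    return gamma**2 / ((E - E0) ** 2 + gamma**2)

# Flat flux phi(E) = 1/2 on [0, 2]; sampling E from that same density makes the
# per-history tally contribution simply sigma(E).
N = 1_000_000
E = rng.uniform(0.0, 2.0, size=N)
scores = sigma(E)

R_hat = scores.mean()                               # Monte Carlo estimate of R
rel_se = scores.std(ddof=1) / np.sqrt(N) / R_hat    # relative standard error

# Analytic value for comparison: integral of the Lorentzian is ~pi*gamma,
# times the flux level 1/2.
R_true = 0.5 * np.pi * gamma

print(f"R_hat = {R_hat:.3e}, relative std. error after {N:,} histories: {rel_se:.1%}")
```

    Even after a million histories the relative error remains above one percent here; halving it needs four times as many histories, which is the "extreme demands on computing time" the abstract points to when the scoring region is small and the cross-section is resonance-dominated.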

  20. de novo computational enzyme design.

    Science.gov (United States)

    Zanghellini, Alexandre

    2014-10-01

    Recent advances in systems and synthetic biology as well as metabolic engineering are poised to transform industrial biotechnology by allowing us to design cell factories for the sustainable production of valuable fuels and chemicals. To deliver on their promises, such cell factories, as much as their brick-and-mortar counterparts, will require appropriate catalysts, especially for classes of reactions that are not known to be catalyzed by enzymes in natural organisms. A recently developed methodology, de novo computational enzyme design, can be used to create enzymes catalyzing novel reactions. Here we review the different classes of chemical reactions for which active protein catalysts have been designed, as well as the results of detailed biochemical and structural characterization studies. We also discuss how combining de novo computational enzyme design with more traditional protein engineering techniques can alleviate the shortcomings of state-of-the-art computational design techniques and create novel enzymes with catalytic proficiencies on par with natural enzymes. Copyright © 2014 Elsevier Ltd. All rights reserved.