WorldWideScience

Sample records for computer simulation analysis

  1. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron-induced (d,p) and proton-induced (p,α) reactions, respectively.

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, compared with traditional hosting, cloud computing is at a disadvantage in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
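
    The two-part structure described here (stochastic demand per request type plus enterprise resource constraints) maps naturally onto a queueing-style discrete event simulation. Below is a minimal sketch of that idea in Python; it is not the authors' model, and the request types, rates, service times, and server count are invented for illustration.

    ```python
    import heapq
    import random

    # Demand side: Poisson arrivals per request type (mean arrivals/minute).
    RATES = {"web": 5.0, "batch": 0.5}
    SERVICE = {"web": 0.8, "batch": 10.0}   # mean service time, minutes
    SERVERS = 4                             # resource constraint
    HORIZON = 8 * 60.0                      # one 8-hour shift

    def simulate(seed=0):
        rng = random.Random(seed)
        events = []                         # (time, kind, request_type) min-heap
        for rtype, rate in RATES.items():
            heapq.heappush(events, (rng.expovariate(rate), "arrival", rtype))
        busy, queue, waits = 0, [], []
        while events:
            t, kind, rtype = heapq.heappop(events)
            if t > HORIZON:
                break
            if kind == "arrival":
                # schedule the next arrival of this type, queue the current one
                heapq.heappush(events, (t + rng.expovariate(RATES[rtype]), "arrival", rtype))
                queue.append((t, rtype))
            else:                           # a departure frees a server
                busy -= 1
            while queue and busy < SERVERS: # start whatever work now fits
                t0, qtype = queue.pop(0)
                busy += 1
                waits.append(t - t0)
                heapq.heappush(events, (t + rng.expovariate(1.0 / SERVICE[qtype]), "departure", qtype))
        return sum(waits) / len(waits)

    print(f"mean wait: {simulate():.2f} min")
    ```

    Sweeping SERVERS over a range of values yields the kind of provisioning trade-off curve such a study derives for overall effectiveness.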

  3. Computer simulation of ion beam analysis of laterally inhomogeneous materials

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, M.

    2016-03-15

    The program STRUCTNRA, for the simulation of charged-particle spectra in ion beam analysis of arbitrary two-dimensional distributions of materials, is described. The code is validated by comparison with experimental backscattering data from a silicon grating on tantalum at different orientations and incident angles. Simulated spectra for several types of rough thin layers and for a chessboard-like arrangement of materials, as an example of a multi-phase agglomerate material, are presented. Ambiguities between backscattering spectra from two-dimensional and one-dimensional sample structures are discussed.

  4. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    The objective of this research work is to study the variation of parameters such as stress, strain, temperature and force during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps, and the output values have been computed at each deformation step. The results of the simulation have been graphically represented, and suitable corrective measures are recommended where the simulation results do not agree with the theoretical values. This computer simulation approach can significantly improve productivity and reduce the energy consumption of the overall process for components manufactured by closed die forging, contributing to efforts to reduce global warming.
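
    The stepwise scheme described above can be sketched as a simple loop that advances the die in increments and updates the process outputs at each step. The power-law flow stress and all coefficients below are placeholder assumptions, not the constitutive model used in the paper.

    ```python
    import math

    K, n = 200e6, 0.15                  # assumed strength coefficient (Pa), hardening exponent
    H0, STROKE, N = 0.060, 0.030, 100   # initial billet height, die travel (m), steps
    AREA0 = 0.002                       # initial cross-section (m^2); volume assumed constant

    height = H0
    for step in range(1, N + 1):
        height -= STROKE / N                  # advance the die by one increment
        strain = math.log(H0 / height)        # true compressive strain
        stress = K * strain ** n              # flow stress from the power law
        force = stress * AREA0 * H0 / height  # force, neglecting friction and barreling
        if step % 25 == 0:
            print(f"step {step:3d}: strain={strain:.3f}  "
                  f"stress={stress / 1e6:6.1f} MPa  force={force / 1e3:7.1f} kN")
    ```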

  5. Wavelet analysis on paleomagnetic (and computer simulated) VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    We present a Continuous Wavelet Transform (CWT) analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic data and geodynamo simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra, and the local analysis helped disclose their statistical significance. Even if this feature could be an indication of orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of excursions in the series. The comparison between the paleomagnetic and simulated spectra shows a similarity in the high-frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first-derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to provide a guide for understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normal polarity of the series and its inherent noise. The relative chronology of the paleomagnetic data reversals was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of polarity stability intervals are found to be similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.
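
    The two-wavelet procedure described above is straightforward to reproduce on a synthetic series. The sketch below uses the PyWavelets package on a fabricated "VGP latitude" signal; the scales and the toy signal are assumptions, not the paper's data.

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    rng = np.random.default_rng(1)
    t = np.linspace(0, 100, 2048)              # arbitrary time axis
    vgp = 80 * np.sign(np.sin(2 * np.pi * t / 35)) + 10 * rng.standard_normal(t.size)

    scales = np.arange(1, 128)
    dt = t[1] - t[0]

    # Morlet wavelet: spectral content and its evolution in time
    coef_m, _ = pywt.cwt(vgp, scales, "morl", sampling_period=dt)
    global_spectrum = (np.abs(coef_m) ** 2).mean(axis=1)   # time-averaged power per scale
    print("global power, first 5 scales:", np.round(global_spectrum[:5], 1))

    # First derivative of a Gaussian: sensitive to sharp transitions
    coef_g, _ = pywt.cwt(vgp, scales, "gaus1", sampling_period=dt)
    edge_score = np.abs(coef_g[5])             # response at one fixed scale
    picks = np.sort(t[np.argsort(edge_score)[-5:]])
    print("strongest candidate reversal/excursion times:", picks)
    ```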

  6. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
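
    As a rough illustration of what such an engine automates, the toy Python workflow below runs a job through a few named stages with progress notification and time-stamped, labeled output (NeuroManager itself is MATLAB-based and uses 22 stages; the stage names and layout here are invented).

    ```python
    import datetime
    import pathlib

    def stage_prepare(job):
        job["params_file"] = f"{job['name']}_params.txt"

    def stage_submit(job):
        job["status"] = "submitted"      # stand-in for a scheduler/ssh call

    def stage_collect(job):
        job["status"] = "done"

    STAGES = [stage_prepare, stage_submit, stage_collect]

    def run_workflow(job):
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        outdir = pathlib.Path(f"results/{job['name']}-{stamp}")   # automatic labeling
        outdir.mkdir(parents=True, exist_ok=True)
        for stage in STAGES:
            stage(job)
            print(f"[{job['name']}] finished {stage.__name__}")  # progress notification
        (outdir / "provenance.txt").write_text(repr(job))        # track evolution
        return outdir

    run_workflow({"name": "purkinje_cell_sim"})
    ```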

  8. Analysis of the Utilization of Machinery in the Production Process Using Computer Simulation

    Directory of Open Access Journals (Sweden)

    Fedorko Gabriel

    2017-01-01

    For the efficient operation of any production process, individual machines and equipment must be used to the maximum possible extent. For this reason, it is necessary to know their utilization and to take measures to ensure their effective use. For performing such an analysis and designing subsequent measures, the computer simulation method is very effective, e.g. simulation in the Tecnomatix Plant Simulation program.
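
    The core quantity in such an analysis is per-machine utilization, i.e. the fraction of available time a machine spends processing. A minimal calculation is sketched below; the event log is fabricated, whereas a real study would export it from the simulation tool.

    ```python
    SHIFT_MINUTES = 480
    busy_log = {   # machine -> busy intervals (start, end) in minutes, invented data
        "press_01": [(0, 120), (140, 300), (330, 470)],
        "lathe_02": [(10, 90), (200, 260)],
    }

    for machine, intervals in busy_log.items():
        busy = sum(end - start for start, end in intervals)
        print(f"{machine}: utilization {busy / SHIFT_MINUTES:.0%}")
    ```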

  9. A handheld computer-aided diagnosis system and simulated analysis

    Science.gov (United States)

    Su, Mingjian; Zhang, Xuejun; Liu, Brent; Su, Kening; Louie, Ryan

    2016-03-01

    This paper describes a Computer Aided Diagnosis (CAD) system based on a cellphone and a distributed cluster. One of the bottlenecks in building a CAD system for clinical practice is the storage and processing of mass pathology samples freely among different devices, and conventional pattern matching algorithms on large-scale image sets are very time-consuming. Distributed computation on a cluster has demonstrated the ability to relieve this bottleneck. We developed a system enabling the user to compare a mass image to a dataset with a feature table by sending datasets to the Generic Data Handler Module in Hadoop, where pattern recognition is undertaken for the detection of skin diseases. Single and combination retrieval algorithms in a data pipeline based on the MapReduce framework are used in our system in order to make an optimal choice between recognition accuracy and system cost. The profile of the lesion area is drawn manually by doctors on the screen, and this pattern is then uploaded to the server. In our evaluation experiment, a diagnosis hit rate of 75% was obtained in tests on 100 patients with skin illnesses. Our system has the potential to help in building a novel medical image dataset by collecting large amounts of gold-standard data during medical diagnosis. Once the project is online, participants are free to join, and an abundant sample dataset will soon be gathered, sufficient for learning. These results demonstrate that our technology is very promising and is expected to be used in clinical practice.
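
    The retrieval step lends itself to the classic map/shuffle/reduce pattern: mappers score dataset entries against the query, and a reducer keeps the best matches. The self-contained Python sketch below mimics that flow in-process; the two-number "feature table" and Euclidean scoring are stand-ins, not the system's Hadoop pipeline.

    ```python
    from collections import defaultdict

    dataset = {"case_a": (0.8, 0.1), "case_b": (0.3, 0.7), "case_c": (0.75, 0.2)}
    query = (0.7, 0.15)   # features of the uploaded lesion profile (invented)

    def mapper(item):
        name, feat = item
        dist = sum((q - f) ** 2 for q, f in zip(query, feat)) ** 0.5
        yield "candidates", (dist, name)

    def reducer(key, values):
        return key, sorted(values)[:2]       # keep the two nearest matches

    shuffled = defaultdict(list)             # the "shuffle" phase
    for item in dataset.items():
        for key, value in mapper(item):
            shuffled[key].append(value)

    for key, values in shuffled.items():
        print(reducer(key, values))
    ```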

  10. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States)]; Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States)]; Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States)]; Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)]

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
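
    The "time dependence between time steps" challenge is easy to see in skeleton form: the regulator tap chosen at one second is the starting state for the next, so the year of power flows cannot simply be split up and solved in parallel. Everything below (the one-bus voltage expression, deadband controller, and load shape) is invented for illustration; solve_power_flow stands in for a real solver.

    ```python
    def solve_power_flow(load_kw, tap):
        # toy bus voltage in p.u.: sags with load, rises with regulator tap
        return 1.0 - 0.02 * load_kw / 100.0 + 0.00625 * tap

    def regulator(voltage, tap):
        if voltage < 0.995 and tap < 16:     # deadband control with tap limits
            return tap + 1
        if voltage > 1.005 and tap > -16:
            return tap - 1
        return tap

    # one day at 1-second resolution: 80 kW base load, 120 kW in the evening
    loads = [80 + 40 * (17 <= (t // 3600) % 24 < 21) for t in range(86400)]

    tap, tap_changes = 0, 0
    for load in loads:                       # strictly sequential sweep
        voltage = solve_power_flow(load, tap)
        new_tap = regulator(voltage, tap)
        tap_changes += new_tap != tap
        tap = new_tap
    print("tap changes over the simulated day:", tap_changes)
    ```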

  11. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models at various scales. These correlated multi-scale structural system tasks are distributed among the clusters, connected together in a multi-level hierarchy, and then coordinated over the Internet. The software framework supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing simulation in multi-scale structural analysis.

  12. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  13. Features of development and analysis of the simulation model of a multiprocessor computer system

    Directory of Open Access Journals (Sweden)

    O. M. Brekhov

    2017-01-01

    Over the past decade, multiprocessor systems have been widely applied in computer technology. At present, multi-core processors are found not only in supercomputers but also in the vast majority of mobile devices. This creates the need for students to learn the basic principles of their construction and functioning. One of the possible methods for analyzing the operation of multiprocessor systems is simulation modeling. Its use contributes to a better understanding of the effect of workload and structure parameters on performance. The article considers the features of the development of a simulation model for estimating the time characteristics of a multiprocessor computer system, as well as the use of the regenerative method of model analysis. The characteristics of a software implementation of the inverse kinematics solution of a robot are adopted as the workload. The task consists in determining the rotations in the joints of the manipulator from the known angular and linear position of its grip. An analytical algorithm for solving the problem was chosen, namely the method of simple kinematic relations. The work of the program is characterized by the presence of parallel calculations, during which resource conflicts arise between the processor cores involved in simultaneous access to memory via a common bus. In connection with the high information connectivity between parallel running programs, it is assumed that all processing cores use shared memory. The simulation model takes into account probabilistic memory accesses and tracks emerging queues to shared resources. The collected statistics reveal the productive and overhead time costs of the program implementation for each processor core involved. The simulation results show the unevenness of kernel utilization, downtime in queues to shared resources and temporary losses while waiting for other cores due to information dependencies. The results of the simulation are estimated by the
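
    The shared-bus contention the model captures can be sketched with a few lines of queueing logic: cores issue memory requests at random times, and a request that finds the bus busy waits. The timings and Poisson access pattern below are illustrative assumptions, not the article's parameters.

    ```python
    import random

    CORES, BUS_SERVICE_NS, SIM_NS = 4, 10.0, 1_000_000
    ACCESS_RATE = 0.02            # memory requests per ns, per core (assumed)

    rng = random.Random(0)
    arrivals = []
    for core in range(CORES):
        t = 0.0
        while (t := t + rng.expovariate(ACCESS_RATE)) < SIM_NS:
            arrivals.append((t, core))
    arrivals.sort()

    bus_free, waits = 0.0, []
    for t, core in arrivals:
        start = max(t, bus_free)  # queue if the bus is still busy
        waits.append(start - t)
        bus_free = start + BUS_SERVICE_NS

    util = len(waits) * BUS_SERVICE_NS / SIM_NS
    print(f"bus utilization {util:.0%}, mean queue wait {sum(waits)/len(waits):.1f} ns")
    ```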

  14. Computer Simulation and Data Analysis in Molecular Biology and Biophysics: An Introduction Using R

    CERN Document Server

    Bloomfield, Victor

    2009-01-01

    This book provides an introduction, suitable for advanced undergraduates and beginning graduate students, to two important aspects of molecular biology and biophysics: computer simulation and data analysis. It introduces tools to enable readers to learn and use fundamental methods for constructing quantitative models of biological mechanisms, both deterministic and with some elements of randomness, including complex reaction equilibria and kinetics, population models, and regulation of metabolism and development; to understand how concepts of probability can help in explaining important features of DNA sequences; and to apply a useful set of statistical methods to analysis of experimental data from spectroscopic, genomic, and proteomic sources. These quantitative tools are implemented using the free, open source software program R. R provides an excellent environment for general numerical and statistical computing and graphics, with capabilities similar to Matlab®. Since R is increasingly used in bioinformat...

  15. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    OpenAIRE

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source three additional factors are varied, namely the velocity, details of the computer simulated person, and the CFD model of the wind channel. The personal exposure is found to be highly dependent on the r...

  16. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab]

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  17. Neuromechanic: a computational platform for simulation and analysis of the neural control of movement

    Science.gov (United States)

    Bunderson, Nathan E.; Bingham, Jeffrey T.; Sohn, M. Hongchul; Ting, Lena H.; Burkholder, Thomas J.

    2015-01-01

    Neuromusculoskeletal models solve the basic problem of determining how the body moves under the influence of external and internal forces. Existing biomechanical modeling programs often emphasize dynamics with the goal of finding a feed-forward neural program to replicate experimental data or of estimating force contributions of individual muscles. The computation of rigid-body dynamics, muscle forces, and activation of the muscles are often performed separately. We have developed an intrinsically forward computational platform (Neuromechanic, www.neuromechanic.com) that explicitly represents the interdependencies among rigid body dynamics, frictional contact, muscle mechanics, and neural control modules. This formulation has significant advantages for optimization and forward simulation, particularly with application to neural controllers with feedback or regulatory features. Explicit inclusion of all state dependencies allows calculation of system derivatives with respect to kinematic states as well as muscle and neural control states, thus affording a wealth of analytical tools, including linearization, stability analyses and calculation of initial conditions for forward simulations. In this review, we describe our algorithm for generating state equations and explain how they may be used in integration, linearization and stability analysis tools to provide structural insights into the neural control of movement. PMID:23027632
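
    The linearization step mentioned above reduces, in its simplest form, to building the Jacobian of the state derivative function and inspecting its eigenvalues. The sketch below applies finite differences to a toy damped pendulum, not to Neuromechanic's musculoskeletal states.

    ```python
    import numpy as np

    def f(x):
        # toy dynamics x' = f(x): damped pendulum state [theta, omega]
        theta, omega = x
        return np.array([omega, -9.81 * np.sin(theta) - 0.5 * omega])

    def jacobian(f, x0, eps=1e-6):
        J = np.zeros((x0.size, x0.size))
        fx = f(x0)
        for i in range(x0.size):
            dx = np.zeros(x0.size)
            dx[i] = eps
            J[:, i] = (f(x0 + dx) - fx) / eps   # one-sided finite difference
        return J

    x_eq = np.array([0.0, 0.0])                 # hanging equilibrium
    eig = np.linalg.eigvals(jacobian(f, x_eq))
    print("stable" if np.all(eig.real < 0) else "unstable", eig)
    ```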

  18. Performance evaluation using SYSTID time domain simulation [computer-aided design and analysis for communication systems]

    Science.gov (United States)

    Tranter, W. H.; Ziemer, R. E.; Fashano, M. J.

    1975-01-01

    This paper reviews the SYSTID technique for performance evaluation of communication systems using time-domain computer simulation. An example program illustrates the language. The inclusion of both Gaussian and impulse noise models makes accurate simulation possible in a wide variety of environments. A very flexible postprocessor makes possible accurate and efficient performance evaluation.
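
    The two noise models mentioned are easy to illustrate on a sampled waveform: additive Gaussian noise plus sparse, large-amplitude impulse events. The parameters and the hard-decision detector below are assumptions for illustration, not SYSTID internals.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 10_000
    bits = rng.integers(0, 2, N)
    tx = 2.0 * bits - 1.0                        # +/-1 signalling

    rx = tx + 0.4 * rng.standard_normal(N)       # Gaussian channel noise
    impulses = rng.random(N) < 0.01              # 1% impulse events
    rx[impulses] += 5.0 * rng.choice([-1.0, 1.0], impulses.sum())

    ber = np.mean((rx > 0) != (bits == 1))       # hard-decision detection
    print(f"bit error rate: {ber:.4f}")
    ```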

  20. Plutonium Worlds. Fast Breeders, Systems Analysis and Computer Simulation in the Age of Hypotheticality

    Directory of Open Access Journals (Sweden)

    Sebastian Vehlken

    2014-09-01

    This article examines the media history of one of the hallmark civil nuclear energy programs in West Germany: the development of Liquid Metal Fast Breeder Reactor (LMFBR) technology. Promoted as a kind of perpetuum mobile of the Atomic Age, the "German Manhattan Project" not only imported big-science thinking. In its context, nuclear technology was also put forth as an avant-garde of scientific inquiry, dealing with the most complex and critical technological endeavors. In the face of the risks of nuclear technology, the German physicist Wolf Häfele thus announced a novel epistemology of "hypotheticality". In a context where traditional experimental engineering strategies became inappropriate, he called for the application of advanced media technologies: Computer Simulations (CS) and Systems Analysis (SA) generated computerized spaces for the production of knowledge. In the course of the German Fast Breeder program, such methods had a twofold impact. On the one hand, Häfele emphasized, as the "father of the German Fast Breeder", the utilization of CS for the actual planning and construction of the novel reactor type. On the other, namely as the director of the department of Energy Systems at the International Institute for Applied Systems Analysis (IIASA), Häfele advised SA-based projections of energy consumption. These computerized scenarios provided the rationale for the conception of Fast Breeder programs as viable and necessary alternative energy sources in the first place. By focusing on the role of the involved CS techniques, the paper thus investigates the intertwined systems thinking behind the planning and construction of nuclear facilities and the design of large-scale energy consumption and production scenarios in the 1970s and 1980s, as well as their conceptual afterlives in our contemporary era of computer simulation.

  1. Applications of computer simulation, nuclear reactions and elastic scattering to surface analysis of materials

    Directory of Open Access Journals (Sweden)

    Pacheco de Carvalho, J. A.

    2008-08-01

    This article involves computer simulation and surface analysis by nuclear techniques, which are non-destructive. Both the “energy method of analysis” for nuclear reactions and elastic scattering are used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. The method is successfully applied to thick flat targets of graphite, quartz and sapphire and to targets containing thin films of aluminium oxide. Depth profiles of 12C and 16O nuclei are determined using (d,p) and (d,α) deuteron-induced reactions. Rutherford and resonance elastic scattering of (4He)+ ions are also used.

  2. Three-dimensional computer simulation of radiostereometric analysis (RSA) in distal radius fractures.

    Science.gov (United States)

    Madanat, Rami; Moritz, Niko; Aro, Hannu T

    2007-01-01

    Physical phantom models have conventionally been used to determine the accuracy and precision of radiostereometric analysis (RSA) in various orthopaedic applications. Using a phantom model of a fracture of the distal radius, it has previously been shown that RSA is a highly accurate and precise method for measuring both translation and rotation in three dimensions (3-D). The main shortcoming of a physical phantom model is its inability to mimic complex 3-D motion. The goal of this study was to create a realistic computer model for preoperative planning of RSA studies and to test the accuracy of RSA in measuring complex movements in fractures of the distal radius using this new model. The 3-D computer model was created from a set of tomographic scans. The simulation of the radiographic imaging was performed using ray-tracing software (POV-Ray). RSA measurements were performed according to the standard protocol. Using a two-part fracture model (AO/ASIF type A2), it was found that for simple movements along one axis, translations in the range of 25 µm to 2 mm could be measured with an accuracy of ±2 µm. Rotations ranging from 16° to 2° could be measured with an accuracy of ±0.015°. Using a three-part fracture model, the corresponding values of accuracy were found to be ±4 µm and ±0.031° for translation and rotation, respectively. For complex 3-D motion in a three-part fracture model (AO/ASIF type C1), the accuracy was ±6 µm for translation and ±0.120° for rotation. The use of 3-D computer modelling can provide a method for preoperative planning of RSA studies in complex fractures of the distal radius and in other clinical situations in which the RSA method is applicable.

  3. Computer simulation analysis of normal and abnormal development of the mammalian diaphragm

    Directory of Open Access Journals (Sweden)

    Bodenstein Lawrence

    2006-02-01

    Abstract. Background: Congenital diaphragmatic hernia (CDH) is a birth defect with significant morbidity and mortality. Knowledge of diaphragm morphogenesis and the aberrations leading to CDH is limited. Although classical embryologists described the diaphragm as arising from the septum transversum, pleuroperitoneal folds (PPF), esophageal mesentery and body wall, animal studies suggest that the PPF is the major, if not sole, contributor to the muscular diaphragm. Recently, a posterior defect in the PPF has been identified when the teratogen nitrofen is used to induce CDH in fetal rodents. We describe use of a cell-based computer modeling system (Nudge++™) to study diaphragm morphogenesis. Methods and results: Key diaphragmatic structures were digitized from transverse serial sections of paraffin-embedded mouse embryos at embryonic days 11.5 and 13. Structure boundaries and simulated cells were combined in the Nudge++™ software. Model cells were assigned putative behavioral programs, and these programs were progressively modified to produce a diaphragm consistent with the observed anatomy in rodents. Homology between our model and recent anatomical observations occurred under the following simulation conditions: (1) cell mitoses are restricted to the edge of growing tissue; (2) cells near the chest wall remain mitotically active; (3) mitotically active non-edge cells migrate toward the chest wall; and (4) movement direction depends on clonal differentiation between anterior and posterior PPF cells. Conclusion: With the PPF as the sole source of mitotic cells, an early defect in the PPF evolves into a posteromedial diaphragm defect, similar to that of the rodent nitrofen CDH model. A posterolateral defect, as occurs in human CDH, would be more readily recreated by invoking other cellular contributions. Our results suggest that recent reports of PPF-dominated diaphragm morphogenesis in the rodent may not be strictly applicable to man. The ability to

  4. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    Science.gov (United States)

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6,000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide HTML and web browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models, while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD, combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews of clinical trends and connections, while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.

  5. A Comparative Analysis of Student Learning with a Collaborative Computer Simulation of the Cardiopulmonary System

    Science.gov (United States)

    Keyser, Diane

    2010-01-01

    The aim of this study was to design a series of assessments that could be used to compare the learning gains of high school students studying the cardiopulmonary system using traditional methods with those of students who used a collaborative computer simulation called "Mr. Vetro". Five teachers and 264 high school biology students participated in the study. The students were in…

  6. A tool for the morphological analysis of mixtures of lipids and water in computer simulations

    NARCIS (Netherlands)

    Fuhrmans, Marc; Marrink, Siewert-Jan

    When analyzing computer simulations of mixtures of lipids and water, the questions to be answered are often of a morphological nature. They can deal with global properties, like the kind of phase that is adopted or the presence or absence of certain key features like a pore or stalk, or with local

  7. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin Nasaruddin

    2013-09-01

    This paper simulates and analyzes noise in multimedia transmission in a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by optimal design. In developing multimedia transmission for a computer network, a simulation tool is essential for analyzing the effectiveness of the various transmitted services. In this paper, implementation models are proposed to analyze multimedia transmission in representative OCDMA computer networks by using MATLAB Simulink tools. Simulation results of the models are discussed, including spectra of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOCs is practically evaluated. Furthermore, system performance is also evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that the system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters in designing and implementing multimedia transmission in OCDMA computer networks.
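
    The defining property of such codes is that the periodic correlation between distinct (0,1) codewords, and the off-peak autocorrelation of each codeword, stay at or below one. The two length-13, weight-3 codewords below form a standard (13,3,1) optical orthogonal code used purely for demonstration; they are not the VW-OOCs constructed in the paper.

    ```python
    import numpy as np

    def periodic_corr(a, b):
        # correlation of a against all cyclic shifts of b
        return [int(np.dot(a, np.roll(b, s))) for s in range(len(a))]

    def from_positions(pos, length=13):
        code = np.zeros(length, dtype=int)
        code[list(pos)] = 1
        return code

    c1 = from_positions({0, 1, 4})    # chip positions whose differences never repeat
    c2 = from_positions({0, 2, 7})

    print("max cross-correlation:", max(periodic_corr(c1, c2)))             # 1
    print("max off-peak autocorrelation:", max(periodic_corr(c1, c1)[1:]))  # 1
    ```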

  9. Grid connected integrated community energy system. Phase II: final stage 2 report. Cost benefit analysis, operating costs and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  10. Biomass Gasifier for Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)]

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulation of systems that include biomass gasification. Reliable input data are paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: fixed bed, fluidised bed and entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better-informed choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There were fewer answers to the survey than hoped for, which could otherwise have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response rate. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  11. Quantitative Analysis of Accuracy of Voidage Computations in CFD-DEM Simulations

    Directory of Open Access Journals (Sweden)

    H. A. Khawaja

    2012-06-01

    CFD-DEM (Computational Fluid Dynamics – Discrete Element Modelling) is a two-phase flow numerical modelling technique, where the Eulerian method is used for the fluid and the Lagrangian method for the particles. The two phases are coupled by a fluid-particle interaction force (i.e. the drag force), which is computed using a correlation. In a two-phase flow, one critical parameter is the voidage (or void fraction), which is defined as the ratio of the volume occupied by the fluid to the total volume. In a CFD-DEM simulation the local voidage is computed by calculating the volume of particles in a given fluid cell. For spherical particles, this computation is difficult when a particle is on the boundary of fluid cells. In this case, it is usual to compute the volume of a particle in a fluid cell approximately. One such approximation divides the volume of a particle into each cell in the same ratio as an equivalent cube of width equal to the particle diameter. Whilst this approach is computationally straightforward, the approximation introduces an error in the voidage computation. Here we estimate the error by comparing the approximate volume calculation with an exact (numerical) computation of the volume of a particle in a fluid cell. The results show that the error varies with the position of the particle relative to the cell boundary. A new approach is suggested which limits the error to less than 2.5%, without significantly increasing the computational complexity.
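
    The error being quantified can be reproduced for a single particle straddling a cell boundary: compare the cube-based split of its volume with a numerical (here Monte Carlo) split. The radius, position, and unit cells below are arbitrary choices, not values from the paper.

    ```python
    import numpy as np

    R, CX = 0.2, 0.95      # sphere radius and centre x; cells split at x = 1.0

    # cube approximation: fraction of a cube of width 2R lying left of the boundary
    frac_cube = float(np.clip((1.0 - (CX - R)) / (2 * R), 0.0, 1.0))

    # "exact" split: Monte Carlo estimate of the sphere volume left of the boundary
    rng = np.random.default_rng(0)
    pts = rng.uniform(-R, R, size=(200_000, 3))
    inside = (pts ** 2).sum(axis=1) <= R ** 2
    frac_exact = float(np.mean(pts[inside, 0] + CX < 1.0))

    print(f"cube split {frac_cube:.3f} vs exact split {frac_exact:.3f}; "
          f"error {abs(frac_cube - frac_exact):.3f}")
    ```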

  12. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  13. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, and a Cray
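
    At its core, such a simulator stores the 2^n complex amplitudes of the state vector and applies gates as small linear maps; the engineering challenge the paper addresses is distributing those amplitudes across many nodes. A single-process, two-qubit illustration (preparing a Bell state) is sketched below; it shows the arithmetic, not the parallelization.

    ```python
    import numpy as np

    n = 2
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                                   # |00>

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate

    def apply_1q(state, gate, target, n):
        # view the target qubit as its own tensor axis, contract with the gate
        psi = np.moveaxis(state.reshape([2] * n), target, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        return np.moveaxis(psi, 0, target).reshape(-1)

    state = apply_1q(state, H, 0, n)                 # H on qubit 0

    # CNOT(control=0, target=1): swap the |10> and |11> amplitudes
    psi = state.reshape(2, 2)
    psi[1, 0], psi[1, 1] = psi[1, 1], psi[1, 0]
    state = psi.reshape(-1)

    print(np.round(state, 3))   # [0.707 0 0 0.707]: a Bell state
    ```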

  14. Challenges to Computational Aerothermodynamic Simulation and Validation for Planetary Entry Vehicle Analysis

    Science.gov (United States)

    Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    Challenges to computational aerothermodynamic (CA) simulation and validation of hypersonic flow over planetary entry vehicles are discussed. Entry, descent, and landing (EDL) of high mass to Mars is a significant driver of new simulation requirements. These requirements include simulation of large deployable, flexible structures and interactions with reaction control system (RCS) and retro-thruster jets. Simulation of radiation and ablation coupled to the flow solver continues to be a high priority for planetary entry analyses, especially for return to Earth and outer planet missions. Three research areas addressing these challenges are emphasized. The first addresses the need to obtain accurate heating on unstructured tetrahedral grid systems to take advantage of flexibility in grid generation and grid adaptation. A multi-dimensional inviscid flux reconstruction algorithm is defined that is oriented with local flow topology as opposed to grid. The second addresses coupling of radiation and ablation to the hypersonic flow solver--flight- and ground-based data are used to provide limited validation of these multi-physics simulations. The third addresses the challenges of retro-propulsion simulation and the criticality of grid adaptation in this application. The evolution of CA to become a tool for innovation of EDL systems requires a successful resolution of these challenges.

  15. Virtual Environment Computer Simulations to Support Human Factors Engineering and Operations Analysis for the RLV Program

    Science.gov (United States)

    Lunsford, Myrtis Leigh

    1998-01-01

    The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.

  16. Computer simulation tools for X-ray analysis: scattering and diffraction methods

    CERN Document Server

    Morelhão, Sérgio Luiz

    2016-01-01

    The main goal of this book is to break down the huge barrier of difficulties faced by beginners from many fields (Engineering, Physics, Chemistry, Biology, Medicine, Material Science, etc.) in using X-rays as an analytical tool in their research. Besides fundamental concepts, MatLab routines are provided, showing how to test and implement the concepts. The major difficulty in analyzing materials by X-ray techniques is that the analysis strongly depends on simulation software. This book teaches users how to construct a library of routines to simulate scattering and diffraction by almost any kind of sample. It provides a young student with knowledge that would take more than 20 years to acquire by working on X-rays and relying on the available textbooks. In this book, fundamental concepts in applied X-ray physics are demonstrated through available computer simulation tools. Using MatLab, more than eighty routines are developed for solving the proposed exercises, most of which can be directly used in experimental...

  17. Reversible simulation of irreversible computation

    Science.gov (United States)

    Li, Ming; Tromp, John; Vitányi, Paul

    1998-09-01

    Computer computations are generally irreversible, while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple ‘reversible’ pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
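
    Bennett's strategy, as analyzed through the pebble game, can be stated in a few recursive lines: to pebble node 2^k, pebble the midpoint, pebble the end from it, then reversibly erase the midpoint, paying roughly 3^k moves to keep only k+1 checkpoints alive. The sketch below counts moves and peak pebbles; it is one reading of the standard game, not code from the paper.

    ```python
    moves = []

    def pebble(base, k):
        """Place a pebble on node base + 2**k, given one at node base."""
        if k == 0:
            moves.append(("put", base + 1))
            return
        mid = base + 2 ** (k - 1)
        pebble(base, k - 1)      # advance to the midpoint checkpoint
        pebble(mid, k - 1)       # advance from midpoint to the end
        unpebble(base, k - 1)    # reversibly erase the midpoint

    def unpebble(base, k):
        """Undo pebble(base, k), removing the pebble at base + 2**k."""
        if k == 0:
            moves.append(("remove", base + 1))
            return
        mid = base + 2 ** (k - 1)
        pebble(base, k - 1)
        unpebble(mid, k - 1)
        unpebble(base, k - 1)

    pebble(0, 4)                 # reach node 16 = 2**4

    on, peak = set(), 0
    for op, node in moves:
        if op == "put":
            on.add(node)
        else:
            on.discard(node)
        peak = max(peak, len(on))
    print(f"{len(moves)} moves, peak of {peak} pebbles (a naive replay stores 16)")
    ```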

  18. Efficient SDH Computation In Molecular Simulations Data.

    Science.gov (United States)

    Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir

    2012-10-01

    Analysis of large particle or molecular simulation data is an integral part of basic-science research. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the radial distribution function (RDF), and it takes quadratic time to compute with the naive approach. Naive SDH computation is even more expensive when computed continuously over a period of time to analyze simulation systems. Tree-based SDH computation is a popular approach. In this paper we look at different tree-based SDH computation techniques and briefly discuss their performance. We present different strategies to improve the performance of these techniques. Specifically, we study density map (DM) based SDH computation techniques. A DM is essentially a grid dividing the simulated space into cells (3D cubes) of equal size (volume), which can be easily implemented by augmenting a quad-tree (or oct-tree) index. DMs are used in various configurations to compute SDH continuously over snapshots of the simulation system. The performance improvements from some of these configurations are presented in this paper. We also present the effect of utilizing the computation power of graphics processing units (GPUs) in computing SDH.
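
    For reference, the naive quadratic-time SDH that these techniques aim to beat fits in a few lines; the random points and bucket width below are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    points = rng.uniform(0.0, 10.0, size=(1_000, 3))
    bucket_width = 0.5

    # all-pairs distances via broadcasting: O(n^2) time and memory
    diff = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(points), k=1)          # each unordered pair once

    max_dist = 10.0 * np.sqrt(3.0)                  # box diagonal
    hist, edges = np.histogram(
        dists[iu], bins=np.arange(0.0, max_dist + bucket_width, bucket_width))
    print(hist[:10])
    ```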

  19. FEL simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method of how accelerating and parallelizing code

  20. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  1. Diagnosis of Combined Cycle Power Plant Based on Thermoeconomic Analysis: A Computer Simulation Study

    Directory of Open Access Journals (Sweden)

    Hoo-Suk Oh

    2017-11-01

    In this study, diagnosis of a 300-MW combined cycle power plant under faulty conditions was performed using a thermoeconomic method called modified productive structure analysis. The malfunction and dysfunction, unit cost of irreversibility and lost cost flow rate for each component were calculated for the cases of pre-fixed malfunctions and for the reference conditions. A commercial simulation package, GateCycle™ (version 6.1.2), was used to estimate the thermodynamic properties under faulty conditions. The relative malfunction (RMF) and the relative difference in the lost cost flow rate between real operation and reference conditions (RDLC) were found to be effective indicators for the identification of faulty components. Simulation results revealed that a 0.5% degradation in the isentropic efficiency of the air compressor, 2% in the gas turbine, 2% in the steam turbine and a 2% degradation in energy loss in the heat exchangers can be identified. Multi-fault scenarios that can be detected by the indicators were also considered. The additional lost exergy due to these types of faulty components, detectable by RMF or RDLC, is less than 5% of the exergy lost in the components in the normal condition.

  2. Finite element analysis of TAVI: Impact of native aortic root computational modeling strategies on simulation outcomes.

    Science.gov (United States)

    Finotello, Alice; Morganti, Simone; Auricchio, Ferdinando

    2017-09-01

    In the last few years, several studies, each with a different aim and level of modeling detail, have been proposed to investigate transcatheter aortic valve implantation (TAVI) with finite elements. The present work focuses on patient-specific finite element modeling of the aortic valve complex. In particular, we aim at investigating how different modeling strategies, in terms of material models/properties and discretization procedures, can impact analysis results. Four different choices both for the mesh size (from 20 k to 200 k elements) and for the material model (from rigid to hyperelastic anisotropic) are considered. Different approaches to modeling calcifications are also taken into account. Post-operative CT data of the real implant are used as the reference solution, with the aim of outlining a trade-off between computational model complexity and reliability of the results.

  3. Processing experimental data and analysis of simulation codes from Nuclear Physics using distributed and parallel computing

    CERN Document Server

    Niculescu, Mihai; Hristov, Peter

    In this thesis we tried to show the impact of new technologies on scientific work in the large field of heavy ion physics and, as a case study, we present the implementation of the event plane method on a highly parallel technology: the graphics processor. By the end of the thesis, a comparison of the analysis results with the elliptic flow published by ALICE is made. In Chapter 1 we presented the computing needs at the heavy ion physics experiment ALICE and showed the current state of software and technologies. The new technologies, available for some time and presented in Chapter 2, offer new performance capabilities and have generated a trend of preparing for the new wave of technologies and software which, most indicators show, will dominate the future. This was not disregarded by the scientific community and, in consequence, Section 2.2 shows the rising interest in the new technologies within the High Energy Physics community. A real case study was needed to better understand how the new technologies can be applied in HEP and aniso...

  4. Product wastage from modern human growth hormone administration devices: a laboratory and computer simulation analysis

    Directory of Open Access Journals (Sweden)

    Pollock RF

    2013-08-01

    Full Text Available Richard F Pollock,1 Yujun Qian,2 Tami Wisniewski,3 Lisa Seitz,4 Anne-Marie Kappelgaard2 1Ossian Health Economics and Communications GmbH, Basel, Switzerland; 2Novo Nordisk A/S, Bagsværd, Denmark; 3Novo Nordisk Inc, Princeton, NJ, USA; 4Novo Nordisk Pharma GmbH, Mainz, Germany Background: Treatment of growth hormone disorders typically involves daily injections of human growth hormone (GH) over many years, incurring substantial costs. We assessed the extent of undesired GH loss due to leakage in the course of pen preparation prior to injection, and differences between the prescribed dose, based on patient weight, and the actual delivered dose based on pen dosing increments in five GH administration devices. Methods: Norditropin® prefilled FlexPro®, NordiFlex®, NordiLet®, and durable NordiPen®/SimpleXx® 5 mg pens (Novo Nordisk A/S, Bagsværd, Denmark) and durable Omnitrope® Pen-5 devices (Sandoz, Holzkirchen, Germany) were tested (n = 40 for each device type). Product wastage was measured in accordance with validated protocols in an ISO (International Organization for Standardization) 11608-1 and Good Manufacturing Practice compliant laboratory. The average mass of wasted GH from each device type was measured in simulations of dripping with the needle attached prior to injection and while setting a dose. Statistical significance (P < 0.05) was confirmed by Student's t-test, and a model was constructed to estimate mean annual GH wastage per patient in cohorts of pediatric patients with GH disorders. Results: Mean GH mass wasted with the needle on prior to injection was 0.0 µg with Norditropin pens, relative to 98 µg with Omnitrope Pen-5. During dose dialing, 0.0–2.3 µg of GH was lost with Norditropin pens versus 0.8 µg with Omnitrope Pen-5. All Norditropin and Omnitrope device comparisons were statistically significant. Modeling GH wastage in a US cohort showed 5.5 mg of annual GH wastage per patient with FlexPro versus 43.6 mg with

  5. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  6. Performance evaluation by simulation and analysis with applications to computer networks

    CERN Document Server

    Chen, Ken

    2015-01-01

    This book is devoted to the most used methodologies for performance evaluation: simulation using specialized software and mathematical modeling. An important part is dedicated to the simulation, particularly in its theoretical framework and the precautions to be taken in the implementation of the experimental procedure. These principles are illustrated by concrete examples achieved through operational simulation languages (OMNeT++, OPNET). Presented as a complementary approach, the mathematical method is essential for the simulation. Both methodologies based largely on the theory of

  7. Social interaction, globalization and computer-aided analysis: a practical guide to developing social simulation

    CERN Document Server

    Osherenko, Alexander

    2014-01-01

    This thorough, multidisciplinary study discusses the findings of social interaction and social simulation using understandable global examples. It shows the reader how to acquire intercultural data, illustrating each step with descriptive comments and program code.

  8. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp 46-54.

  9. Computational Simulation and Analysis of Mutations: Nucleotide Fixation, Allelic Age and Rare Genetic Variations in Population

    Science.gov (United States)

    Qiu, Shuhao

    2015-01-01

    In order to investigate the complexity of mutations, a computational approach named Genome Evolution by Matrix Algorithms ("GEMA") has been implemented. GEMA models genomic changes, taking into account hundreds of mutations within each individual in a population. By modeling entire human chromosomes, GEMA precisely mimics real…

  10. Evolutionary Games and Computer Simulations

    CERN Document Server

    Huberman, Bernardo A.; Glance, Natalie S.

    1993-01-01

    Abstract: The prisoner's dilemma has long been considered the paradigm for studying the emergence of cooperation among selfish individuals. Because of its importance, it has been studied through computer experiments as well as in the laboratory and by analytical means. However, there are important differences between the way a system composed of many interacting elements is simulated by a digital machine and the manner in which it behaves when studied in real experiments. In some instances, these disparities can be marked enough so as to cast doubt on the implications of cellular automata type simulations for the study of cooperation in social systems. In particular, if such a simulation imposes space-time granularity, then its ability to describe the real world may be compromised. Indeed, we show that the results of digital simulations regarding territoriality and cooperation differ greatly when time is discrete as opposed to continuous.
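
    The granularity effect this record describes is easy to probe in a toy model. The sketch below, a hypothetical illustration rather than the authors' code, plays a spatial prisoner's dilemma on a lattice under two update schemes: synchronous (all sites at once, fully discrete time) and random-sequential (one site at a time, a common proxy for continuous time). The payoff values and the imitate-the-best-neighbour rule are illustrative assumptions.

        import numpy as np

        # Illustrative prisoner's dilemma payoffs (T > R > P >= S); values assumed.
        T, R, P, S = 1.9, 1.0, 0.0, 0.0

        def payoffs(grid):
            # Total payoff of each site against its four nearest neighbours
            # (periodic boundaries); grid entries: 1 = cooperate, 0 = defect.
            total = np.zeros(grid.shape)
            for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
                nb = np.roll(grid, shift, axis=axis)
                total += np.where(grid == 1, np.where(nb == 1, R, S),
                                  np.where(nb == 1, T, P))
            return total

        def imitate(grid, pay, i, j):
            # Site (i, j) adopts the strategy of its best-scoring neighbour.
            n = grid.shape[0]
            best_s, best_p = grid[i, j], pay[i, j]
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = (i + di) % n, (j + dj) % n
                if pay[ii, jj] > best_p:
                    best_s, best_p = grid[ii, jj], pay[ii, jj]
            return best_s

        def step_synchronous(grid):
            # Discrete time: all sites update at once from one payoff snapshot.
            pay = payoffs(grid)
            n = grid.shape[0]
            return np.array([[imitate(grid, pay, i, j) for j in range(n)]
                             for i in range(n)])

        def step_asynchronous(grid, rng, updates):
            # Random-sequential updating, one site at a time.
            n = grid.shape[0]
            for _ in range(updates):
                pay = payoffs(grid)
                i, j = rng.integers(n, size=2)
                grid[i, j] = imitate(grid, pay, i, j)
            return grid

        rng = np.random.default_rng(0)
        grid = rng.integers(0, 2, size=(30, 30))
        sync = grid.copy()
        for _ in range(50):
            sync = step_synchronous(sync)
        asyn = step_asynchronous(grid.copy(), rng, updates=50 * 30 * 30)
        print(sync.mean(), asyn.mean())   # cooperator fractions often differ

    Starting both runs from the same initial lattice and comparing the surviving cooperator fractions illustrates the paper's point that the choice of time granularity alone can change the outcome.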

  11. Manned systems utilization analysis (study 2.1). Volume 3: LOVES computer simulations, results, and analyses

    Science.gov (United States)

    Stricker, L. T.

    1975-01-01

    The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.

  12. Fourier Analysis: Creating A “Virtual Laboratory” Using Computer Simulation

    Directory of Open Access Journals (Sweden)

    Jeff Butterfield

    1998-01-01

    Full Text Available At times the desire for specialized laboratory apparatus to support class activities outstrips the available resources. When this is the case, the instructor must look for creative alternatives to help meet the desired objectives. This report examines how a virtual laboratory was created to model and analyze high-speed networking signals in a LAN class using a spreadsheet simulation. The students were able to print out various waveforms (e.g., signals of different frequencies/network media) that are similar to output from test equipment that would have otherwise been cost prohibitive. The activity proved to be valuable in helping students to understand an otherwise difficult concept that is central to modern networking applications. Such simulation is not limited to network signals, but may be applicable in many situations where the artifact under study may be described mathematically.
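
    A virtual laboratory of this kind is easy to reproduce outside a spreadsheet. The short sketch below synthesizes a square-wave network signal from its odd Fourier harmonics and plots the waveform for several harmonic counts, mimicking the printouts described above; the fundamental frequency and harmonic counts are assumed values.

        import numpy as np
        import matplotlib.pyplot as plt

        f0 = 1.0e6                        # hypothetical fundamental frequency, Hz
        t = np.linspace(0.0, 3.0 / f0, 2000)

        def square_wave(t, n_harmonics):
            # Fourier series of a square wave: odd harmonics, amplitude 4/(pi*n).
            y = np.zeros_like(t)
            for n in range(1, 2 * n_harmonics, 2):
                y += (4.0 / (np.pi * n)) * np.sin(2.0 * np.pi * n * f0 * t)
            return y

        for k in (1, 3, 10):              # more harmonics -> sharper edges
            plt.plot(t * 1e6, square_wave(t, k), label=f"{k} harmonic(s)")
        plt.xlabel("time (µs)"); plt.ylabel("amplitude"); plt.legend()
        plt.show()

    Truncating the series at a few harmonics shows visually how a band-limited medium rounds off the edges of a digital signal.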

  13. Computer simulation of martensitic transformations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ping [Univ. of California, Berkeley, CA (United States)

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
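
    The incremental rule in this abstract, transforming at each step the cell whose transformation most lowers the free energy, is a greedy selection loop. The sketch below shows only that loop structure, with a stand-in energy function; the actual model evaluates the elastic strain energy from linear elasticity for an elastically homogeneous medium, and the numbers here are arbitrary.

        import numpy as np

        N = 32
        state = np.zeros((N, N), dtype=bool)   # False = parent phase, True = martensite

        def free_energy_change(state, i, j):
            # Stand-in energy bookkeeping: a constant chemical driving force plus
            # a crude neighbour term in place of the elastic interaction energy.
            chemical = -1.0                     # hypothetical driving force
            neighbours = sum(state[(i + di) % N, (j + dj) % N]
                             for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            return chemical + 0.4 * neighbours  # 0.4 is an arbitrary scale

        # Athermal, incremental transformation: transform the cell with the
        # largest free-energy decrease; stop when no cell lowers the energy.
        while True:
            candidates = [(free_energy_change(state, i, j), i, j)
                          for i in range(N) for j in range(N) if not state[i, j]]
            if not candidates:
                break
            dF, i, j = min(candidates)
            if dF >= 0.0:
                break
            state[i, j] = True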

  14. DEVELOPMENT BY COMPUTATIONAL SIMULATION AND PERFORMANCE ANALYSIS OF AN EQUAL CHANNEL ANGULAR PRESSING DIE

    Directory of Open Access Journals (Sweden)

    Phillip Springer

    2013-06-01

    Full Text Available Critical geometric parameters of an Equal Channel Angular Pressing (ECAP) die suitable for plate processing were optimized by making use of the DEFORM™ software. Following the simulation, a die was manufactured and employed in the processing of 7 mm thick Al AA 1050 plates. Software output included the pressing forces and the equivalent deformation distribution within the plates after one and four ECAP passes. Calculated pressing forces against the punch displacement were compared with the actual forces, whilst the deformation distribution was validated by Vickers microhardness measurements. From tensile tests and microstructural observation of the processed plates, the die performance was found to be quite satisfactory.

  15. Computer Simulations of Space Plasmas

    Science.gov (United States)

    Goertz, C. K.

    Even a superficial scanning of the latest issues of the Journal of Geophysical Research reveals that numerical simulation of space plasma processes is an active and growing field. The complexity and sophistication of numerically produced “data” rivals that of the real stuff. Sometimes numerical results need interpretation in terms of a simple “theory,” very much as the results of real experiments and observations do. Numerical simulation has indeed become a third independent tool of space physics, somewhere between observations and analytic theory. There is thus a strong need for textbooks and monographs that report the latest techniques and results in an easily accessible form. This book is an attempt to satisfy this need. The editors want it not only to be “proceedings of selected lectures (given) at the first ISSS (International School of Space Simulations in Kyoto, Japan, November 1-2, 1982) but rather…a form of textbook of computer simulations of space plasmas.” This is, of course, a difficult task when many authors are involved. Unavoidable redundancies and differences in notation may confuse the beginner. Some important questions, like numerical stability, are not discussed in sufficient detail. The recent book by C.K. Birdsall and A.B. Langdon (Plasma Physics via Computer Simulations, McGraw-Hill, New York, 1985) is more complete and detailed and seems more suitable as a textbook for simulations. Nevertheless, this book is useful to the beginner and the specialist because it contains not only descriptions of various numerical techniques but also many applications of simulations to space physics phenomena.

  16. Computer simulation of electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)

    1994-04-14

    Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).

  17. Computer simulation of nonequilibrium processes

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then, how are these concepts to be realized in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  18. Gate-error analysis in simulations of quantum computers with transmon qubits

    Science.gov (United States)

    Willsch, D.; Nocon, M.; Jin, F.; De Raedt, H.; Michielsen, K.

    2017-12-01

    In the model of gate-based quantum computation, the qubits are controlled by a sequence of quantum gates. In superconducting qubit systems, these gates can be implemented by voltage pulses. The success of implementing a particular gate can be expressed by various metrics such as the average gate fidelity, the diamond distance, and the unitarity. We analyze these metrics of gate pulses for a system of two superconducting transmon qubits coupled by a resonator, a system inspired by the architecture of the IBM Quantum Experience. The metrics are obtained by numerical solution of the time-dependent Schrödinger equation of the transmon system. We find that the metrics reflect systematic errors that are most pronounced for echoed cross-resonance gates, but that none of the studied metrics can reliably predict the performance of a gate when used repeatedly in a quantum algorithm.
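
    One of the metrics named here, the average gate fidelity, has a simple closed form when both the target gate and the implemented gate are unitary: F_avg = (d + |Tr(U†V)|²) / (d(d+1)) for a d-dimensional system. The sketch below evaluates it for a CNOT with a small, hypothetical spurious phase; the paper's metrics are extracted from the full transmon dynamics, not from this idealized unitary case.

        import numpy as np

        def average_gate_fidelity(U_target, V_actual):
            # Closed form for unitary channels:
            #   F_avg = (d + |Tr(U^dagger V)|^2) / (d * (d + 1))
            d = U_target.shape[0]
            overlap = np.trace(U_target.conj().T @ V_actual)
            return (d + abs(overlap) ** 2) / (d * (d + 1))

        # Target CNOT vs. a CNOT followed by a small spurious Z-rotation
        # on the target qubit (eps is an assumed over-rotation angle).
        CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
        eps = 0.02
        Z = np.diag([1.0, np.exp(1j * eps)])
        V = CNOT @ np.kron(np.eye(2), Z)
        print(average_gate_fidelity(CNOT, V))   # slightly below 1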

  19. Transport of energetic electrons in solids: computer simulation with applications to materials analysis and characterization

    CERN Document Server

    Dapor, Maurizio

    2017-01-01

    This new edition describes all the mechanisms of elastic and inelastic scattering of electrons with the atoms of the target as simply as possible. The use of techniques of quantum mechanics is described in detail for the investigation of interaction processes of electrons with matter. It presents the strategies of the Monte Carlo method, as well as numerous comparisons among the results of the simulations and the experimental data available in the literature. New in this edition are the description of the Mermin theory, a comparison between the Mermin theory and the Drude theory, a discussion about the dispersion laws, and details about the calculation of the phase shifts that are used in the relativistic partial wave expansion method. The role of secondary electrons in proton cancer therapy is discussed in the chapter devoted to applications. In this context, Monte Carlo results about the radial distribution of the energy deposited in PMMA by secondary electrons generated by energetic proton beams are presented.

  20. Investigation of mass transfer intensification under power ultrasound irradiation using 3D computational simulation: A comparative analysis.

    Science.gov (United States)

    Sajjadi, Baharak; Asgharzadehahmadi, Seyedali; Asaithambi, Perumal; Raman, Abdul Aziz Abdul; Parthasarathy, Rajarathinam

    2017-01-01

    This paper aims at investigating the influence of acoustic streaming induced by low-frequency (24 kHz) ultrasound irradiation on mass transfer in a two-phase system. The main objective is to discuss the possible mass transfer improvements under ultrasound irradiation. Three analyses were conducted: i) experimental analysis of mass transfer under ultrasound irradiation; ii) comparative analysis between the results of ultrasound-assisted mass transfer and those obtained from mechanical stirring; and iii) computational analysis of the systems using 3D CFD simulation. In the experimental part, the interactive effects of liquid rheological properties, ultrasound power and superficial gas velocity on mass transfer were investigated in two different sonicators. The results were then compared with those of mechanical stirring. In the computational part, the results were illustrated as a function of acoustic streaming behaviour, fluid flow pattern, gas/liquid volume fraction and turbulence in the two-phase system, and finally the mass transfer coefficient was specified. It was found that the additional turbulence created by ultrasound played the most important role in intensifying mass transfer compared to that in the stirred vessel. Furthermore, long residence time, which depends on geometrical parameters, is another key factor for mass transfer. The results obtained in the present study would help researchers understand the role of ultrasound as an energy source and of acoustic streaming as one of the most important effects of ultrasound waves in intensifying gas-liquid mass transfer in a two-phase system, and can be a breakthrough in the design procedure as no similar studies were found in the existing literature. Copyright © 2016. Published by Elsevier B.V.

  1. Computer simulation of superionic fluorides

    CERN Document Server

    Castiglione, M

    2000-01-01

    In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF6 octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb2+ ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor... experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.

  2. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S.; Hoe, James C.

    2014-01-01

    To date, the most common simulators of computer systems are software-based, running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  3. Application of computer assisted three-dimensional simulation operation and biomechanics analysis in the treatment of sagittal craniosynostosis.

    Science.gov (United States)

    Li, Xiang; Zhu, Wanchun; He, Jintao; Di, Fei; Wang, Lei; Li, Xin; Liu, Wei; Li, Chunde; Gong, Jian

    2017-10-01

    As a surgical method to treat children with sagittal craniosynostosis, calvarial vault reconstruction is subject to some limitations. In the traditional surgical method, resection and fixation are performed entirely according to the experience of the operating surgeon, which is likely to cause individual differences, insecure fixation, configurational asymmetry, and waste of unnecessary fixture materials. This study aims to provide surgeons with objective indicators via 3D simulation combined with biomechanical calculation, so as to improve surgical efficiency. The aim of this study is to compare a preoperative strategy integrating computer-assisted 3D simulation and biomechanical calculation with the traditional strategy. A retrospective method was used to compare the effect and difference between these 2 strategies. The clinical data of 18 patients with sagittal synostosis were collected and compared. Among them, 10 patients were enrolled in Group A, treated with the traditional method, while 8 were enrolled in Group B, treated with the preoperative strategy integrating computer-assisted 3D simulation and biomechanical calculation. The two treatment methods were evaluated by investigating indexes such as length of operation, blood loss, operation cost, and postoperative complications. By comparing the cranial index, head circumference, and cranial vault asymmetry index of the two groups before and after treatment, the surgical effects of the two groups were evaluated. Moreover, biomechanical analyses for the two groups were conducted. For Group B, the length of operation was 217 ± 29.3 min, blood loss was 70 ± 11.7 ml, and operation cost was 34,495 ± 8662 ¥; for Group A, the length of operation was 276 ± 23.5 min, blood loss was 90 ± 15.5 ml, and operation cost was 25,149 ± 4133 ¥. No postoperative complication was observed for Group B, while there was 1 case of central nervous system

  4. Computational plasticity algorithm for particle dynamics simulations

    Science.gov (United States)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
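
    The special case mentioned in the abstract, explicit time stepping recovering a discrete-element-style scheme, can be sketched in a few lines: a penalty-type normal contact force is evaluated from the current configuration and integrated explicitly. All parameters below are illustrative, and the implicit contact-dynamics variant would instead solve the contact condition at each step.

        import numpy as np

        # Two discs approaching head-on along x; explicit (DEM-like) integration
        # of a penalty normal contact. Stiffness, mass, radius and time step are
        # assumed values chosen only to keep the explicit scheme stable.
        k_n, m, r, dt = 1.0e5, 1.0, 0.1, 1.0e-5
        x = np.array([0.0, 0.25])
        v = np.array([1.0, -1.0])

        for _ in range(5000):
            gap = (x[1] - x[0]) - 2 * r        # signed gap between surfaces
            f = k_n * max(0.0, -gap)           # repulsion only on overlap
            a = np.array([-f, f]) / m          # equal and opposite forces
            v += a * dt                        # symplectic Euler: explicit force
            x += v * dt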

  5. Integration of Computer Tomography and Simulation Analysis in Evaluation of Quality of Ceramic-Carbon Bonded Foam Filter

    Directory of Open Access Journals (Sweden)

    Karwiński A.

    2013-12-01

    Full Text Available Filtration of liquid casting alloys has been used in casting technologies for a long time. The large quantity of available casting filters allows using them depending on the casting technology, the dimensions of the casting and the alloys used. Technological progress in materials science allows the use of new materials in the production of ceramic filters. In this article the Computed Tomography (CT) technique was used to evaluate the thickness of branches in the cross section of a 20 ppi ceramic-carbon bonded foam filter. Then the 3D image of the foam filter was used in a computer simulation of the flow of liquid metal through the running system.

  6. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  7. Priority Queues for Computer Simulations

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management and uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, next, the highest priority item is removed, and then the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
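
    The behaviour described, collecting newly scheduled events in a temporary unsorted list and deferring all ordering work until the next event is actually needed, can be approximated with a small wrapper around a binary heap. This is an illustrative analogue only, not the patented linked-list Qheap.

        import heapq
        from itertools import count

        class QheapLike:
            # Events accumulate unsorted in a buffer; ordering work happens
            # only when the next event is requested (deferred insertion).
            def __init__(self):
                self._heap = []
                self._buffer = []
                self._seq = count()          # tie-breaker for equal timestamps

            def schedule(self, timestamp, event):
                self._buffer.append((timestamp, next(self._seq), event))

            def next_event(self):
                # Assumes at least one event is pending.
                for item in self._buffer:
                    heapq.heappush(self._heap, item)
                self._buffer.clear()
                t, _, event = heapq.heappop(self._heap)
                return t, event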

  8. Self-propagating exothermic reaction analysis in Ti/Al reactive films using experiments and computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sen, Seema, E-mail: seema.sen@tu-ilmenau.de [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany); Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Lake, Markus; Kroppen, Norman; Farber, Peter; Wilden, Johannes [Niederrhein University of Applied Science, Department of Mechanical and Process Engineering, Reinarzstraße 49, 47805 Krefeld (Germany); Schaaf, Peter [Technical University of Ilmenau, Department of Materials for Electronics, Gustav-Kirchhoff-Str. 5, 98693 Ilmenau (Germany)

    2017-02-28

    Highlights: • Development of nanoscale Ti/Al multilayer films with 1:1, 1:2 and 1:3 molar ratios. • Characterization of exothermic reaction propagation by experiments and simulation. • The reaction velocity depends on the ignition potentials and molar ratios of the films. • Only 1Ti/3Al films exhibit the unsteady reaction propagation with ripple formation. • CFD simulation shows the time dependent atom mixing and temperature flow during exothermic reaction. - Abstract: This study describes the self-propagating exothermic reaction in Ti/Al reactive multilayer foils by using experiments and computational fluid dynamics simulation. The Ti/Al foils with different molar ratios of 1Ti/1Al, 1Ti/2Al and 1Ti/3Al were fabricated by magnetron sputtering method. Microstructural characteristics of the unreacted and reacted foils were analyzed by using electronic and atomic force microscopes. After an electrical ignition, the influence of ignition potentials on reaction propagation has been experimentally investigated. The reaction front propagates with a velocity of minimum 0.68 ± 0.4 m/s and maximum 2.57 ± 0.6 m/s depending on the input ignition potentials and the chemical compositions. Here, the 1Ti/3Al reactive foil exhibits both steady state and unsteady wavelike reaction propagation. Moreover, the numerical computational fluid dynamics (CFD) simulation shows the time dependent temperature flow and atomic mixing in a nanoscale reaction zone. The CFD simulation also indicates the potentiality for simulating exothermic reaction in the nanoscale Ti/Al foil.

  9. QCE: A Simulator for Quantum Computer Hardware

    NARCIS (Netherlands)

    Michielsen, Kristel; Raedt, Hans De

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.

  10. Finding Tropical Cyclones on a Cloud Computing Cluster: Using Parallel Virtualization for Large-Scale Climate Simulation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hasenkamp, Daren; Sim, Alexander; Wehner, Michael; Wu, Kesheng

    2010-09-30

    Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas the MPI analysis job fails as soon as one node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
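
    The wrapping strategy in this record, running the unmodified sequential analysis once per input file with an outer layer providing coordination, has a single-machine analogue using a process pool. The file names and the analysis binary below are hypothetical placeholders for the cyclone-tracking program and its inputs.

        from concurrent.futures import ProcessPoolExecutor
        import subprocess

        # Hypothetical climate-model output files, one per worker task.
        inputs = [f"cam5_run_{i:03d}.nc" for i in range(64)]

        def analyze(path):
            # Run the unmodified sequential analysis on one file, as each VM
            # in the paper runs the tracking code unchanged.
            result = subprocess.run(["./tropical_cyclone_tracker", path],
                                    capture_output=True, text=True, check=False)
            return path, result.returncode

        if __name__ == "__main__":
            with ProcessPoolExecutor(max_workers=8) as pool:
                for path, rc in pool.map(analyze, inputs):
                    print(path, "ok" if rc == 0 else f"failed ({rc})")

    A failure of one task here, as in the VM-based approach, does not stop the remaining tasks from making progress.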

  11. Computer Simulation in Tomorrow's Schools.

    Science.gov (United States)

    Foster, David

    1984-01-01

    Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…

  12. Computer simulations and theory of protein translocation.

    Science.gov (United States)

    Makarov, Dmitrii E

    2009-02-17

    AFM experiments, single-molecule experimental studies of protein translocation have just started to emerge. We describe one example of a collaborative study, in which dwell times of beta-hairpin-forming peptides inside the alpha-hemolysin pore were both measured experimentally and estimated using computer simulations. Analysis of the simulated trajectories has explained the experimental finding that more stable hairpins take, on the average, longer to traverse the pore. Despite the insight we have gained, the general relationship between the structure of proteins and their resistance to mechanically driven co-translocational unfolding remains poorly understood. Future theoretical progress likely will be made in conjunction with single-molecule experiments and will require realistic models to account for specific protein-pore interactions and for solvent effects.

  13. Simulation of dynamic behaviour of a digital displacement motor using transient 3D computational fluid dynamics analysis

    DEFF Research Database (Denmark)

    Rømer, Daniel; Johansen, Per; Pedersen, Henrik C.

    2013-01-01

    A fast rotating 1500 rpm radial piston digital displacement motor connected to a 350 bar high pressure manifold is simulated by means of transient 3D CFD analysis of a single pressure chamber. The analysis includes dynamic piston and valve movement, influencing the boundaries of the fluid domain. Movement of the low and high pressure valves is coupled to fluid forces and valve actuation is included to control the valve movement according to the pressure cycle of the digital displacement motor. The fluid domain is meshed using a structured/unstructured non-conformal mesh, which is updated throughout the simulation using layering zones as required by the moving fluid boundaries. The effect of cavitation at low pressures is included by implementing a pressure dependent density, based on an effective bulk modulus model. In addition, pressure dependent oil viscosity is included in the analysis. As a result...

  14. Discrete Event Simulation

    Indian Academy of Sciences (India)

    IAS Admin

    Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012. Email: mjt@csa.iisc.ernet.in. Computers can be used to simulate the operation of complex systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple ...
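
    As a minimal concrete instance of the technique the article introduces, the sketch below simulates a single-server queue with exponential inter-arrival and service times, keeping pending events in a priority queue ordered by timestamp. The rates and horizon are arbitrary illustrative values.

        import heapq
        import random

        random.seed(1)
        ARRIVAL_RATE, SERVICE_RATE, HORIZON = 1.0, 1.25, 10_000.0

        events = [(random.expovariate(ARRIVAL_RATE), "arrival")]
        queue_len, busy, served = 0, False, 0

        while events:
            t, kind = heapq.heappop(events)
            if t > HORIZON:
                break
            if kind == "arrival":
                # schedule the next arrival, then seize the server if free
                heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
                if busy:
                    queue_len += 1
                else:
                    busy = True
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
            else:  # departure: start the next waiting customer, if any
                served += 1
                if queue_len > 0:
                    queue_len -= 1
                    heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
                else:
                    busy = False

        print(f"customers served by t={HORIZON}: {served}")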

  15. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays several frameworks exist to utilize the computation power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound only to special graphics cards. Furthermore there exist more specialized frameworks, mainly aimed at the mathematical field. The framework described here is adapted for use in multi-agent simulations. It provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  16. Computer Simulation of the Neuronal Action Potential.

    Science.gov (United States)

    Solomon, Paul R.; And Others

    1988-01-01

    A series of computer simulations of the neuronal resting and action potentials are described. Discusses the use of simulations to overcome the difficulties of traditional instruction, such as blackboard illustration, which can only illustrate these events at one point in time. Describes systems requirements necessary to run the simulations.…

  17. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  18. The Use of Model Matching Video Analysis and Computational Simulation to Study the Ankle Sprain Injury Mechanism

    Directory of Open Access Journals (Sweden)

    Daniel Tik-Pui Fong

    2012-10-01

    Full Text Available Lateral ankle sprains continue to be the most common injury sustained by athletes and create an annual healthcare burden of over $4 billion in the U.S. alone. Foot inversion is suspected in these cases, but the mechanism of injury remains unclear. While kinematics and kinetics data are crucial in understanding the injury mechanisms, ligament behaviour measures – such as ligament strains – are viewed as the potential causal factors of ankle sprains. This review article demonstrates a novel methodology that integrates model matching video analyses with computational simulations in order to investigate injury-producing events for a better understanding of such injury mechanisms. In particular, ankle joint kinematics from actual injury incidents were deduced by model matching video analyses and then input into a generic computational model based on rigid bone surfaces and deformable ligaments of the ankle so as to investigate the ligament strains that accompany these sprain injuries. These techniques may have the potential for guiding ankle sprain prevention strategies and targeted rehabilitation therapies.

  19. Analysis of Chlorine Gas Incident Simulation and Dispersion Within a Complex and Populated Urban Area Via Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    Eslam Kashi

    2015-04-01

    Full Text Available In some instances, it is inevitable that large amounts of potentially hazardous chemicals like chlorine gas are stored and used in facilities in densely populated areas. In such cases, all safety issues must be carefully considered. To reach this goal, it is important to have accurate information concerning chlorine gas behaviors and how it is dispersed in dense urban areas. Furthermore, maintaining adequate air movement and the ability to purge the ambient air of potentially toxic and dangerous chemicals like chlorine gas could be helpful. These are among the most important actions to be taken toward the improvement of safety in a big metropolis like Tehran. This paper investigates and analyzes chlorine gas leakage scenarios, including its dispersion and natural air ventilation effects on how it might be geographically spread in a city, using computational fluid dynamics (CFD). Simulations of possible hazardous events and solutions for preventing or reducing their probability are presented to gain a better insight into the incidents. These investigations are done by considering hypothetical scenarios which consist of chlorine gas leakages from pipelines or storage tanks under different conditions. These CFD simulation results are used to investigate and analyze chlorine gas behaviors, dispersion, distribution, accumulation, and other possible hazards by means of a simplified CAD model of an urban area near a water-treatment facility. Possible hazards as well as some prevention and post-incident solutions are also suggested.

  20. Computer simulations analysis for determining the polarity of charge generated by high energy electron irradiation of a thin film.

    Science.gov (United States)

    Malac, Marek; Hettler, Simon; Hayashida, Misa; Kawasaki, Masahiro; Konyuba, Yuji; Okura, Yoshi; Iijima, Hirofumi; Ishikawa, Isamu; Beleggia, Marco

    2017-09-01

    Detailed simulations are necessary to correctly interpret the charge polarity of an electron-beam-irradiated thin-film patch. Relying on systematic simulations, we provide guidelines and movies to interpret experimentally the polarity of the charged area, to be understood as the sign of the electrostatic potential developed under the beam with reference to a ground electrode. We discuss the two methods most frequently used to assess charge polarity: Fresnel imaging of the irradiated area and Thon rings analysis. We also briefly discuss parameter optimization for hole free phase plate (HFPP) imaging. Our results are particularly relevant to understanding the contrast of hole-free phase plate imaging and the Berriman effect. Copyright © 2017. Published by Elsevier Ltd.

  1. FEL Simulation Using Distributed Computing

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, Joshua [Fermilab; Bernabeu Altayo, Gerard [Fermilab; Biedron, Sandra [Ljubljana U.; Freund, Henry [Colorado State U., Fort Collins; Milton, Stephen [Colorado State U., Fort Collins; van der Slot, Peter [Colorado State U., Fort Collins

    2016-06-01

    While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing opens up new opportunities. This poster highlights a method of accelerating and parallelizing code processing through the use of COTS software interfaces.

  2. Micro-computer simulation software: A review

    Directory of Open Access Journals (Sweden)

    P.S. Kruger

    2003-12-01

    Full Text Available Simulation modelling has proved to be one of the most powerful tools available to the Operations Research Analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper will attempt to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis will be on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages. Such comparisons may be found elsewhere.

  3. Computer simulation in physics and engineering

    CERN Document Server

    Steinhauser, Martin Oliver

    2013-01-01

    This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. The work conveys both the theoretical foundations of computer simulation and the applications and "tricks of the trade" that are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.

  4. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I.

    1997-01-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements.

  5. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements. 5 refs., 11 figs.
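
    The central numerical step in these two records, superimposing the Langevin equation of motion on a precomputed velocity field so that Brownian motion, inertia and drag appear in a single equation, can be sketched as follows. The uniform velocity field and every physical parameter are placeholders; a real calculation would use the Navier-Stokes solution around the fiber array and test for capture at the fiber surfaces.

        import numpy as np

        kB, T = 1.38e-23, 293.0        # Boltzmann constant, temperature (K)
        m, tau = 1.0e-15, 1.0e-6       # particle mass (kg), relaxation time (s)
        dt = 1.0e-7
        rng = np.random.default_rng(0)

        def air_velocity(x):
            # Stand-in for the computed flow field through the fiber matrix.
            return np.array([0.1, 0.0])   # uniform 0.1 m/s flow

        x = np.array([0.0, 0.0])
        v = np.array([0.1, 0.0])
        for _ in range(10_000):
            u = air_velocity(x)
            # dv = (u - v) dt/tau + sqrt(2 kB T / (m tau)) dW
            # (drag toward the local air velocity plus Brownian forcing,
            #  consistent with the fluctuation-dissipation relation)
            v = v + (u - v) * dt / tau \
                + np.sqrt(2 * kB * T / (m * tau) * dt) * rng.standard_normal(2)
            x = x + v * dt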

  6. APU/Hydraulic/Actuator Subsystem Computer Simulation. Space Shuttle Engineering and Operation Support, Engineering Systems Analysis. [for the space shuttle]

    Science.gov (United States)

    1975-01-01

    Major developments are examined which have taken place to date in the analysis of the power and energy demands on the APU/Hydraulic/Actuator Subsystem for the space shuttle during the entry-to-touchdown (not including rollout) flight regime. These developments are given in the form of two subroutines which were written for use with the Space Shuttle Functional Simulator. The first subroutine calculates the power and energy demand on each of the three hydraulic systems due to control surface (inboard/outboard elevons, rudder, speedbrake, and body flap) activity. The second subroutine incorporates the R. I. priority rate limiting logic, which limits control surface deflection rates as a function of the number of failed hydraulic systems. Typical results of this analysis are included, and listings of the subroutines are presented in appendices.

  7. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    Many complex systems of importance to the U.S. Department of Energy consist of networks of discrete components. Examples are cyber networks, such as the internet and local area networks over which nearly all DOE scientific, technical and administrative data must travel, the electric power grid, social networks whose behavior can drive energy demand, and biological networks such as genetic regulatory networks and metabolic networks. In spite of the importance of these complex networked systems to all aspects of DOE's operations, the scientific basis for understanding these systems lags seriously behind the strong foundations that exist for the 'physically-based' systems usually associated with DOE research programs that focus on such areas as climate modeling, fusion energy, high-energy and nuclear physics, nano-science, combustion, and astrophysics. DOE has a clear opportunity to develop a similarly strong scientific basis for understanding the structure and dynamics of networked systems by supporting a strong basic research program in this area. Such knowledge will provide a broad basis for, e.g., understanding and quantifying the efficacy of new security approaches for computer networks, improving the design of computer or communication networks to be more robust against failures or attacks, detecting potential catastrophic failure on the power grid and preventing or mitigating its effects, understanding how populations will respond to the availability of new energy sources or changes in energy policy, and detecting subtle vulnerabilities in large software systems to intentional attack. This white paper outlines plans for an aggressive new research program designed to accelerate the advancement of the scientific basis for complex networked systems of importance to the DOE. It will focus principally on four research areas: (1) understanding network structure, (2) understanding network dynamics, (3) predictive modeling and simulation for complex

  8. Comparative analysis of cervical spine management in a subset of severe traumatic brain injury cases using computer simulation.

    Directory of Open Access Journals (Sweden)

    Kimbroe J Carter

    Full Text Available BACKGROUND: No randomized control trial to date has studied the use of cervical spine management strategies in cases of severe traumatic brain injury (TBI at risk for cervical spine instability solely due to damaged ligaments. A computer algorithm is used to decide between four cervical spine management strategies. A model assumption is that the emergency room evaluation shows no spinal deficit and a computerized tomogram of the cervical spine excludes the possibility of fracture of cervical vertebrae. The study's goal is to determine cervical spine management strategies that maximize brain injury functional survival while minimizing quadriplegia. METHODS/FINDINGS: The severity of TBI is categorized as unstable, high risk and stable based on intracranial hypertension, hypoxemia, hypotension, early ventilator associated pneumonia, admission Glasgow Coma Scale (GCS and age. Complications resulting from cervical spine management are simulated using three decision trees. Each case starts with an amount of primary and secondary brain injury and ends as a functional survivor, severely brain injured, quadriplegic or dead. Cervical spine instability is studied with one-way and two-way sensitivity analyses providing rankings of cervical spine management strategies for probabilities of management complications based on QALYs. Early collar removal received more QALYs than the alternative strategies in most arrangements of these comparisons. A limitation of the model is the absence of testing against an independent data set. CONCLUSIONS: When clinical logic and components of cervical spine management are systematically altered, changes that improve health outcomes are identified. In the absence of controlled clinical studies, the results of this comparative computer assessment show that early collar removal is preferred over a wide range of realistic inputs for this subset of traumatic brain injury. Future research is needed on identifying factors in

  9. Computer simulation analysis on EEVC pedestrian subsystem impact test. Evaluation of impact energy in upper legform test; EEVC hokosha hogo shikenhoan ni kansuru computer simulation kaiseki. Daitaibu shikenhoan de teiansareta shototsu energy no datosei ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Konosu, A.; Ishikawa, H. [Japan Automobile Research Institute Inc., Tsukuba (Japan)

    1999-11-01

    EEVC upper legform test conditions are determined exclusively from car-front shapes, bonnet leading edge height (LEH) and bumper lead (BL), without considering car-front stiffness. However, the car-front stiffness may affect the test conditions significantly. Furthermore, it seems that the EEVC test condition was obtained from computer simulation using a dummy-like pedestrian model instead of a human-like pedestrian model. Our computer simulation results indicated that car-front stiffness varied the impact energy by up to 300 J, and that the impact energy obtained using the dummy-like pedestrian model was 93 to 384 J higher than that obtained from the human-like pedestrian model. In order to evaluate vehicle safety performance in car-pedestrian accidents appropriately, the current EEVC impact energy curve of the upper legform test should be reconsidered. (author)

  10. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  11. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference on Computer Simulation in Information and Communication Engineering, CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and developments of the tools for computer simulation directly from their inventors. Contribution describ...

  12. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    This is supposed to recall gambling and hence the name Monte Carlo simulation. The procedure was developed by Stanislaw Ulam and John von Neumann. They used the simulation method to solve partial differential equations for diffusion of neutrons! (Box 2). We can illustrate the MC method by a simple example.

  13. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
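
    As a loosely related illustration of the paper's premise that deterministic chaos can stand in for a random number generator (this is not the authors' non-Lipschitz construction), the fully chaotic logistic map has the known invariant density 1/(pi*sqrt(x(1-x))), symmetric about x = 0.5, so thresholding its iterates reproduces a fair coin flip deterministically:

        # Coin flips from deterministic chaos: iterate x -> 4x(1-x) and
        # threshold at 0.5; no random number generator is involved.
        x = 0.123456789            # arbitrary seed in (0, 1)
        counts = {0: 0, 1: 0}
        for _ in range(100_000):
            x = 4.0 * x * (1.0 - x)
            counts[int(x > 0.5)] += 1
        print(counts)              # both outcomes occur about half the time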

  14. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  15. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  16. Computer Systems/Database Simulation.

    Science.gov (United States)

    1978-10-15

    defined distribution of inter-arrival times. Hence the process of model building and execution is considerably eased with the help of simulation languages ...the hands of only the data creator need not be forwarded to the data user. This removes both JCL and format difficulties from the user's domain. 3...emulators available on any machine for most source languages.) Lower level languages, such as Assembler or Macro-like code, will always be machine

  17. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  18. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them This introductory ""how to"" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  19. Direct Dynamic Kinetic Analysis and Computer Simulation of Growth of Clostridium perfringens in Cooked Turkey during Cooling.

    Science.gov (United States)

    Huang, Lihan; Vinyard, Bryan T

    2016-03-01

    This research applied a new 1-step methodology to directly construct a tertiary model that describes the growth of Clostridium perfringens in cooked turkey meat under dynamically cooling conditions. The kinetic parameters of the growth models were determined by numerical analysis and optimization using multiple dynamic growth curves. The models and kinetic parameters were validated using independent growth curves obtained under various cooling conditions. The results showed that the residual errors (ε) of the predictions followed a Laplace distribution that is symmetric with respect to ε = 0. For residual errors, 90.6% are within ±0.5 Log CFU/g and 73.4% are ±0.25 Log CFU/g for all growth curves used for validation. For relative growth growth of growth of C. perfringens. Monte Carlo simulation was used to estimate the probabilities of >1.0 and 2.0 Log CFU/g relative growth of C. perfringens in the final products at the end of cooling. This probabilistic process analysis approach provides a new alternative for estimating and managing the risk of a product and can help the food industry and regulatory agencies assess the safety of cooked meat in the event of cooling deviation. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  20. Computational Music Analysis

    DEFF Research Database (Denmark)

    in this intensely interdisciplinary field. A broad range of approaches are presented, employing techniques originating in disciplines such as linguistics, information theory, information retrieval, pattern recognition, machine learning, topology, algebra and signal processing. Many of the methods described draw...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  1. Dynamic determination of kinetic parameters, computer simulation, and probabilistic analysis of growth of Clostridium perfringens in cooked beef during cooling.

    Science.gov (United States)

    Huang, Lihan

    2015-02-16

    The objective of this research was to develop a new one-step methodology that uses a dynamic approach to directly construct a tertiary model for prediction of the growth of Clostridium perfringens in cooked beef. This methodology was based on simultaneous numerical analysis and optimization of both primary and secondary models using multiple dynamic growth curves obtained under different conditions. Once the models were constructed, the bootstrap method was used to calculate the 95% confidence intervals of kinetic parameters, and a Monte Carlo simulation method was developed to validate the models using the growth curves not previously used in model development. The results showed that the kinetic parameters obtained from this study accurately matched the common characteristics of C. perfringens, with the optimum temperature being 45.3°C. The results also showed that the predicted growth curves matched accurately with experimental observations used in validation. The mean of residuals of the predictions is -0.02logCFU/g, with a standard deviation of only 0.23logCFU/g. For relative growths 0.4logCFU/g, while only 1.5% are >0.8logCFU/g. In addition, the dynamic model also accurately predicted four isothermal growth curves arbitrarily chosen from the literature. Finally, the Monte Carlo simulation was used to provide the probability of >1 and 2logCFU/g relative growths at the end of cooling. The results of this study will provide a new and accurate tool to the food industry and regulatory agencies to assess the safety of cooked beef in the event of cooling deviation. Published by Elsevier B.V.

  2. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    cavity was simulated with a nonrigid, partially reflecting heavy gas (the rigid wall of 905.0021 was replaced with additional cells of ideal gas which...the shock tunnel at the 4.14-Mpa range found in calculation 906.1081. The driver consisted of 25 cells of burned ammonium nitrate and fuel oil ( ANFO ...mm AX = 250 mm Reflected Wave Geometry--Calculation 906.1091 65 m Driver Region Reaction Region Boundary Burned Rigid ANFO Real Air Reflecting k 90.6

  3. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  4. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  5. Water Quality Analysis Simulation

    Science.gov (United States)

    The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for variious pollution management decisions.

  6. A computer simulator for development of engineering system design methodologies

    Science.gov (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  7. Flow simulation and high performance computing

    Science.gov (United States)

    Tezduyar, T.; Aliabadi, S.; Behr, M.; Johnson, A.; Kalro, V.; Litke, M.

    1996-10-01

    Flow simulation is a computational tool for exploring science and technology involving flow applications. It can provide cost-effective alternatives or complements to laboratory experiments, field tests and prototyping. Flow simulation relies heavily on high performance computing (HPC). We view HPC as having two major components. One is advanced algorithms capable of accurately simulating complex, real-world problems. The other is advanced computer hardware and networking with sufficient power, memory and bandwidth to execute those simulations. While HPC enables flow simulation, flow simulation motivates development of novel HPC techniques. This paper focuses on demonstrating that flow simulation has come a long way and is being applied to many complex, real-world problems in different fields of engineering and applied sciences, particularly in aerospace engineering and applied fluid mechanics. Flow simulation has come a long way because HPC has come a long way. This paper also provides a brief review of some of the recently-developed HPC methods and tools that has played a major role in bringing flow simulation where it is today. A number of 3D flow simulations are presented in this paper as examples of the level of computational capability reached with recent HPC methods and hardware. These examples are, flow around a fighter aircraft, flow around two trains passing in a tunnel, large ram-air parachutes, flow over hydraulic structures, contaminant dispersion in a model subway station, airflow past an automobile, multiple spheres falling in a liquid-filled tube, and dynamics of a paratrooper jumping from a cargo aircraft.

  8. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  9. Computer Simulations of Lipid Nanoparticles

    Directory of Open Access Journals (Sweden)

    Xavier F. Fernandez-Luengo

    2017-12-01

    Full Text Available Lipid nanoparticles (LNP are promising soft matter nanomaterials for drug delivery applications. In spite of their interest, little is known about the supramolecular organization of the components of these self-assembled nanoparticles. Here, we present a molecular dynamics simulation study, employing the Martini coarse-grain forcefield, of self-assembled LNPs made by tripalmitin lipid in water. We also study the adsorption of Tween 20 surfactant as a protective layer on top of the LNP. We show that, at 310 K (the temperature of interest in biological applications, the structure of the lipid nanoparticles is similar to that of a liquid droplet, in which the lipids show no nanostructuration and have high mobility. We show that, for large enough nanoparticles, the hydrophilic headgroups develop an interior surface in the NP core that stores liquid water. The surfactant is shown to organize in an inhomogeneous way at the LNP surface, with patches with high surfactant concentrations and surface patches not covered by surfactant.

  10. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    Science.gov (United States)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for simulation of Particle Induced X-ray Emission and Rutherford Backscattering Spectra. The original program is redeveloped to a VIBA-Lab 3.0 in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines in a wider energy range than the original program. Our short-term plan is to introduce routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers in using PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained and in this way save a lot of expensive machine time.

  11. Computer simulations analysis for determining the polarity of charge generated by high energy electron irradiation of a thin film

    DEFF Research Database (Denmark)

    Malac, Marek; Hettler, Simon; Hayashida, Misa

    2017-01-01

    Detailed simulations are necessary to correctly interpret the charge polarity of electron beam irradiated thin film patch. Relying on systematic simulations we provide guidelines and movies to interpret experimentally the polarity of the charged area, to be understood as the sign of the electrost...

  12. Enabling Computational Technologies for Terascale Scientific Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.

  13. Computation simulation of the nonlinear response of suspension bridges

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long- span bridges presents one of the greatest challenges facing the earthquake engineering community The size of these structures, in terms of physical dimensions and number of main load bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general purpose finite element software often results in a computational model of such size that excessive computational effort is required for three dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has lead to the development of a special purpose software program for the nonlinear analysis of cable supported bridges and the methodologies and software are described and illustrated in this paper.

  14. Curved Beam Computed Tomography based Structural Rigidity Analysis of Bones with Simulated Lytic Defect: A Comparative Study with Finite Element Analysis

    NARCIS (Netherlands)

    Oftadeh, R.; Karimi, Z.; Villa-Camacho, J.; Tanck, E.; Verdonschot, Nicolaas Jacobus Joseph; Goebel, R.; Snyder, B.D.; Hashemi, H.N.; Vaziri, A.; Nazarian, A.

    2016-01-01

    In this paper, a CT based structural rigidity analysis (CTRA) method that incorporates bone intrinsic local curvature is introduced to assess the compressive failure load of human femur with simulated lytic defects. The proposed CTRA is based on a three dimensional curved beam theory to obtain

  15. Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2011-01-01

    Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

  16. Electric Propulsion Plume Simulations Using Parallel Computer

    Directory of Open Access Journals (Sweden)

    Joseph Wang

    2007-01-01

    Full Text Available A parallel, three-dimensional electrostatic PIC code is developed for large-scale electric propulsion simulations using parallel supercomputers. This code uses a newly developed immersed-finite-element particle-in-cell (IFE-PIC algorithm designed to handle complex boundary conditions accurately while maintaining the computational speed of the standard PIC code. Domain decomposition is used in both field solve and particle push to divide the computation among processors. Two simulations studies are presented to demonstrate the capability of the code. The first is a full particle simulation of near-thruster plume using real ion to electron mass ratio. The second is a high-resolution simulation of multiple ion thruster plume interactions for a realistic spacecraft using a domain enclosing the entire solar array panel. Performance benchmarks show that the IFE-PIC achieves a high parallel efficiency of ≥ 90%

  17. Time reversibility, computer simulation, and chaos

    CERN Document Server

    Hoover, William Graham

    1999-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful

  18. GATE Monte Carlo simulation in a cloud computing environment

    Science.gov (United States)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing services. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53 minute long simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high power computing continuing to lower in price and accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.

  19. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statisti......This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces...... statistical power analysis as a way to determine the appropriate number of runs. Two examples are then produced using results from an agent-based model. The reader is then guided through the application of this statistical technique and exposed to its limits and potentials....

  20. Perspective: Computer simulations of long time dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-02-14

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

  1. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    Science.gov (United States)

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  2. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

    Science.gov (United States)

    Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)

    2003-01-01

    Advanced high-temperature Ceramic Matrix Composites (CMC) hold an enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost and time prohibitive. Computational simulation then becomes very useful to limit the experimental effort and reduce the design cycle time, Authors have been involved for over a decade in developing micromechanics- based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior including quantification of scatter that these materials exhibit. A brief summary/capability of these computer codes with typical examples along with their use in design/analysis of certain structural components is the subject matter of this presentation.

  3. Quantitative computer simulations of extraterrestrial processing operations

    Science.gov (United States)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  4. Computer simulation of proton channelling in silicon

    Indian Academy of Sciences (India)

    2000-06-12

    Jun 12, 2000 ... Computer simulation of proton channelling in silicon. N K DEEPAK, K RAJASEKHARAN* and K NEELAKANDAN. Department of Physics, University of Calicut, Malappuram 673 635, India. *. Department of Physics, Malabar Christian College, Kozhikode 673 001, India. MS received 11 October 1999; revised ...

  5. Computer simulations of phospholipid - membrane thermodynamic fluctuations

    DEFF Research Database (Denmark)

    Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.

    2008-01-01

    This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...

  6. Spiking network simulation code for petascale computers

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  7. Spiking network simulation code for petascale computers.

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.

  8. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. High-performance computing MRI simulations.

    Science.gov (United States)

    Stöcker, Tony; Vahedipour, Kaveh; Pflugfelder, Daniel; Shah, N Jon

    2010-07-01

    A new open-source software project is presented, JEMRIS, the Jülich Extensible MRI Simulator, which provides an MRI sequence development and simulation environment for the MRI community. The development was driven by the desire to achieve generality of simulated three-dimensional MRI experiments reflecting modern MRI systems hardware. The accompanying computational burden is overcome by means of parallel computing. Many aspects are covered that have not hitherto been simultaneously investigated in general MRI simulations such as parallel transmit and receive, important off-resonance effects, nonlinear gradients, and arbitrary spatiotemporal parameter variations at different levels. The latter can be used to simulate various types of motion, for instance. The JEMRIS user interface is very simple to use, but nevertheless it presents few limitations. MRI sequences with arbitrary waveforms and complex interdependent modules are modeled in a graphical user interface-based environment requiring no further programming. This manuscript describes the concepts, methods, and performance of the software. Examples of novel simulation results in active fields of MRI research are given. (c) 2010 Wiley-Liss, Inc.

  10. Improvements in Thermal Performance of Mango Hot-water Treatment Equipments: Data Analysis, Mathematical Modelling and Numerical-computational Simulation

    OpenAIRE

    Mendoza Orbegoso, Elder M.; Paul Villar-Yacila; Daniel Marcelo; Justo Oquelis

    2017-01-01

    Mango is one of the most popular and best paid tropical fruits in worldwide markets, its exportation is regulated within a phytosanitary quality control for killing the “fruit fly”. Thus, mangoes must be subject to hot-water treatment process that involves their immersion in hot water over a period of time. In this work, field measurements, analytical and simulation studies are developed on available hot-water treatment equipment called “Original” that only complies wi...

  11. Development of a Network Analysis of the Air Force Provisioning Process for an Applied Computer Simulation Exercise.

    Science.gov (United States)

    1984-09-01

    appearance of readiness to act in reduction of these needs, and they can PROVIDE AN ADEQUATE SETTING, as well as the means, for an Immediate translation ...first was Oacquisition and comprehension of knowlodge .’ Here, he stated that simulation-games were probably too expensive and time consuming compared...questions were then translated into four views: anti-union, pro-employer, pro- union, and anti-employer (32:383). Out of the 16 paired comparisons, the

  12. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computer and advanced software keep pace with handling equipment to reach new heights of sophistication with graphic simulation able to show precisely what is and could happen in the coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, called the Petracalco terminal. Here coal is unloaded, stored and fed to the nearby power plant of Pdte Plutarco Elias Calles. The R & D department of the Italian company Techint, Italimpianti has developed MHATIS, a sophisticated software system for marine terminal management here, allowing analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice and likely power station demand can be predicted. The design and operation of the MHATIS system is explained. Other integrated coal handling plants described in the article are that developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  13. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  14. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes Two distinguishing features of the discourse are solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty Matlab codes are presented and discussed for a broad...

  15. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  16. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summaryProgram title: QDENSITY 2.0 Catalogue identifier: ADXH_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 26 055 No. of bytes in distributed program, including test data, etc.: 227 540 Distribution format: tar.gz Programming language: Mathematica 6.0 Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4 Catalogue identifier of previous version: ADXH_v1_0 Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914 Classification: 4.15 Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed. Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0 Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0 Running time: Most examples

  17. Computer Simulation for Emergency Incident Management

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  18. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  19. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data....... The simulation sheds light on issues that are not amenable to analytical solutions, such as the spectral content of the wave forms, cross talk in three-beam interaction, and the range of applications of the band-transport model. (C) 1998 Optical Society of America....

  20. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    Science.gov (United States)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the person and wealth distributions. We show that the newly introduced wealth distribution - that may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the person and wealth distributions and find that the presence of trap agents alters their amplitude, leaving however the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on saving propensities. The system relaxation for fixed and distributed saving schemes are found to be different.

  1. Integrated Computational Tools for Identification of CCR5 Antagonists as Potential HIV-1 Entry Inhibitors: Homology Modeling, Virtual Screening, Molecular Dynamics Simulations and 3D QSAR Analysis

    Directory of Open Access Journals (Sweden)

    Suri Moonsamy

    2014-04-01

    Full Text Available Using integrated in-silico computational techniques, including homology modeling, structure-based and pharmacophore-based virtual screening, molecular dynamic simulations, per-residue energy decomposition analysis and atom-based 3D-QSAR analysis, we proposed ten novel compounds as potential CCR5-dependent HIV-1 entry inhibitors. Via validated docking calculations, binding free energies revealed that novel leads demonstrated better binding affinities with CCR5 compared to maraviroc, an FDA-approved HIV-1 entry inhibitor and in clinical use. Per-residue interaction energy decomposition analysis on the averaged MD structure showed that hydrophobic active residues Trp86, Tyr89 and Tyr108 contributed the most to inhibitor binding. The validated 3D-QSAR model showed a high cross-validated rcv2 value of 0.84 using three principal components and non-cross-validated r2 value of 0.941. It was also revealed that almost all compounds in the test set and training set yielded a good predicted value. Information gained from this study could shed light on the activity of a new series of lead compounds as potential HIV entry inhibitors and serve as a powerful tool in the drug design and development machinery.

  2. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  3. Computer simulation of molecular sorption in zeolites

    CERN Document Server

    Calmiano, M D

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows in which an introduction to sorption in zeolites is presented, with discussion of structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low energy sorption sites of krypton in silicalite where we observe krypton to preferentially sorb into straight and sinusoidal channels over channel intersections. We simulate single step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain division coefficients and the activation energy. We compare our results to previous experimental and computat...

  4. Understanding membrane fouling mechanisms through computational simulations

    Science.gov (United States)

    Xiang, Yuan

    This dissertation focuses on a computational simulation study on the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which have been widely used in industry for water purification. The research shows that through establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of membrane fouling mechanism. This knowledge is critical for providing a strategic plan for membrane experimental community and RO/NF industry for further improvements in membrane technology for water treatment. This dissertation focuses on three major research components (1) Development of the realistic molecular models, which could well represent the membrane surface properties; (2) Investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) Studies of the interactions between the surface-modified membranes (polyethylene glycol) to provide strategies for antifouling.

  5. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia. In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated relevant feature extractions. Finally, we intend to share preliminary computer simulation issues.

  6. Direct dynamic kinetic analysis and computer simulation of growth of Clostridium perfringens in cooked turkey during cooling

    Science.gov (United States)

    This research applied a new one-step methodology to directly construct a tertiary model for describing the growth of C. perfringens in cooked turkey meat under dynamically cooling conditions. The kinetic parameters of the growth models were determined by numerical analysis and optimization using mu...

  7. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Qian, Penny Xuran; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automatize the farming of large and extensive simulation processes. Our method is light-weight, it fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field and running simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searchers in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data reduction.

  8. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2017-01-01

    This book provides an accessible introduction to the basic theory of fluid mechanics and computational fluid dynamics (CFD) from a modern perspective that unifies theory and numerical computation. Methods of scientific computing are introduced alongside with theoretical analysis and MATLAB® codes are presented and discussed for a broad range of topics: from interfacial shapes in hydrostatics, to vortex dynamics, to viscous flow, to turbulent flow, to panel methods for flow past airfoils. The third edition includes new topics, additional examples, solved and unsolved problems, and revised images. It adds more computational algorithms and MATLAB programs. It also incorporates discussion of the latest version of the fluid dynamics software library FDLIB, which is freely available online. FDLIB offers an extensive range of computer codes that demonstrate the implementation of elementary and advanced algorithms and provide an invaluable resource for research, teaching, classroom instruction, and self-study. This ...

  9. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NYIDIDA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and; (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx.10% network overhead.

  10. Computer vulnerability risk analysis.

    OpenAIRE

    2008-01-01

    The discussions presented in this dissertation have been undertaken in answer to the need for securing the intellectual assets stored on computer systems. Computer vulnerabilities and their influence on computer systems and the intellectual assets they possess are the main focus of this research. In an effort to portray the influence of vulnerabilities on a computer system, a method for assigning a measure of risk to individual vulnerabilities is proposed. This measure of risk, in turn, gives...
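
    The abstract does not reveal the dissertation's actual risk measure. Purely as an illustration of the idea of scoring individual vulnerabilities, a common convention multiplies exploitation likelihood by impact; the names and values below are hypothetical:

        def vulnerability_risk(likelihood, impact):
            """Score one vulnerability; both inputs on a 0-1 scale (illustrative only)."""
            if not (0.0 <= likelihood <= 1.0 and 0.0 <= impact <= 1.0):
                raise ValueError("inputs must lie in [0, 1]")
            return likelihood * impact

        # Rank a system's vulnerabilities by descending risk (hypothetical entries).
        vulns = {"weak-password-policy": (0.8, 0.6), "unpatched-service": (0.5, 0.9)}
        ranked = sorted(vulns, key=lambda v: vulnerability_risk(*vulns[v]), reverse=True)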

  11. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  12. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...
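
    Arena hides its event calendar behind a graphical model builder, but the discrete-event mechanism underneath can be sketched independently of Arena: a time-ordered event list processed in a loop, here for a single-server queue (an illustrative Python sketch, not Arena's own language):

        import heapq
        import random

        def mm1_queue(arrival_rate, service_rate, horizon):
            """Single-server queue; returns the number of customers served."""
            calendar = [(random.expovariate(arrival_rate), "arrival")]  # event list
            queue_len, busy, served, now = 0, False, 0, 0.0
            while calendar:
                now, kind = heapq.heappop(calendar)      # next event in time order
                if now > horizon:
                    break
                if kind == "arrival":
                    heapq.heappush(calendar,
                                   (now + random.expovariate(arrival_rate), "arrival"))
                    if busy:
                        queue_len += 1                   # wait in line
                    else:
                        busy = True                      # begin service immediately
                        heapq.heappush(calendar,
                                       (now + random.expovariate(service_rate), "departure"))
                else:                                    # departure
                    served += 1
                    if queue_len > 0:
                        queue_len -= 1
                        heapq.heappush(calendar,
                                       (now + random.expovariate(service_rate), "departure"))
                    else:
                        busy = False
            return served

        print(mm1_queue(1.0, 1.2, 10_000.0))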

  13. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    Science.gov (United States)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  14. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.
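
    For reference, the iterative method named above can be sketched in a few lines. This is generic conjugate gradient for a symmetric positive-definite system, written here in Python/NumPy rather than the project's C implementation for the MUSIC machine:

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Solve Ax = b for symmetric positive-definite A."""
            x = np.zeros_like(b, dtype=float)
            r = b - A @ x                        # residual
            p = r.copy()                         # search direction
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs_old / (p @ Ap)        # optimal step along p
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs_old) * p    # conjugate direction update
                rs_old = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD example
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))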

  15. Computer simulation of electrokinetics in colloidal systems

    Science.gov (United States)

    Schmitz, R.; Starchenko, V.; Dünweg, B.

    2013-11-01

    The contribution gives a brief overview outlining how our theoretical understanding of the phenomenon of colloidal electrophoresis has improved over the decades. Particular emphasis is put on numerical calculations and computer simulation models, which have become more and more important as the level of description became more detailed and refined. Due to computational limitations, it has so far not been possible to study "perfect" models. Different complementary models have hence been developed, and their various strengths and deficiencies are briefly discussed. This is contrasted with the experimental situation, where there are still observations waiting for theoretical explanation. The contribution then outlines our recent development of a numerical method to solve the electrokinetic equations for a finite volume in three dimensions, and describes some new results that could be obtained by the approach.

  16. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for the estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language (VRML).

  17. Computer Simulations of Intrinsically Disordered Proteins

    Science.gov (United States)

    Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun

    2017-05-01

    The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.

  18. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  19. Multiscale Computer Simulation of Failure in Aerogels

    Science.gov (United States)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
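
    A bare-bones version of the DLCA growth rule, clusters random-walking and sticking irreversibly on contact, can be sketched as follows. This is a 2D sketch with uniform cluster mobility for brevity; the paper's model is three-dimensional and far more configurable (real DLCA typically uses mass-dependent diffusion coefficients):

        import random

        NEIGH = ((1, 0), (-1, 0), (0, 1), (0, -1))

        def dlca(size=32, n_particles=120, seed=1, max_steps=20000):
            """Clusters random-walk on a periodic lattice and merge on contact."""
            random.seed(seed)
            sites = random.sample([(x, y) for x in range(size) for y in range(size)],
                                  n_particles)
            clusters = [{s} for s in sites]
            for _ in range(max_steps):
                if len(clusters) == 1:
                    break                                  # fully aggregated
                c = clusters.pop(random.randrange(len(clusters)))
                dx, dy = random.choice(NEIGH)
                moved = {((x + dx) % size, (y + dy) % size) for x, y in c}
                blocked = next((o for o in clusters if o & moved), None)
                if blocked is not None:
                    blocked |= c                           # contact: stick in place
                    continue
                c = moved
                touching = next((o for o in clusters
                                 if any(((x + ex) % size, (y + ey) % size) in o
                                        for x, y in c for ex, ey in NEIGH)), None)
                if touching is not None:
                    touching |= c                          # irreversible aggregation
                else:
                    clusters.append(c)
            return clusters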

  20. Computer simulation of arcuate keratotomy for astigmatism.

    Science.gov (United States)

    Hanna, K D; Jouve, F E; Waring, G O; Ciarlet, P G

    1992-01-01

    The development of refractive corneal surgery involves numerous attempts to isolate the effect of individual factors on surgical outcome. Computer simulation of refractive keratotomy allows the surgeon to alter variables of the technique and to isolate the effect of specific factors independent of other factors, something that cannot easily be done in any of the currently available experimental models. We used the finite element numerical method to construct a mathematical model of the eye. The model analyzed stress-strain relationships in the normal corneoscleral shell and after astigmatic surgery. The model made the following assumptions: an axisymmetric eye, an idealized aspheric anterior corneal surface, transversal isotropy of the cornea, nonlinear strain tensor for large displacements, and near incompressibility of the corneoscleral shell. The eye was assumed to be fixed at the level of the optic nerve. The model described the acute elastic response of the eye to corneal surgery. We analyzed the effect of paired transverse arcuate corneal incisions for the correction of astigmatism. We evaluated the following incision variables and their effect on change in curvature of the incised and unincised meridians: length (longer, more steepening of unincised meridian), distance from the center of the cornea (farther, less flattening of incised meridian), depth (deeper, more effect), and the initial amount of astigmatism (small effect). Our finite element computer model gives reasonably accurate information about the relative effects of different surgical variables, and demonstrates the feasibility of using nonlinear, anisotropic assumptions in the construction of such a computer model. Comparison of these computer-generated results to clinically achieved results may help refine the computer model.

  1. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  2. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  3. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
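
    The forward-model idea behind such a framework, propagating a known ground truth through the optics and the photon statistics before comparing with data, can be illustrated with a toy example. This is a sketch of the general approach, not the paper's actual API; parameter values are invented:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)

        def synthetic_image(density, psf_sigma=2.0, photons_per_unit=50.0, background=5.0):
            """Blur a fluorophore density map with a Gaussian PSF, add shot noise."""
            optical = gaussian_filter(density, psf_sigma)       # optics: PSF blur
            expected = photons_per_unit * optical + background  # expected photon counts
            return rng.poisson(expected)                        # Poisson shot noise

        truth = np.zeros((64, 64))
        truth[20, 20] = truth[40, 45] = 1.0                     # two point emitters
        image = synthetic_image(truth)                          # photon-counting units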

  4. Computers in engineering 1983; Proceedings of the International Conference and Exhibit, Chicago, IL, August 7-11, 1983. Volume 1 - Computer-aided design, manufacturing, and simulation

    Science.gov (United States)

    Cokonis, T. J.

    The papers presented in this volume provide examples of the impact of computers on present engineering practice and indicate some future trends in computer-aided design, manufacturing, and simulation. Topics discussed include computer-aided design of turbine cycle configuration, managing and development of engineering computer systems, computer-aided manufacturing with robots in the automotive industry, and computer-aided design/analysis techniques of composite materials in the cure phase. Papers are also presented on computer simulation of vehicular propulsion systems, the performance of a hydraulic system simulator in a CAD environment, and computer simulation of hovercraft heave dynamics and control.

  5. Computer Simulation of Developmental Processes and ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with a native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  6. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predictions ...

  7. Computer simulation of fatigue under diametrical compression.

    Science.gov (United States)

    Carmona, H A; Kun, F; Andrade, J S; Herrmann, H J

    2007-04-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macro- and microlevels, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.

  8. Investigating European genetic history through computer simulations.

    Science.gov (United States)

    Currat, Mathias; Silva, Nuno M

    2013-01-01

    The genetic diversity of Europeans has been shaped by various evolutionary forces including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones reassess upward the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This finding of substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.

  9. Computer Simulation of the UMER Gridded Gun

    CERN Document Server

    Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun

    2005-01-01

    The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...

  10. Computer simulation of carburizers particles heating in liquid metal

    Directory of Open Access Journals (Sweden)

    K. Janerka

    2010-01-01

    Full Text Available This article introduces the problems of computer simulation of the heating of carburizer particles (anthracite, graphite and petroleum coke) present in liquid metal. The diameter of the particles, their quantity, the relative velocity of the particles and the liquid metal, and the thermophysical properties of the materials (thermal conductivity, specific heat and thermal diffusivity) have been taken into account in the calculations. The analysis has been carried out in the context of liquid metal carburization in metallurgical furnaces.
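
    For small particles, heating of this kind is often estimated with a lumped-capacitance model, in which the particle temperature relaxes exponentially toward the melt temperature with a time constant set by exactly the quantities listed above (size, thermophysical properties, and, through the heat transfer coefficient, the relative particle-melt velocity). A sketch with illustrative property values not taken from the article, valid when the Biot number is small:

        import math

        def particle_temperature(t, d, h, rho, cp, T0, T_metal):
            """Temperature after time t [s] of a sphere of diameter d [m].

            h [W/m2K] is the convective coefficient (depends on relative velocity);
            rho [kg/m3] and cp [J/kgK] are the particle's density and specific heat.
            """
            r = d / 2.0
            volume = (4.0 / 3.0) * math.pi * r**3
            area = 4.0 * math.pi * r**2
            tau = rho * cp * volume / (h * area)     # thermal time constant
            return T_metal + (T0 - T_metal) * math.exp(-t / tau)

        # e.g. a 1 mm graphite-like particle dropped into iron at 1450 degC
        # (all numbers illustrative):
        T = particle_temperature(t=0.5, d=1e-3, h=5000.0, rho=2200.0,
                                 cp=710.0, T0=25.0, T_metal=1450.0)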

  11. Symplectic molecular dynamics simulations on specially designed parallel computers.

    Science.gov (United States)

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that fewer time steps are needed and each step requires less time, enabling fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
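
    The SISM itself splits the Hamiltonian and treats the high-frequency part analytically; the simplest member of the same symplectic family, velocity Verlet, already exhibits the key property that makes such integrators attractive for MD, namely bounded long-time energy error. A self-contained sketch on a unit harmonic oscillator:

        def velocity_verlet(x, v, dt, steps, force=lambda x: -x):
            """Symplectic integration of one degree of freedom (unit mass)."""
            f = force(x)
            for _ in range(steps):
                v_half = v + 0.5 * dt * f     # half kick
                x = x + dt * v_half           # drift
                f = force(x)
                v = v_half + 0.5 * dt * f     # half kick
            return x, v

        x, v = velocity_verlet(x=1.0, v=0.0, dt=0.05, steps=10_000)
        energy = 0.5 * v**2 + 0.5 * x**2      # oscillates near 0.5, never drifts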

  12. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Full Text Available Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper presents the results of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside a trade stand and follows the fire's expansion over three groups of compartments, highlighting the heat transfer both within small spaces and over large distances. In order to confirm the accuracy of the simulation, the obtained values are compared with those from the specialized literature.

  13. Computational electronics semiclassical and quantum device modeling and simulation

    CERN Document Server

    Vasileska, Dragica; Klimeck, Gerhard

    2010-01-01

    Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of

  14. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
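
    One standard building block of discretization error estimation, though not necessarily this report's specific methodology, is Richardson extrapolation from solutions on successively refined grids. A sketch with illustrative grid values:

        import math

        def observed_order(f_coarse, f_medium, f_fine, refinement=2.0):
            """Estimate the convergence order p from solutions on three grids."""
            return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine),
                            refinement)

        def discretization_error(f_medium, f_fine, p, refinement=2.0):
            """Estimated discretization error of the fine-grid solution."""
            return (f_fine - f_medium) / (refinement**p - 1.0)

        # Illustrative values converging toward ~1.0 at second order:
        p = observed_order(0.9700, 0.9925, 0.9981)       # p is approximately 2
        err = discretization_error(0.9925, 0.9981, p)    # fine-grid error estimate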

  15. Are We Sims? How Computer Simulations Represent and What This Means for the Simulation Argument

    OpenAIRE

    Beisbart, Claus

    2017-01-01

    N. Bostrom's simulation argument and two additional assumptions imply that we likely live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and materia...

  16. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
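
    Ordering components by their inlet dependency, as described above, is a topological sort. A minimal sketch with hypothetical component names, not PCTAP's actual data structures:

        from collections import deque

        def build_solution_vector(inlets):
            """inlets maps each component to the components feeding its inlet."""
            remaining = {c: set(deps) for c, deps in inlets.items()}
            ready = deque(c for c, deps in remaining.items() if not deps)
            order = []
            while ready:
                c = ready.popleft()
                order.append(c)                      # all of c's feeders are done
                for other, deps in remaining.items():
                    if c in deps:
                        deps.remove(c)
                        if not deps:
                            ready.append(other)
            if len(order) != len(inlets):
                raise ValueError("cyclic dependency: a flow loop needs special handling")
            return order

        # tube1 feeds the cold plate, which feeds the heat exchanger:
        print(build_solution_vector({"tube1": [], "coldplate": ["tube1"],
                                     "hx": ["coldplate"]}))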

  17. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launch of the Next Linear Collider drawing closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of the e{sup +}e{sup -} annihilation process at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create

  18. Analysis of a Model for Computer Virus Transmission

    Directory of Open Access Journals (Sweden)

    Peng Qin

    2015-01-01

    Full Text Available Computer viruses remain a significant threat to computer networks. In this paper, the incorporation of new computers into the network and the removal of old computers from the network are considered. Meanwhile, the computers on the network are assumed to be equipped with antivirus software. A computer virus model is established. Through the analysis of the model, disease-free and endemic equilibrium points are calculated. The stability conditions of the equilibria are derived. To illustrate our theoretical analysis, some numerical simulations are also included. The results provide a theoretical basis for controlling the spread of computer viruses.
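
    The abstract does not give the model's equations; the following SIS-type system with node inflow b and removal rate mu is a standard stand-in for this class of computer virus models and shows how such equilibria are explored numerically (all parameter values illustrative):

        # S' = b - beta*S*I + gamma*I - mu*S ;  I' = beta*S*I - gamma*I - mu*I
        import numpy as np
        from scipy.integrate import odeint

        def virus_model(y, t, b, beta, gamma, mu):
            S, I = y                               # susceptible / infected fractions
            dS = b - beta * S * I + gamma * I - mu * S
            dI = beta * S * I - gamma * I - mu * I
            return [dS, dI]

        t = np.linspace(0.0, 200.0, 1000)
        sol = odeint(virus_model, [0.99, 0.01], t, args=(0.02, 0.5, 0.1, 0.02))
        # Basic reproduction number R0 = beta*(b/mu)/(gamma + mu); here R0 > 1,
        # so the trajectory settles at the endemic (virus-persisting) equilibrium.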

  19. Computer simulation of amorphous MIS solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Shousha, A.H.M.; El-Kosheiry, M.A. [Cairo University (Egypt). Electronics and Communications Engineering Dept.

    1997-10-01

    A computer model to simulate amorphous MIS solar cells is developed. The model is based on the self-consistent solution of the electron and hole continuity equations, together with the Poisson equation under proper boundary conditions. The program developed is used to investigate the cell performance characteristics in terms of its physical and structural parameters. The current-voltage characteristics of the solar cell are obtained under AM1 solar illumination. The dependences of the short-circuit current, open-circuit voltage, fill factor and cell conversion efficiency on localized gap state density, carrier lifetime, cell thickness and surface recombination velocity are obtained and discussed. The results presented show how cell parameters can be varied to improve the cell performance characteristics. (Author)

  20. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configurations, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
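
    In the three-phase method, interior illuminance is computed as the matrix chain i = V T D s (view matrix, fenestration transmission matrix, daylight matrix, sky vector). A NumPy sketch with made-up matrix dimensions shows why this chain dominates the run time and why even its association order matters before any GPU parallelization:

        import numpy as np

        rng = np.random.default_rng(0)
        V = rng.random((1000, 145))     # sensor points x window patches (made up)
        T = rng.random((145, 145))      # window transmission (BSDF) matrix
        D = rng.random((145, 2306))     # window patches x sky patches (made up)
        s = rng.random(2306)            # sky patch radiances, one time step

        i_slow = (V @ T @ D) @ s        # forms a large 1000 x 2306 intermediate
        i_fast = V @ (T @ (D @ s))      # keeps every intermediate vector-sized
        assert np.allclose(i_slow, i_fast)

    For an annual run the product V T D is instead typically formed once and multiplied into a whole matrix of sky vectors, and it is presumably this bulk multiplication that benefits most from the OpenCL parallelization described above.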

  1. Towards A Novel Environment For Simulation Of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Joanna Patrzyk

    2015-01-01

    Full Text Available In this paper we analyze existing quantum computer simulation techniques and their realizations to minimize the impact of the exponential complexity of simulated quantum computations. As a result of this investigation, we propose a quantum computer simulator with an integrated development environment - QuIDE - supporting the development of algorithms for future quantum computers. The simulator simplifies building and testing quantum circuits and helps users understand quantum algorithms in an efficient way. The development environment provides flexibility of source code editing and ease of graphical building of circuit diagrams. We also describe and analyze the complexity of the algorithms used for simulation and present performance results of the simulator, as well as results of its deployment during university classes.
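
    The heart of any state-vector simulator of this kind is a complex amplitude vector of length 2^n, which is precisely the exponential complexity discussed above. A minimal sketch, not QuIDE's actual code:

        import numpy as np

        def apply_gate(state, gate, target, n_qubits):
            """Apply a single-qubit gate (2x2 unitary) to qubit `target`."""
            full = np.array([[1.0]])
            for q in range(n_qubits):               # build the 2^n x 2^n operator;
                full = np.kron(full, gate if q == target else np.eye(2))
            return full @ state                     # exponential cost made explicit

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        n = 3
        state = np.zeros(2**n, dtype=complex)
        state[0] = 1.0                              # |000>
        for q in range(n):
            state = apply_gate(state, H, q, n)      # uniform superposition
        probabilities = np.abs(state) ** 2          # each of 8 outcomes: 1/8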

  2. Associative Memory Computing Power and Its Simulation

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...

  3. Associative Memory computing power and its simulation

    CERN Document Server

    Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. The algorithm performance is limited due to the lack of parallelism, and in addition the memory requirement is very large. In fact, the AM chip uses a content addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...

  4. Computer simulations of the mouse spermatogenic cycle

    Directory of Open Access Journals (Sweden)

    Debjit Ray

    2014-12-01

    Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

  5. Computer-aided Instructional System for Transmission Line Simulation.

    Science.gov (United States)

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  6. [Thoughts on and probes into computer simulation of acupuncture manipulation].

    Science.gov (United States)

    Hu, Yin'e; Liu, Tangyi; Tang, Wenchao; Xu, Gang; Gao, Ming; Yang, Huayuan

    2011-08-01

    The studies of the simulation of acupuncture manipulation mainly focus on mechanical simulation and virtual simulation. In mechanical simulation, the aim of the research is to develop instruments that simulate acupuncture manipulation and to apply them as a simulation of, or replacement for, manual manipulation; virtual simulation, in contrast, applies virtual reality technology to present the manipulation in real-time 3D on the computer screen. This paper summarizes the recent research progress on computer simulation of acupuncture manipulation at home and abroad, and concludes with the significance of, and the open problems in, the computer simulation of acupuncture manipulation. We therefore put forward that research on manipulation simulation should pay close attention to the simulation of experts' manipulations, as well as to verification studies of conformity and clinical effects.

  7. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  8. Computer analysis of Holter electrocardiogram.

    Science.gov (United States)

    Yanaga, T; Adachi, M; Sato, Y; Ichimaru, Y; Otsuka, K

    1994-10-01

    Computer analysis is indispensable for the interpretation of the Holter ECG, because it includes a large quantity of data. Computer analysis of the Holter ECG is similar to that of the conventional ECG; however, there are some difficulties, such as noise, limited analysis time and voluminous data. The main topics in computer analysis of the Holter ECG are arrhythmias, ST-T changes, heart rate variability, the QT interval, late potentials and the construction of databases. Although many papers have been published on the computer analysis of the Holter ECG, some of them are reviewed briefly in the present paper. We have studied the computer analysis of VPCs, ST-T changes, heart rate variability, the QT interval and Cheyne-Stokes respiration during 24-hour ambulatory ECG monitoring. Further, we have studied ambulatory palmar sweating for the evaluation of mental stress during the day. In the future, the development of "the integrated Holter system", which enables the evaluation of ventricular vulnerability and modulating factors such as psychoneural hypersensitivity, may be important.
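
    As one concrete example of the heart rate variability topic listed above, the standard time-domain statistics can be computed from a series of RR intervals in a few lines. These are the generic textbook definitions, not the authors' specific pipeline:

        import numpy as np

        def hrv_time_domain(rr_ms):
            """Time-domain HRV statistics from RR intervals in milliseconds."""
            rr = np.asarray(rr_ms, dtype=float)
            sdnn = rr.std(ddof=1)                          # overall variability
            diffs = np.diff(rr)
            rmssd = np.sqrt(np.mean(diffs**2))             # beat-to-beat variability
            pnn50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # % successive diffs > 50 ms
            return {"SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50}

        print(hrv_time_domain([812, 830, 795, 860, 841, 818, 803]))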

  9. Stochastic analysis for finance with simulations

    CERN Document Server

    Choe, Geon Ho

    2016-01-01

    This book is an introduction to stochastic analysis and quantitative finance; it includes both theoretical and computational methods. Topics covered are stochastic calculus, option pricing, optimal portfolio investment, and interest rate models. Also included are simulations of stochastic phenomena, numerical solutions of the Black–Scholes–Merton equation, Monte Carlo methods, and time series. Basic measure theory is used as a tool to describe probabilistic phenomena. The level of familiarity with computer programming is kept to a minimum. To make the book accessible to a wider audience, some background mathematical facts are included in the first part of the book and also in the appendices. This work attempts to bridge the gap between mathematics and finance by using diagrams, graphs and simulations in addition to rigorous theoretical exposition. Simulations are not only used as the computational method in quantitative finance, but they can also facilitate an intuitive and deeper understanding of theoret...
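
    The flavor of simulation the book pairs with theory can be illustrated by the classic example of Monte Carlo pricing of a European call under geometric Brownian motion, checked against the closed-form Black-Scholes value (parameter values illustrative):

        import numpy as np
        from scipy.stats import norm

        def bs_call(S0, K, r, sigma, T):
            """Closed-form Black-Scholes price of a European call."""
            d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
            d2 = d1 - sigma * np.sqrt(T)
            return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        def mc_call(S0, K, r, sigma, T, n=1_000_000, seed=0):
            """Monte Carlo estimate: simulate terminal prices, discount the payoff."""
            z = np.random.default_rng(seed).standard_normal(n)
            ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
            return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

        # The two values agree to within Monte Carlo error:
        print(bs_call(100, 105, 0.03, 0.2, 1.0), mc_call(100, 105, 0.03, 0.2, 1.0))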

  10. Traffic Simulations on Parallel Computers Using Domain Decomposition Techniques

    Science.gov (United States)

    1995-01-01

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic...

  11. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter; de Jong, Anthonius J.M.

    1992-01-01

    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  12. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    Science.gov (United States)

    1994-08-10

  13. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  14. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  15. Computational simulation of liquid rocket injector anomalies

    Science.gov (United States)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid-fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body-fitted coordinate system along with a conservative control volume formulation is employed. The physical models built in include a k-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  16. Factors promoting engaged exploration with computer simulations

    Directory of Open Access Journals (Sweden)

    Noah S. Podolefsky

    2010-10-01

    Full Text Available This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, making sense of them, and exploring primarily via their own questioning. We analyze interviews with college students using PhET sims in order to demonstrate engaged exploration and to identify factors that can promote this type of inquiry. With minimal explicit guidance, students explore the topic of wave interference in ways that bear similarity to how scientists explore phenomena. PhET sims are flexible tools which allow students to choose their own learning path, but also provide constraints such that students’ choices are generally productive. This type of inquiry is supported by sim features such as concrete connections to the real world, representations that are not available in the real world, analogies to help students make meaning of and connect across multiple representations and phenomena, and a high level of interactivity with real-time, dynamic feedback from the sim. These features of PhET sims enable students to pose questions and answer them in ways that may not be supported by more traditional educational materials.

  17. Analysis of computer networks

    CERN Document Server

    Gebali, Fayez

    2015-01-01

    This textbook presents the mathematical theory and techniques necessary for analyzing and modeling high-performance global networks, such as the Internet. The three main building blocks of high-performance networks are links, switching equipment connecting the links together, and software employed at the end nodes and intermediate switches. This book provides the basic techniques for modeling and analyzing the last two of these components. Topics covered include, but are not limited to: Markov chains and queuing analysis, traffic modeling, interconnection networks, and switch architectures and buffering strategies. The book provides techniques for modeling and analysis of network software and switching equipment; discusses design options used to build efficient switching equipment; includes many worked examples of the application of discrete-time Markov chains to communication systems; and covers the mathematical theory and techniques necessary for analysis.

  18. Gas chromatography–mass spectroscopy optimization by computer simulation, application to the analysis of 93 volatile organic compounds in workplace ambient air

    Energy Technology Data Exchange (ETDEWEB)

    Randon, J., E-mail: randon@univ-lyon1.fr [ISA Institut des Sciences Analytiques, Université Claude Bernard Lyon 1, 5 rue de la Doua, 69100 Villeurbanne (France); Maret, L. [ISA Institut des Sciences Analytiques, Université Claude Bernard Lyon 1, 5 rue de la Doua, 69100 Villeurbanne (France); Ferronato, C. [IRCELYON Institut de recherches sur la catalyse et l’environnement de Lyon, Université Claude Bernard Lyon1, 2 avenue Albert Einstein, 69626 Villeurbanne (France)

    2014-02-17

    Highlights: determination of GC thermodynamic retention parameters from only a few preliminary experiments; simulation of GC separation for any kind of temperature program; identification of coelutions and automatic ion selection for MS quantification; example of application to two sets of VOCs with 16 and 93 compounds; the methodology can be easily transposed to any set of volatile compounds. Abstract: A GC–MS optimization method combining the advantages of chromatographic separation and mass spectrometric detection was designed for a set of 93 volatile organic compounds. Only a few experiments were necessary to determine the thermodynamic retention parameters for all compounds on a RTX-VMS column. From these data, computer simulation was used to predict the retention times of the compounds in temperature-programmed gas chromatography. Then, an automatic selection of ions from the NIST database was performed and compared to the optimum conditions (full separation of the VOCs). This simulation-selection procedure was used to screen a large set of GC and MS conditions in order to quickly design a GC–MS method whatever the set of compounds considered.

  19. Computer simulation of vasectomy for wolf control

    Science.gov (United States)

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.

  20. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    Science.gov (United States)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for real-time simulation of rotary-wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary-wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as exist in typical flight control systems and rotor blade hinges, is discussed.

  1. Affective Computing and Sentiment Analysis

    CERN Document Server

    Ahmad, Khurshid

    2011-01-01

    This volume maps the watershed areas between two 'holy grails' of computer science: the identification and interpretation of affect, including sentiment and mood. The expression of sentiment and mood involves the use of metaphors, especially in emotive situations. Affect computing is rooted in hermeneutics, philosophy, political science and sociology, and is now a key area of research in computer science. The 24/7 news sites and blogs facilitate the expression and shaping of opinion locally and globally. Sentiment analysis, based on text and data mining, is being used in analyzing news.

  2. Detector Simulation: Data Treatment and Analysis Methods

    CERN Document Server

    Apostolakis, J

    2011-01-01

    Detector Simulation in 'Data Treatment and Analysis Methods', part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B1: Detectors for Particles and Radiation. Part 1: Principles and Methods'. This document is part of Part 1 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '4.1 Detector Simulation' of Chapter '4 Data Treatment and Analysis Methods' with the content: 4.1 Detector Simulation 4.1.1 Overview of simulation 4.1.1.1 Uses of detector simulation 4.1.2 Stages and types of simulation 4.1.2.1 Tools for event generation and detector simulation 4.1.2.2 Level of simulation and computation time 4.1.2.3 Radiation effects and background studies 4.1.3 Components of detector simulation 4.1.3.1 Geometry modeling 4.1.3.2 External fields 4.1.3.3 Intro...

  3. Explore Effective Use of Computer Simulations for Physics Education

    Science.gov (United States)

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  4. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  5. How Effective Is Instructional Support for Learning with Computer Simulations?

    Science.gov (United States)

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  6. Agent Based Simulation Output Analysis

    Science.gov (United States)

    2011-12-01

    Some simulation models that would be expected (over long periods of time) not to have a steady state apparently do have one. These simulation models are available free from sigmawiki.com. Agent-based simulations are used in computer animations and movies (for example, in the movie Jurassic Park) as well as to look for emergent social behavior in groups.

  7. Computers for real time flight simulation: A market survey

    Science.gov (United States)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  8. Contributions of muscle imbalance and impaired growth to postural and osseous shoulder deformity following brachial plexus birth palsy: a computational simulation analysis.

    Science.gov (United States)

    Cheng, Wei; Cornwall, Roger; Crouch, Dustin L; Li, Zhongyu; Saul, Katherine R

    2015-06-01

    Two potential mechanisms leading to postural and osseous shoulder deformity after brachial plexus birth palsy are muscle imbalance between functioning internal rotators and paralyzed external rotators and impaired longitudinal growth of paralyzed muscles. Our goal was to evaluate the combined and isolated effects of these 2 mechanisms on transverse plane shoulder forces using a computational model of C5-6 brachial plexus injury. We modeled a C5-6 injury using a computational musculoskeletal upper limb model. Muscles expected to be denervated by C5-6 injury were classified as affected, with the remaining shoulder muscles classified as unaffected. To model muscle imbalance, affected muscles were given no resting tone whereas unaffected muscles were given resting tone at 30% of maximal activation. To model impaired growth, affected muscles were reduced in length by 30% compared with normal whereas unaffected muscles remained normal in length. Four scenarios were simulated: normal, muscle imbalance only, impaired growth only, and both muscle imbalance and impaired growth. Passive shoulder rotation range of motion and glenohumeral joint reaction forces were evaluated to assess postural and osseous deformity. All impaired scenarios exhibited restricted range of motion and increased and posteriorly directed compressive glenohumeral joint forces. Individually, impaired muscle growth caused worse restriction in range of motion and higher and more posteriorly directed glenohumeral forces than did muscle imbalance. Combined muscle imbalance and impaired growth caused the most restricted joint range of motion and the highest joint reaction force of all scenarios. Both muscle imbalance and impaired longitudinal growth contributed to range of motion and force changes consistent with clinically observed deformity, although the most substantial effects resulted from impaired muscle growth. Simulations suggest that treatment strategies emphasizing treatment of impaired longitudinal muscle growth may be most effective in addressing shoulder deformity.

  9. PCTRAN: a transient analysis code for personal computers

    Energy Technology Data Exchange (ETDEWEB)

    Lichi Cliff Po

    1988-05-01

    The PCTRAN code has been developed to enable analysis and real-time reactor simulation to be carried out on personal computers. It is designed to exploit all the advantages of personal computers, including accessibility, interactive capabilities, convenience, economy and the ability to get certain kinds of analysis performed at short notice.

  10. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  11. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  12. Enhanced Computer Aided Simulation of Meshing and Contact With Application for Spiral Bevel Gear Drives

    National Research Council Canada - National Science Library

    Litvin, F

    1999-01-01

    An integrated tooth contact analysis (TCA) computer program for the simulation of meshing and contact of gear drives that calculates transmission errors and the shift of bearing contact for misaligned gear drives has been developed...

  13. Power grid simulation applications developed using the GridPACK™ high performance computing framework

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chen, Yousu; Diao, Ruisheng; Huang, Zhenyu (Henry); Perkins, William; Palmer, Bruce

    2016-12-01

    This paper describes the GridPACK™ software framework for developing power grid simulations that can run on high performance computing platforms, with several example applications (dynamic simulation, static contingency analysis, and dynamic contingency analysis) that have been developed using GridPACK.

  14. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Contents include: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Differentiation.

  15. Computational simulation of structural fracture in fiber composites

    Science.gov (United States)

    Chamis, C. C.; Murthy, P. L. N.

    1990-01-01

    A methodology was developed for the computational simulation of structural fracture in fiber composites. This methodology consists of step-by-step procedures for mixed mode fracture in generic components and of an integrated computer code, Composite Durability Structural Analysis (CODSTRAN). The generic types of composite structural fracture include single and combined mode fracture in beams, laminate free-edge delamination fracture, and laminate center flaw progressive fracture. Structural fracture is assessed in one or all of the following: (1) the displacements increase very rapidly; (2) the frequencies decrease very rapidly; (3) the buckling loads decrease very rapidly; or (4) the strain energy release rate increases very rapidly. These rapid changes are herein assumed to denote imminent structural fracture. Based on these rapid changes, parameters/guidelines are identified which can be used as criteria for structural fracture, inspection intervals, and retirement for cause.

  16. High performance computing system for flight simulation at NASA Langley

    Science.gov (United States)

    Cleveland, Jeff I., II; Sudik, Steven J.; Grove, Randall D.

    1991-01-01

    The computer architecture and components used in the NASA Langley Advanced Real-Time Simulation System (ARTSS) are briefly described and illustrated with diagrams and graphs. Particular attention is given to the advanced Convex C220 processing units, the UNIX-based operating system, the software interface to the fiber-optic-linked Computer Automated Measurement and Control system, configuration-management and real-time supervisor software, ARTSS hardware modifications, and the current implementation status. Simulation applications considered include the Transport Systems Research Vehicle, the Differential Maneuvering Simulator, the General Aviation Simulator, and the Visual Motion Simulator.

  17. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...

  18. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles and implementation of Simulated Annealing as a modern heuristic technique are presented. The Simulated Annealing algorithm is used in solving the real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...
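
    As an illustration of the technique named in this record, the sketch below applies simulated annealing to a toy laboratory-scheduling objective in Python. The cost function, neighborhood move, and geometric cooling schedule are illustrative assumptions, not the authors' formulation.

        import math
        import random

        def simulated_annealing(initial, cost, neighbor, t0=10.0, cooling=0.995, steps=20000):
            """Generic simulated annealing: always accept improving moves and
            accept worsening moves with probability exp(-delta / T)."""
            state = best = initial
            t = t0
            for _ in range(steps):
                cand = neighbor(state)
                delta = cost(cand) - cost(state)
                if delta <= 0 or random.random() < math.exp(-delta / t):
                    state = cand
                    if cost(state) < cost(best):
                        best = state
                t *= cooling  # geometric cooling schedule
            return best

        # Hypothetical lab-scheduling toy: assign 30 sessions to 3 labs x 10 slots,
        # with cost equal to the number of double-booked (lab, slot) pairs.
        sessions = list(range(30))
        def cost(assign):
            slots = list(assign.values())
            return len(slots) - len(set(slots))

        def neighbor(assign):
            new = dict(assign)
            new[random.choice(sessions)] = (random.randrange(3), random.randrange(10))
            return new

        init = {s: (random.randrange(3), random.randrange(10)) for s in sessions}
        print("collisions after annealing:", cost(simulated_annealing(init, cost, neighbor)))

    Since exactly 30 (lab, slot) pairs exist, a collision-free schedule is achievable and the annealer should usually find one.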

  19. Macroevolution simulated with autonomously replicating computer programs.

    Science.gov (United States)

    Yedid, Gabriel; Bell, Graham

    The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.

  20. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  1. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    Science.gov (United States)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  2. Computer Simulation and Operating Characteristics of a Three-Phase Brushless Synchronous Generator

    OpenAIRE

    Cingoski, Vlatko; Mikami, Mitsuru; Inoue, Kenji; Kaneda, Kazufumi; Yamashita, Hideo

    1998-01-01

    This paper deals with numerical computation and simulation of the operating conditions of a three-phase brushless synchronous generator. A voltage-driven nonlinear time-periodic finite element analysis is utilized to compute accurately the magnetic field distribution and the induced voltage and currents. The computation procedure is briefly addressed, followed by the computed results and their comparison with experimental ones. The agreement between results is very good, verifying the computation procedure.

  3. Symbolic Computations in Simulations of Hydromagnetic Dynamo

    Directory of Open Access Journals (Sweden)

    Vodinchar Gleb

    2017-01-01

    Full Text Available The compilation of spectral models of geophysical fluid dynamics and hydromagnetic dynamo involves the calculation of a large number of volume integrals from complex combinations of basis fields. In this paper we describe the automation of this computation with the help of systems of symbolic computations.
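
    As a toy version of the kind of integral the paper automates, the sympy sketch below evaluates a volume integral of a product of two scalar basis fields over the unit ball. The basis functions are arbitrary stand-ins, since the actual spectral basis fields of the dynamo model are not given in this record.

        import sympy as sp

        r, theta, phi = sp.symbols("r theta phi", positive=True)

        # Arbitrary toy basis fields; real dynamo bases combine radial functions
        # and spherical harmonics, but the symbolic integration step is the same.
        f = r * sp.cos(theta)
        g = (1 - r**2) * sp.cos(theta)

        # Volume integral over the unit ball in spherical coordinates,
        # with volume element r^2 sin(theta) dr dtheta dphi.
        integral = sp.integrate(f * g * r**2 * sp.sin(theta),
                                (r, 0, 1), (theta, 0, sp.pi), (phi, 0, 2 * sp.pi))
        print(integral)  # pi/9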

  4. The Analysis of Ship Air Defense: The Simulation Model SEAROADS

    NARCIS (Netherlands)

    Dongen, M.P.F.M. van; Kos, J.

    1995-01-01

    The Simulation, Evaluation, Analysis, and Research On Air Defense Systems model (SEAROADS) is a computer simulation model for evaluating, analyzing, and studying the performance of air defense systems aboard naval frigates. The SEAROADS model simulates an engagement between a given ship and attacking air threats.

  5. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box or cylinder shaped heat sources to more humanlike models. Little effort, however, has been focused on the influence of the geometry. This work provides an investigation of geometrically different computer simulated persons with respect to both local and global airflow distribution. The results show that a simple geometry is sufficient when the global airflow of a ventilated enclosure is considered, as little or no influence of geometry was observed at some distance from the computer simulated person. For local flow conditions, though, a more detailed geometry should be applied in order to assess thermal and atmospheric comfort.

  6. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2017-08-10

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
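
    Of the four approaches the article describes, discrete-event simulation is the one most often applied to ED patient-flow questions; below is a minimal single-queue, multi-server sketch in plain Python. The server count and the arrival and service rates are invented illustrative parameters, not values from the article.

        import heapq
        import random

        def simulate_ed(n_servers=3, arrival_rate=0.5, service_rate=0.2, horizon=10_000):
            """Discrete-event simulation of an M/M/c emergency department queue.
            Returns the mean time patients wait in queue before treatment starts."""
            free, queue, waits = n_servers, [], []
            events = [(random.expovariate(arrival_rate), "arrival")]
            while events:
                t, kind = heapq.heappop(events)
                if t > horizon:
                    break
                if kind == "arrival":
                    heapq.heappush(events, (t + random.expovariate(arrival_rate), "arrival"))
                    if free:  # a clinician is available immediately
                        free -= 1
                        waits.append(0.0)
                        heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                    else:
                        queue.append(t)
                else:  # departure: start the next waiting patient, if any
                    if queue:
                        waits.append(t - queue.pop(0))
                        heapq.heappush(events, (t + random.expovariate(service_rate), "departure"))
                    else:
                        free += 1
            return sum(waits) / len(waits)

        print("mean wait:", simulate_ed())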

  7. High-resolution computer simulations of EKC.

    Science.gov (United States)

    Breadmore, Michael C; Quirino, Joselito P; Thormann, Wolfgang

    2009-02-01

    The electrophoresis simulation software, GENTRANS, has been modified to include the interaction of analytes with an electrolyte additive to allow the simulation of liquid-phase EKC separations. The modifications account for interaction of weak and strong acid and base analytes with a single weak or strong acid or base background electrolyte additive and can be used to simulate a range of EKC separations with both charged and neutral additives. Simulations of separations of alkylphenyl ketones under real experimental conditions were performed using mobility and interaction constant data obtained from the literature and agreed well with experimental separations. Migration times in fused-silica capillaries and linear polyacrylamide-coated capillaries were within 7% of the experimental values, while peak widths were always narrower than the experimental values, but were still within 50% of those obtained by experiment. Simulations of sweeping were also performed; although migration time agreement was not as good as for simple EKC separations, peak widths were in good agreement, being within 1-50% of the experimental values. All simulations for comparison with experimental data were performed under real experimental conditions using a 47 cm capillary and a voltage of 20 kV and represent the first quantitative attempt at simulating EKC separations with and without sweeping.

  8. Computer vision in microstructural analysis

    Science.gov (United States)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced-high school and beginning-college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  9. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  10. Some theoretical issues on computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.
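
    To make the paradigm concrete, here is a minimal sequentially updated cellular automaton over a graph: because vertices are updated one at a time, later updates see earlier results, which is exactly the update-order dependence the paper formalizes. The local-majority rule and the 5-cycle graph are arbitrary examples, not constructions from the paper.

        def sca_step(states, adj, order, rule):
            """One pass of a sequentially updated CA: vertices are updated one at
            a time in the given order, so later updates see earlier results."""
            for v in order:
                neighborhood = [states[v]] + [states[u] for u in adj[v]]
                states[v] = rule(neighborhood)
            return states

        # Example: local-majority rule on a 5-cycle.
        adj = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
        majority = lambda ns: int(sum(ns) > len(ns) / 2)
        states = {0: 1, 1: 0, 2: 1, 3: 1, 4: 0}
        for step in range(3):
            states = sca_step(states, adj, order=sorted(adj), rule=majority)
            print(step, states)

    Running the same rule with a different update order can give a different trajectory, which is the phenomenon the paper's equivalence classes capture.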

  11. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  12. GENOA-PFA: Progressive Fracture in Composites Simulated Computationally

    Science.gov (United States)

    Murthy, Pappu L. N.

    2000-01-01

    GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.

  13. Numerical Implementation and Computer Simulation of Tracer ...

    African Journals Online (AJOL)

    The simulated tracer distribution was most dependent on the source definition and the hydraulic conductivity K of the porous medium. The 12,000 mg/l chloride tracer source was almost completely dispersed within 34 hours. Keywords: Replication, Numerical simulation, ...

  14. Computational Simulation of Droplet Collision Dynamics

    National Research Council Canada - National Science Library

    Law, Chung

    2000-01-01

    ... The energy partition among the various modes was identified. By using the molecular dynamics method, bouncing and coalescence were successfully simulated for the first time without the artificial manipulation of the inter-droplet gaseous film...

  15. Computational snow avalanche simulation in forested terrain

    Science.gov (United States)

    Teich, M.; Fischer, J.-T.; Feistl, T.; Bebi, P.; Christen, M.; Grêt-Regamey, A.

    2014-08-01

    Two-dimensional avalanche simulation software operating in three-dimensional terrain is widely used for hazard zoning and engineering to predict runout distances and impact pressures of snow avalanche events. Mountain forests are an effective biological protection measure against avalanches; however, the protective capacity of forests to decelerate or even to stop avalanches that start within forested areas or directly above the treeline is seldom considered in this context. In particular, runout distances of small- to medium-scale avalanches are strongly influenced by the structural conditions of forests in the avalanche path. We present an evaluation and operationalization of a novel detrainment function implemented in the avalanche simulation software RAMMS for avalanche simulation in forested terrain. The new approach accounts for the effect of forests in the avalanche path by detraining mass, which leads to a deceleration and runout shortening of avalanches. The relationship is parameterized by the detrainment coefficient K [kg m-1 s-2] accounting for differing forest characteristics. We varied K when simulating 40 well-documented small- to medium-scale avalanches, which were released in and ran through forests of the Swiss Alps. Analyzing and comparing observed and simulated runout distances statistically revealed values for K suitable to simulate the combined influence of four forest characteristics on avalanche runout: forest type, crown closure, vertical structure and surface cover, for example, values for K were higher for dense spruce and mixed spruce-beech forests compared to open larch forests at the upper treeline. Considering forest structural conditions within avalanche simulations will improve current applications for avalanche simulation tools in mountain forest and natural hazard management.

  16. Galileo Signal Generation. Simulation Analysis

    OpenAIRE

    Canalda Pedrós, Roger

    2009-01-01

    Project carried out in collaboration with the Department of Computer and Electronic Engineering, University of Limerick. This work presents the navigation signals and their allocation in the radio frequency band used in the new European Global Navigation Satellite System (GNSS): Galileo. All signals are described mathematically and then simulated using the Matlab language (in transmission). Results are shown, thus proving the theory provided by the European Space Agency document: “Open Service Sign...

  17. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.

  18. Biocellion: accelerating computer simulation of multicellular biological system models

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  19. Computer simulation of on-orbit manned maneuvering unit operations

    Science.gov (United States)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation (SOS) laboratory, where computer software models drive a six-degree-of-freedom Moving Base Carriage (MBC) and two target gimbal systems. In particular, key simulation issues and the related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless space environment requires the development of entirely new devices for locomotion, and since access to space is very limited, such devices must be designed, built, and tested within the physical constraints of earth using simulators. The MBC technique used at the SOS laboratory provides simultaneous six-degree-of-freedom motions, gives the pilot proper three-dimensional visual cues, and allows training of on-orbit operations.

  20. Modelling of dusty plasma properties by computer simulation methods

    Energy Technology Data Exchange (ETDEWEB)

    Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)

    2006-04-28

    Computer simulation of dusty plasma properties is performed. The radial distribution functions and the diffusion coefficient are calculated on the basis of Langevin dynamics. A comparison with the experimental data is made.
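
    The equations of motion are not given in this record; as a generic illustration of the Langevin-dynamics machinery it names, the sketch below propagates non-interacting Brownian particles and recovers the diffusion coefficient from the mean-squared displacement. The parameter values and the omission of interparticle (e.g. Yukawa) forces are simplifying assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        D_true, dt, n_steps, n_part = 1.0, 0.01, 2000, 500

        # Overdamped Langevin dynamics: dr = sqrt(2 D dt) * xi, xi ~ N(0, 1) per axis.
        pos = np.zeros((n_part, 3))
        traj = np.empty((n_steps, n_part, 3))
        for i in range(n_steps):
            pos += np.sqrt(2 * D_true * dt) * rng.standard_normal((n_part, 3))
            traj[i] = pos

        # Diffusion coefficient from the mean-squared displacement: MSD(t) = 6 D t in 3D.
        t = dt * np.arange(1, n_steps + 1)
        msd = np.mean(np.sum(traj**2, axis=2), axis=1)
        D_est = np.polyfit(t, msd, 1)[0] / 6
        print(f"estimated D = {D_est:.3f} (input {D_true})")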

  1. Computer Simulation of the Impact of Cigarette Smoking On Humans

    African Journals Online (AJOL)

    2012-12-01

    In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the ... Secondary School curriculum in Nigeria ... Workshops and seminars should be ...

  2. On architectural acoustic design using computer simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    The aim of this paper is to investigate the field of application that an acoustic simulation programme can have during an architectural acoustic design process. The emphasis is put on the first three out of five phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results, as exemplified with reference to the design of Bagsværd Church by Jørn Utzon. The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. With the right tools applied, such programmes make it possible to evaluate acoustic properties prior to the actual construction of a building, so that acoustic design can become an integral part of the architectural design process. The conclusion of the paper is that the application of acoustic simulation programs is most beneficial in the last of the three phases.

  3. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  4. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    Science.gov (United States)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
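
    The complex-variable approach mentioned here is, in general form, the complex-step derivative: perturb the input by a tiny imaginary step and read the derivative off the imaginary part, which avoids subtractive cancellation. The sketch below demonstrates it on an arbitrary test function, not on the solver's residuals.

        import cmath

        def complex_step(f, x, h=1e-30):
            """df/dx ~= Im(f(x + i*h)) / h; there is no subtraction of nearly
            equal numbers, so h can be tiny and the result is machine-accurate."""
            return f(complex(x, h)).imag / h

        # Arbitrary smooth test function (a classic complex-step benchmark).
        f = lambda z: cmath.exp(z) / cmath.sqrt(cmath.sin(z) ** 3 + cmath.cos(z) ** 3)

        x0 = 1.5
        print("complex step:", complex_step(f, x0))
        # Central finite difference for comparison; its accuracy is limited by cancellation.
        h = 1e-6
        print("finite diff :", (f(complex(x0 + h)).real - f(complex(x0 - h)).real) / (2 * h))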

  5. Computer simulation of working stress of heat treated steel specimen

    OpenAIRE

    B. Smoljan; D. Iljkić; S. Smokvina Hanza

    2009-01-01

    Purpose: In this paper, the prediction of working stress of quenched and tempered steel has been done. The working stress was characterized by yield strength and fracture toughness. The method of computer simulation of working stress was applied in workpiece of complex form.Design/methodology/approach: Hardness distribution of quenched and tempered workpiece of complex form was predicted by computer simulation of steel quenching using a finite volume method. The algorithm of estimation of yie...

  6. A simulator for quantum computer hardware

    NARCIS (Netherlands)

    Michielsen, K.F L; de Raedt, H.A.; De Raedt, K.

    We present new examples of the use of the quantum computer (QC) emulator. For educational purposes we describe the implementation of the CNOT and Toffoli gate, two basic building blocks of a QC, on a three qubit NMR-like QC.
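
    For reference, the two gates the emulator implements are small permutation matrices; the numpy sketch below builds CNOT and the three-qubit Toffoli as explicit matrices and applies CNOT to a basis state. This is a generic state-vector construction and says nothing about the emulator's own event-based internals.

        import numpy as np

        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]])   # NOT gate
        P0 = np.array([[1, 0], [0, 0]])  # projector |0><0|
        P1 = np.array([[0, 0], [0, 1]])  # projector |1><1|

        # CNOT (control = first qubit, target = second): |0><0| (x) I + |1><1| (x) X
        CNOT = np.kron(P0, I) + np.kron(P1, X)

        # Toffoli (CCNOT): flip the third qubit iff the first two are both |1>.
        TOFFOLI = np.kron(np.eye(4) - np.kron(P1, P1), I) + np.kron(np.kron(P1, P1), X)

        # Apply CNOT to |10>: the control is 1, so the target flips, giving |11>.
        ket10 = np.zeros(4)
        ket10[0b10] = 1.0
        print(CNOT @ ket10)      # amplitude 1 at index 0b11
        print(TOFFOLI.shape)     # (8, 8) unitary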

  7. Computer Simulations in the Science Classroom.

    Science.gov (United States)

    Richards, John; And Others

    1992-01-01

    Explorer is an interactive environment based on a constructivist epistemology of learning that integrates animated computer models with analytic capabilities for learning science. The system includes graphs, a spreadsheet, scripting, and interactive tools. Two examples involving the dynamics of colliding objects and electric circuits illustrate…

  8. Combat Simulation Using Breach Computer Language

    Science.gov (United States)

    1979-09-01

    Keywords: combat modeling, computer language, BREACH, urban warfare, MOBA, MOUT. Cited: "... MOBA Environment," Technical Memorandum 20-78, US Army Human Engineering Laboratory, Aberdeen Proving Ground, MD, July 1978; "Symposium on ...

  9. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners, upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  10. Studying Scientific Discovery by Computer Simulation.

    Science.gov (United States)

    1983-03-30

    scientific laws that were induced from data before any theory was available to discover the regularities. To the previous examples, we could add Gregor Mendel's ... discoveries (excluding those of Mendel and Mendeleev, which we have not simulated) could have been made. The Role of Theory in Law Induction: BACON's ...

  11. Role of computational efficiency in process simulation

    Directory of Open Access Journals (Sweden)

    Kurt Strand

    1989-07-01

    Full Text Available It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.

  12. Computer Simulation Studies of Trishomocubane Heptapeptide of ...

    African Journals Online (AJOL)

    As part of an extension on the cage peptide chemistry, the present work involves an assessment of the conformational profile of trishomocubane heptapeptide of the type Ac-Ala3-Tris-Ala3-NHMe using molecular dynamics (MD) simulations. All MD protocols were explored within the framework of a molecular mechanics ...

  13. Bodies Falling with Air Resistance: Computer Simulation.

    Science.gov (United States)

    Vest, Floyd

    1982-01-01

    Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
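
    The BASIC listing itself is not reproduced in this record; as a stand-in, here is the second model, m dv/dt = m g - k v^2 (drag proportional to the square of the velocity), integrated with a simple Euler step in Python. The mass and drag coefficient are made-up values; the terminal velocity sqrt(m g / k), where drag balances weight, provides a built-in sanity check.

        import math

        m, g, k = 70.0, 9.81, 0.25   # mass (kg), gravity (m/s^2), drag coefficient (kg/m)
        dt, v, t = 0.01, 0.0, 0.0

        # Second model: m dv/dt = m g - k v^2, integrated by forward Euler.
        while t < 30.0:
            v += dt * (g - (k / m) * v * v)
            t += dt

        v_terminal = math.sqrt(m * g / k)
        print(f"v(30 s) = {v:.2f} m/s, terminal velocity = {v_terminal:.2f} m/s")

    After 30 simulated seconds the computed speed should essentially coincide with the terminal velocity, about 52 m/s for these parameters.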

  14. The Simulation of an Oxidation-Reduction Titration Curve with Computer Algebra

    Science.gov (United States)

    Whiteley, Richard V., Jr.

    2015-01-01

    Although the simulation of an oxidation/reduction titration curve is an important exercise in an undergraduate course in quantitative analysis, that exercise is frequently simplified to accommodate computational limitations. With the use of readily available computer algebra systems, however, such curves for complicated systems can be generated…
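
    As one concrete way such a curve can be generated, the sketch below computes the classic curve for Fe2+ titrated with Ce4+ (a standard textbook system, not necessarily the one treated in the article) by sweeping the electrode potential and solving the electron balance for the titrated fraction, so no iterative root finding is needed. Dilution is neglected and the formal potentials are textbook values.

        import numpy as np

        # Fe3+/Fe2+ formal potential ~0.68 V (1 M H2SO4); Ce4+/Ce3+ ~1.44 V.
        slope = 0.05916                    # base-10 Nernst slope at 25 C, volts

        E = np.linspace(0.45, 1.65, 600)   # sweep the system potential
        r1 = 10 ** ((E - 0.68) / slope)    # [Fe3+]/[Fe2+]
        r2 = 10 ** ((E - 1.44) / slope)    # [Ce4+]/[Ce3+]

        # Electron balance [Fe3+] = [Ce3+], with Fe_total = C and Ce_total = phi*C,
        # gives the titrated fraction phi directly as a function of E:
        phi = (r1 / (1 + r1)) * (1 + r2)

        for f in (0.1, 0.5, 0.99, 1.0, 1.01, 1.5, 2.0):
            print(f"phi = {f:4.2f}  E = {E[np.argmin(np.abs(phi - f))]:.3f} V")

    The printed points reproduce the expected landmarks: E equals the Fe3+/Fe2+ formal potential at phi = 0.5, jumps near the equivalence point, and equals the Ce4+/Ce3+ formal potential at phi = 2.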

  15. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomial scaling of resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with the exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.

  16. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  17. Techniques in micromagnetic simulation and analysis

    Science.gov (United States)

    Kumar, D.; Adeyeye, A. O.

    2017-08-01

    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well informed decision on the details of simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analyses which are used to process simulation results in order to extract physically meaningful and valuable information.
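
    For orientation, the sketch below integrates the LLG equation for a single macrospin in a static field, using the explicit Landau-Lifshitz form dm/dt = -gamma/(1+alpha^2) [m x H + alpha m x (m x H)]. The parameter values are illustrative, and real micromagnetic solvers discretize space and assemble a full effective field rather than the constant H used here.

        import numpy as np

        gamma, alpha = 1.76e11, 0.1      # gyromagnetic ratio (rad s^-1 T^-1), damping
        H = np.array([0.0, 0.0, 0.1])    # static effective field, 0.1 T along z
        m = np.array([1.0, 0.0, 0.0])    # initial unit magnetization along x
        dt = 1e-13                       # time step (s)

        def llg_rhs(m, H):
            """dm/dt = -gamma/(1+alpha^2) * (m x H + alpha * m x (m x H))."""
            mxH = np.cross(m, H)
            return -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))

        for _ in range(20000):
            m = m + dt * llg_rhs(m, H)   # forward Euler for brevity
            m /= np.linalg.norm(m)       # renormalize to keep |m| = 1
        print(m)                         # precesses and damps toward +z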

  18. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available: Fifth International Conference on Structural Engineering, Mechanics and Computation, Cape Town, South Africa, 2-4 September 2013. Computational fluid dynamics simulations and validation of results. M.A. Sitek, M. Cwik, M.A. Gizejowski, Warsaw...

  19. Probability: Actual Trials, Computer Simulations, and Mathematical Solutions.

    Science.gov (United States)

    Walton, Karen Doyle; Walton, J. Doyle

    The purpose of this teaching unit is to approach elementary probability problems in three ways. First, actual trials are performed and results are recorded. Second, a simple computer simulation of the problem, provided on diskette and written for Apple IIe and IIc computers, is run several times. Finally, the mathematical solution of the problem is…
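
    In the same three-way spirit, a modern sketch (hypothetical; the original ran on Apple IIe/IIc hardware) compares a Monte Carlo estimate with the closed-form answer for a classic elementary problem, rolling at least one six in four throws of a die:

    ```python
    import random

    def simulate(trials=100_000, seed=0):
        """Monte Carlo estimate of P(at least one six in four throws)."""
        rng = random.Random(seed)
        hits = sum(
            any(rng.randint(1, 6) == 6 for _ in range(4))
            for _ in range(trials)
        )
        return hits / trials

    exact = 1 - (5 / 6) ** 4      # mathematical solution: ~0.5177
    print(f"simulated: {simulate():.4f}   exact: {exact:.4f}")
    ```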

  20. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    As a result of this, beginners are often at a loss when trying to interact with them. The simulator here proposed therefore is aimed at bridging the gap somewhat, making quantum computer simulation more accessible to novices in the field. Journal of the Nigerian Association of Mathematical Physics Vol. 10 2006: pp. 433- ...

  1. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    Science.gov (United States)

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from the authors) developed to simulate the "Growth of a Population (Yeast)" experiment. Students actively revise their counting techniques with a realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. The program can be modified to introduce such variables…
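
    A hedged sketch of what such a simulation might involve (our assumptions: hourly logistic growth and Poisson-distributed haemocytometer counts; the original program's internals are not given):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def logistic_growth(n0=5e4, rate=0.3, capacity=5e7, hours=48):
        """Hourly logistic growth of a yeast culture, cells per ml."""
        series = [n0]
        for _ in range(hours):
            n = series[-1]
            series.append(n + rate * n * (1 - n / capacity))
        return series

    def haemocytometer_count(density, dilution=100, square_vol_ml=1e-4):
        """Simulated count of one haemocytometer square of a diluted sample."""
        expected = density / dilution * square_vol_ml
        count = rng.poisson(expected)            # counting noise is ~Poisson
        return count * dilution / square_vol_ml  # scale back to cells/ml

    for t, n in enumerate(logistic_growth()):
        if t % 12 == 0:
            print(f"t={t:2d} h   true={n:12.0f}   counted={haemocytometer_count(n):12.0f}")
    ```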

  2. Computational fluid dynamics (CFD) simulation of hot air flow ...

    African Journals Online (AJOL)

    Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern as it will affect moisture transient in a cabinet tray dryer is performed using SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...

  3. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  4. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  5. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  6. Development of a Computer Simulation for a Car Deceleration ...

    African Journals Online (AJOL)

    This is a very practical, technical problem, and it happens every day. In this paper, we studied the factors responsible for this event. Using a computer simulation based on a mathematical model, we implemented the simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...
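
    A minimal sketch of such a braking model, with hypothetical parameters and a constant friction deceleration a = μg, integrates the motion until rest and checks against the closed-form results t = v0/(μg) and d = v0²/(2μg):

    ```python
    G = 9.81  # m/s^2

    def brake(v0_kmh=100.0, mu=0.7, dt=1e-3):
        """Euler integration of braking from v0 to rest; returns (time, distance)."""
        v = v0_kmh / 3.6          # km/h -> m/s
        a = mu * G                # constant friction deceleration
        t = d = 0.0
        while v > 0:
            d += v * dt
            v -= a * dt
            t += dt
        return t, d

    t, d = brake()
    v0 = 100 / 3.6
    print(f"simulated: t={t:.2f} s, d={d:.1f} m")
    print(f"analytic : t={v0/(0.7*G):.2f} s, d={v0**2/(2*0.7*G):.1f} m")
    ```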

  7. Computer Based Modelling and Simulation-Modelling and ...

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 6; Issue 4. Computer Based Modelling and Simulation-Modelling and Simulation with Probability and Throwing Dice. N K Srinivasan. General Article Volume 6 Issue 4 April 2001 pp 69-77 ...

  8. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a posttest examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better…

  9. COMPUTER SIMULATION OF A STIRLING REFRIGERATING MACHINE

    Directory of Open Access Journals (Sweden)

    V.V. Trandafilov

    2015-10-01

    Full Text Available In the present numerical research, a mathematical model for precise performance simulation and detailed behavior of a Stirling refrigerating machine is considered. The mathematical model for an alpha Stirling refrigerating machine with helium as the working fluid will be useful in optimizing these machines' mechanical design. A complete non-linear mathematical model of the machine, including the thermodynamics of helium, heat transfer from the walls, and heat transfer and gas resistance in the regenerator, is developed. Non-dimensional groups are derived, and the mathematical model is numerically solved. Important design parameters are varied and their effect on Stirling refrigerating machine performance determined. The simulation results, which include heat transfer and the coefficient of performance, are presented.

  10. On Architectural Acoustics Design using Computer Simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    room acoustic simulation programs it is now possible to subjectively analyze and evaluate acoustic properties prior to the actual construction of a facility. With the right tools applied, the acoustic design can become an integrated part of the architectural design process. The aim of the present paper… is to investigate the field of application an acoustic simulation program can have during an architectural acoustics design process. A case study is carried out in order to represent the iterative working process of an architect. The working process is divided into five phases and represented by typical results… in each phase - exemplified by Bagsværd Church by Jørn Utzon - and a description of which information would be beneficial to progress in the work. Among other things, the applicability as a tool giving inspiration for finding forms of structures and rooms for an architect compared with an architect without…

  11. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    profile. The pressure profile changes when small molecules partition into the bilayer, and it has previously been suggested that such changes may be related to general anesthesia. MD simulations play an important role when studying the possible coupling between general anesthesia and changes… in the pressure profile, since the pressure profile cannot be measured in traditional experiments. Even so, pressure profile calculations from MD simulations are not trivial due to both fundamental and technical issues. We addressed two such issues, namely the uniqueness of the pressure profile and the effect… BtuCD belongs to the adenosine triphosphate (ATP) binding cassette (ABC) transporter family, which uses ATP to drive active transport of a wide variety of compounds across cell membranes. BtuCD accounts for vitamin B12 import into Escherichia coli and is one of the only ABC transporters for which a reliable…

  12. Computer Simulation of Turbulent Reactive Gas Dynamics

    Directory of Open Access Journals (Sweden)

    Bjørn H. Hjertager

    1984-10-01

    Full Text Available A simulation procedure capable of handling transient compressible flows involving combustion is presented. The method uses the velocity components and pressure as primary flow variables. The differential equations governing the flow are discretized by integration over control volumes. The integration is performed by application of up-wind differencing in a staggered grid system. The solution procedure is an extension of the SIMPLE-algorithm accounting for compressibility effects.
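
    For readers unfamiliar with upwind differencing, the following generic sketch (not the SIMPLE solver itself) applies first-order upwinding to the 1D advection equation ∂u/∂t + c ∂u/∂x = 0 on a uniform grid:

    ```python
    import numpy as np

    def advect_upwind(u, c=1.0, dx=0.01, dt=0.005, steps=100):
        """First-order upwind differencing for du/dt + c du/dx = 0 (c > 0)."""
        cfl = c * dt / dx
        assert cfl <= 1.0, "CFL condition violated"
        for _ in range(steps):
            # difference taken from the upwind (upstream) side
            u[1:] = u[1:] - cfl * (u[1:] - u[:-1])
        return u

    x = np.linspace(0, 1, 101)
    u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)   # square pulse
    u = advect_upwind(u.copy())
    print("pulse centre moved to ~", x[np.argmax(u)])
    ```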

  13. Computer simulation of functioning of elements of security systems

    Science.gov (United States)

    Godovykh, A. V.; Stepanov, B. P.; Sheveleva, A. A.

    2017-01-01

    The article is devoted to the development of an information complex for simulating the functioning of security system elements. The complex is described in terms of its main objectives, its design concept, and the interrelation of its main elements. The proposed computer simulation concept provides an opportunity to simulate security system operation for training security staff under normal and emergency conditions.

  14. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    This report describes the work effort to develop and demonstrate a software framework to support advanced process simulations to evaluate the performance of advanced power systems. Integrated into the framework are a broad range of models, analysis tools, and visualization methods that can be used for the plant evaluation. The framework provides a tightly integrated problem-solving environment, with plug-and-play functionality, and includes a hierarchy of models, ranging from fast running process models to detailed reacting CFD models. The framework places no inherent limitations on the type of physics that can be modeled, numerical techniques, or programming languages used to implement the equipment models, or the type or amount of data that can be exchanged between models. Tools are provided to analyze simulation results at multiple levels of detail, ranging from simple tabular outputs to advanced solution visualization methods. All models and tools communicate in a seamless manner. The framework can be coupled to other software frameworks that provide different modeling capabilities. Three software frameworks were developed during the course of the project. The first framework focused on simulating the performance of the DOE Low Emissions Boiler System Proof of Concept facility, an advanced pulverized-coal combustion-based power plant. The second framework targeted simulating the performance of an Integrated coal Gasification Combined Cycle - Fuel Cell Turbine (IGCC-FCT) plant configuration. The coal gasifier models included both CFD and process models for the commercially dominant systems. Interfacing models to the framework was performed using VES-Open, and tests were performed to demonstrate interfacing CAPE-Open compliant models to the framework. The IGCC-FCT framework was subsequently extended to support Virtual Engineering concepts in which plant configurations can be constructed and interrogated in a three-dimensional, user-centered, interactive

  15. Simulation of scanning transmission electron microscope images on desktop computers

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, C., E-mail: christian.dwyer@mcem.monash.edu.au [Monash Centre for Electron Microscopy, Department of Materials Engineering, Monash University, Victoria 3800 (Australia)

    2010-02-15

    Two independent strategies are presented for reducing the computation time of multislice simulations of scanning transmission electron microscope (STEM) images: (1) optimal probe sampling, and (2) the use of desktop graphics processing units. The first strategy is applicable to STEM images generated by elastic and/or inelastic scattering, and requires minimal effort for its implementation. Used together, these two strategies can reduce typical computation times from days to hours, allowing practical simulation of STEM images of general atomic structures on a desktop computer.

  16. Computer algebra and algebraic analysis

    OpenAIRE

    Castro Jiménez, Francisco Jesús; Lambán Pardo, Laureano (Coordinator); Romero Ibáñez, Ana (Coordinator); Rubio García, Julio (Coordinator)

    2010-01-01

    This paper describes some applications of Computer Algebra to Algebraic Analysis, also known as D-module theory, that is, the algebraic study of linear systems of partial differential equations. We show how to compute different objects and invariants in D-module theory, using Groebner bases for rings of linear differential operators.

  17. Forensic Analysis of Compromised Computers

    Science.gov (United States)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
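
    The script itself is Perl and is not reproduced in the record; a hypothetical Python equivalent of the same idea (walk a directory tree to a chosen depth and emit a spreadsheet-ready file of ownership and timestamp data) could read:

    ```python
    import csv, os, sys, time

    def dump_tree(root, out_csv, max_depth=3):
        """Write ownership/timestamp data for files under 'root' to a CSV file."""
        root = os.path.abspath(root)
        base_depth = root.count(os.sep)
        with open(out_csv, "w", newline="") as fh:
            w = csv.writer(fh)
            w.writerow(["path", "owner_uid", "ctime", "last_access"])
            for dirpath, dirnames, filenames in os.walk(root):
                if dirpath.count(os.sep) - base_depth >= max_depth:
                    dirnames[:] = []        # do not descend further
                    continue
                for name in filenames:
                    p = os.path.join(dirpath, name)
                    st = os.stat(p, follow_symlinks=False)
                    # st_ctime is metadata-change time on Unix, creation on Windows
                    w.writerow([p, st.st_uid,
                                time.ctime(st.st_ctime), time.ctime(st.st_atime)])

    dump_tree(sys.argv[1] if len(sys.argv) > 1 else ".", "tree_report.csv")
    ```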

  18. Computer Simulation of Breast Cancer Screening

    Science.gov (United States)

    2001-07-01

    The signals at A and B may be, respectively, written as ESF_A = P + S (1) and ESF_B = P + S/2 (2), where P is the primary signal and S the scatter signal. The scatter-to-primary ratio at point A (SPR) may be computed from the digital signal values (among other ways) as S = 2 × (ESF_A − ESF_B) (3), P = ESF_A − S (4), SPR = S/P (5). [FIG. 4. (a) Matched primary-only and primary-plus-scatter ESFs and (b) the resulting SPR.]

  19. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  20. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  1. Manpower Analysis Using Discrete Simulation

    Science.gov (United States)

    2015-12-01

    levels, which focuses the analysis on mean performance of the MOEs. The stepwise regression bases initial factor selection on Akaike's Information... important factors that describe the personnel system response (model outputs) as functions of the policy choices (simulation inputs). Multiple regressions... development of all our models. E. METAMODEL CONSTRUCTION: Using linear regression, we can derive relationships between factor settings and observed

  2. Numerical simulation of temperature and thermal stress for nuclear piping by using computational fluid dynamics analysis and Green’s function

    Energy Technology Data Exchange (ETDEWEB)

    Boo, Myung-Hwan [Korea Hydro and Nuclear Power Company, Daejeon (Korea, Republic of); Oh, Chang-Kyun; Kim, Hyun-Su [KEPCO Engineering and Construction Company, Gimcheon (Korea, Republic of); Choi, Choeng-Ryul [ELSOLTEC, Inc., Yongin (Korea, Republic of)

    2017-05-15

    Because thermal fatigue is a well-known damage mechanism in nuclear power plants, accurate stress and fatigue evaluation is highly important. Operating experience shows that the design condition is conservative compared to the actual one. Therefore, various fatigue monitoring methods have been extensively utilized to take the actual operating data into account. However, defining the local temperature in the piping is difficult because temperature-measuring instruments are limited. The purpose of this paper is to define an accurate local temperature in the piping and to evaluate thermal stress using Green's functions (GFs), by performing a series of computational fluid dynamics analyses that consider the complex fluid conditions. The thermal stress determined with the GF approach is then compared with that of the design condition. The fluid dynamics analysis results indicate that the fluid temperature varies slowly compared to the design condition, even when the flow rate changes abruptly. In addition, the resulting thermal stress can decrease significantly when the actual temperature is reflected.
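
    The GF technique is linear superposition: the stress history is the convolution of the temperature-rate history with a precomputed stress response. A hedged sketch, with a made-up exponential kernel standing in for the real FEA-derived Green's function:

    ```python
    import numpy as np

    dt = 1.0                                   # s, sampling interval
    t = np.arange(0, 600, dt)

    # Hypothetical impulse-response Green's function (FEA would supply the real one)
    G = 50.0 * np.exp(-t / 60.0)               # MPa per (degC/s), decaying kernel

    # Fluid temperature history: a smoothed 100 degC ramp (CFD would supply this)
    T = 100.0 / (1.0 + np.exp(-(t - 300.0) / 30.0))
    dT = np.gradient(T, dt)                    # temperature rate of change

    # Duhamel superposition: stress = convolution of dT/dt with G
    stress = np.convolve(dT, G)[: len(t)] * dt
    print(f"peak thermal stress ~ {stress.max():.0f} MPa")
    ```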

  3. Associative Memory computing power and its simulation.

    CERN Document Server

    Ancu, L S; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2015-01-01

    An important step in the ATLAS upgrade program is the installation of a tracking processor, the Fast Tracker (FTK), whose goal is to identify the tracks generated by charged particles produced in the LHC 14 TeV proton-proton collisions. The collisions will generate thousands of hits in each layer of the silicon tracker detector, and track identification is a very challenging computational problem. At the core of the FTK is an associative memory (AM) system, made with hundreds of AM ASIC chips, specifically designed to allow pattern identification in high-density environments at very high speed. This component organizes the following steps of the track identification, providing huge computing power for a specific application. The AM system will in fact be able to reconstruct tracks in tens of microseconds. Within the FTK team there has also been a constant effort to maintain a detailed emulation of the system, to predict the impact of single component features in the final performance and in the ATLAS da...

  4. Computational analysis of phosphopeptide binding to the polo-box domain of the mitotic kinase PLK1 using molecular dynamics simulation.

    Directory of Open Access Journals (Sweden)

    David J Huggins

    2010-08-01

    Full Text Available The Polo-Like Kinase 1 (PLK1) acts as a central regulator of mitosis and is over-expressed in a wide range of human tumours where high levels of expression correlate with a poor prognosis. PLK1 comprises two structural elements, a kinase domain and a polo-box domain (PBD). The PBD binds phosphorylated substrates to control substrate phosphorylation by the kinase domain. Although the PBD preferentially binds to phosphopeptides, it has a relatively broad sequence specificity in comparison with other phosphopeptide binding domains. We analysed the molecular determinants of recognition by performing molecular dynamics simulations of the PBD with one of its natural substrates, CDC25c. Predicted binding free energies were calculated using a molecular mechanics, Poisson-Boltzmann surface area approach. We calculated the per-residue contributions to the binding free energy change, showing that the phosphothreonine residue and the mainchain account for the vast majority of the interaction energy. This explains the very broad sequence specificity with respect to other sidechain residues. Finally, we considered the key role of bridging water molecules at the binding interface. We employed inhomogeneous fluid solvation theory to consider the free energy of water molecules on the protein surface with respect to bulk water molecules. Such an analysis highlights binding hotspots created by elimination of water molecules from hydrophobic surfaces. It also predicts that a number of water molecules are stabilized by the presence of the charged phosphate group, and that this will have a significant effect on the binding affinity. Our findings suggest a molecular rationale for the promiscuous binding of the PBD and highlight a role for bridging water molecules at the interface. We expect that this method of analysis will be very useful for probing other protein surfaces to identify binding hotspots for natural binding partners and small molecule inhibitors.

  5. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
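
    The physical core of such a simulator is the attenuation law along each ray, I = I0·exp(-Σ μi·ti); a toy sketch with made-up attenuation coefficients shows how a polychromatic beam hardens with increasing material thickness:

    ```python
    import numpy as np

    # Hypothetical two-energy-bin spectrum with equal initial intensity
    energies = np.array([40.0, 80.0])      # keV
    I0 = np.array([0.5, 0.5])
    mu = np.array([0.6, 0.2])              # 1/cm, made-up: soft photons attenuate more

    for thickness in (0.0, 1.0, 3.0, 5.0):     # cm of material along the ray
        I = I0 * np.exp(-mu * thickness)       # X-ray attenuation law per energy bin
        mean_E = (energies * I).sum() / I.sum()
        print(f"t={thickness:3.1f} cm  transmitted={I.sum():.3f}  "
              f"mean energy={mean_E:.1f} keV (beam hardening)")
    ```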

  6. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics For all readers interested in developing programming habits in the context of doing phy...

  7. Traffic simulations on parallel computers using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.

  8. Cloud Computing for Rigorous Coupled-Wave Analysis

    Directory of Open Access Journals (Sweden)

    N. L. Kazanskiy

    2012-01-01

    Full Text Available Design and analysis of complex nanophotonic and nanoelectronic structures require significant computing resources. Cloud computing infrastructure allows distributed parallel applications to achieve greater scalability and fault tolerance. The problems of effective use of high-performance computing systems for modeling and simulation of subwavelength diffraction gratings are considered. Rigorous coupled-wave analysis (RCWA) is adapted to the cloud computing environment. In order to accomplish this, the data flow of the RCWA is analyzed and CPU-intensive operations are converted to data-intensive operations. The generated data sets are structured in accordance with the requirements of MapReduce technology.

  9. Computer simulations of adsorbed liquid crystal films

    Science.gov (United States)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  10. Osmosis : a molecular dynamics computer simulation study

    Science.gov (United States)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is an ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

  11. Teaching Physics (and Some Computation) Using Intentionally Incorrect Simulations

    Science.gov (United States)

    Cox, Anne J.; Junkin, William F.; Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco

    2011-05-01

    Computer simulations are widely used in physics instruction because they can aid student visualization of abstract concepts, they can provide multiple representations of concepts (graphical, trajectories, charts), they can approximate real-world examples, and they can engage students interactively, all of which can enhance student understanding of physics concepts. For these reasons, we create and use simulations to teach physics,1,2 but we also want students to recognize that the simulations are only as good as the physics behind them, so we have developed a series of simulations that are intentionally incorrect, where the task is for students to find and correct the errors.3

  12. Computer simulation tests of optimized neutron powder diffractometer configurations

    Energy Technology Data Exchange (ETDEWEB)

    Cussen, L.D., E-mail: Leo@CussenConsulting.com [Cussen Consulting, 23 Burgundy Drive, Doncaster 3108 (Australia); Lieutenant, K., E-mail: Klaus.Lieutenant@helmholtz-berlin.de [Helmholtz Zentrum Berlin, Hahn-Meitner Platz 1, 14109 Berlin (Germany)

    2016-06-21

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  13. Computational algorithms to simulate the steel continuous casting

    Science.gov (United States)

    Ramírez-López, A.; Soto-Cortés, G.; Palomar-Pardavé, M.; Romero-Romo, M. A.; Aguilar-López, R.

    2010-10-01

    Computational simulation is a very powerful tool for analyzing industrial processes in order to reduce operating risks and improve profits from equipment. The present work describes the development of computational algorithms, based on numerical methods, to create a simulator for the continuous casting process, which is the most popular method of producing steel products for the metallurgical industry. The kinematics of industrial processing was computationally reproduced using logically programmed subroutines. The steel cast by each strand was calculated using an iterative method nested in the main loop. The process was repeated at each time step (Δt) to compute the casting time; simultaneously, the steel billets produced were counted and stored. The subroutines were used to create a computational representation of a continuous casting plant (CCP) and to display the simulation of the steel displacement through the CCP. These algorithms were developed to create a simulator using the programming language C++. Algorithms for computer animation of the continuous casting process were created using a graphical user interface (GUI). Finally, the simulator's functionality was shown and validated by comparison with industrial data on the steel production of three casters.
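
    The nested time-step bookkeeping described above can be sketched in a few lines (hypothetical casting speed and billet length; the original simulator is C++, Python is used here for brevity):

    ```python
    def cast(speed_m_min=2.5, billet_len_m=6.0, total_min=120.0, dt_min=0.1):
        """Track cast strand length and count billets cut at each time step."""
        length = 0.0        # metres of strand cast since the last cut
        billets = 0
        t = 0.0
        while t < total_min:
            length += speed_m_min * dt_min      # strand advances each time step
            if length >= billet_len_m:          # cutting torch fires
                billets += 1
                length -= billet_len_m
            t += dt_min
        return billets

    print(cast(), "billets in 2 h from one strand")
    ```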

  14. Computer Simulation for Pain Management Education: A Pilot Study.

    Science.gov (United States)

    Allred, Kelly; Gerardi, Nicole

    2017-10-01

    Effective pain management is an elusive concept in acute care, and inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, as a preference over traditional lecture, and in its potential to change nursing practice. A post-simulation survey with a mixed-methods descriptive design was used. The setting was a college of nursing in a large metropolitan university in the Southeastern United States; the sample was a convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students, and increases in educational gain as well as the potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert-scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. The data provided descriptive statistics for an initial evaluation of the computer simulation. The responses suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and to have potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  15. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    Full Text Available The computer architecture and organization course is essential in all computer science and engineering programs, and the most selected and liked elective course for related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The usage of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment to explain the basic concepts and (2) to increase the students' willingness and ability to learn the material. A lot of visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulating cache memory concepts, we have developed a new visual simulator, EDUCache. In this paper we show that it can be effectively and efficiently used as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems, i.e. the students will also understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
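
    EDUCache itself is not reproduced here, but the cache-memory concepts it visualizes can be illustrated with a minimal direct-mapped cache model (hypothetical geometry of 8 lines × 16-byte blocks):

    ```python
    def simulate_cache(addresses, n_lines=8, block=16):
        """Direct-mapped cache: report hits and misses for an address trace."""
        tags = [None] * n_lines
        hits = misses = 0
        for addr in addresses:
            block_no = addr // block
            line = block_no % n_lines       # index bits select the cache line
            tag = block_no // n_lines       # remaining bits form the tag
            if tags[line] == tag:
                hits += 1
            else:
                misses += 1                 # cold or conflict miss; fill the line
                tags[line] = tag
        return hits, misses

    trace = [i * 4 for i in range(64)] * 2   # two passes over a 256-byte array
    print(simulate_cache(trace))             # second pass re-misses: array exceeds capacity
    ```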

  16. A study of the mechanism of action of Taka-amylase A1 on linear oligosaccharides by product analysis and computer simulation.

    Science.gov (United States)

    Suganuma, T; Matsuno, R; Ohnishi, M; Hiromi, K

    1978-08-01

    The action pattern and mechanism of the Taka-amylase A-catalyzed reaction were studied quantitatively and kinetically by product analysis, using a series of maltooligosaccharides from maltotriose (G3) to maltoheptaose (G7) labeled at the reducing end with 14C-glucose. A marked concentration dependency of the product distribution from the end-labeled oligosaccharides was found, especially with G3 and G4 as substrates. The relative cleavage frequency at the first glycosidic bond counting from the nonreducing end of the substrate increases with increasing substrate concentration. Further product analyses with unlabeled and end-labeled G3 as substrates yielded the following findings: 1) Maltose is produced in much greater yield than glucose from unlabeled G3 at high concentration (73 mM). 2) Maltooligosaccharides higher than the starting substrate were found in the hydrolysate of labeled G3. 3) Nonreducing end-labeled maltose (G-G), which is a specific product of condensation, was found to amount to only about 4% of the total labeled maltose. Based on these findings, it was concluded that transglycosylation plays a significant role in the reaction at high concentrations of G3, although the contribution of condensation cannot be ignored. A new method for evaluating subsite affinities is proposed; it is based on the combination of the kinetic parameter (k0/Km) and the bond-cleavage distribution at a sufficiently low substrate concentration, where transglycosylation and condensation can be ignored. This method was applied to evaluate the subsite affinities of Taka-amylase A. Based on a reaction scheme which involves hydrolysis, transglycosylation and condensation, the time courses of the formation of various products were simulated, using the Runge-Kutta-Gill method. Good agreement with the experimental results was obtained.

  17. Associative Memory computing power and its simulation.

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) chip is an ASIC device specifically designed to perform "pattern matching" at very high speed and with parallel access to memory locations. The most extensive use of such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8000 chips will be installed in 128 VME boards specifically designed for high throughput, in order to exploit the chip's features. Each AM chip will store a database of about 130000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system, with any data inquiry broadcast to all memory elements simultaneously within the same clock cycle (10 ns); data retrieval time is thus independent of the database size. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS FTK processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 μs. The simulation of such a parallelized system is an extremely complex task when executed in comm...

  18. Coupling Computer-Aided Process Simulation and ...

    Science.gov (United States)

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa(TM) process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable

  19. Computer simulations of equilibrium magnetization and microstructure in magnetic fluids

    Science.gov (United States)

    Rosa, A. P.; Abade, G. C.; Cunha, F. R.

    2017-09-01

    -dipole interactions, the standard method of minimum image is both accurate and computationally efficient. Otherwise, lattice sums of magnetic particle interactions are required to accelerate convergence of the equilibrium magnetization. The accuracy of the numerical code is also quantitatively verified by comparing the magnetization obtained from numerical results with asymptotic predictions of high order in the particle volume fraction, in the presence of dipole-dipole interactions. In addition, Brownian Dynamics simulations are used in order to examine magnetization relaxation of a ferrofluid and to calculate the magnetic relaxation time as a function of the magnetic particle interaction strength for a given particle volume fraction and a non-dimensional applied field. The simulations of magnetization relaxation have shown the existence of a critical value of the dipole-dipole interaction parameter. For strength of the interactions below the critical value at a given particle volume fraction, the magnetic relaxation time is close to the Brownian relaxation time and the suspension has no appreciable memory. On the other hand, for strength of dipole interactions beyond its critical value, the relaxation time increases exponentially with the strength of dipole-dipole interaction. Although we have considered equilibrium conditions, the obtained results have far-reaching implications for the analysis of magnetic suspensions under external flow.

  20. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
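
    The pattern, independent workers each running a share of the histories and a master aggregating the tallies, can be mimicked locally with a toy MC (pi estimation standing in for EGS5 transport; a process pool standing in for the cloud's MPI worker nodes):

    ```python
    import random
    from multiprocessing import Pool

    def worker(args):
        """One 'node': run a share of the MC histories independently."""
        n, seed = args
        rng = random.Random(seed)
        return sum(rng.random()**2 + rng.random()**2 <= 1.0 for _ in range(n))

    if __name__ == "__main__":
        total, nodes = 4_000_000, 8
        jobs = [(total // nodes, seed) for seed in range(nodes)]
        with Pool(nodes) as pool:
            hits = sum(pool.map(worker, jobs))   # master aggregates worker tallies
        print("pi ~", 4.0 * hits / total)        # wall time scales ~1/nodes
    ```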

  1. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.

  2. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G' for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G', which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient

  3. Hybrid annealing: Coupling a quantum simulator to a classical computer

    Science.gov (United States)

    Graß, Tobias; Lewenstein, Maciej

    2017-05-01

    Finding the global minimum in a rugged potential landscape is a computationally hard task, often equivalent to relevant optimization problems. Annealing strategies, either classical or quantum, explore the configuration space by evolving the system under the influence of thermal or quantum fluctuations. The thermal annealing dynamics can rapidly freeze the system into a low-energy configuration, and it can be simulated well on a classical computer, but it easily gets stuck in local minima. Quantum annealing, on the other hand, can be guaranteed to find the true ground state and can be implemented in modern quantum simulators; however, quantum adiabatic schemes become prohibitively slow in the presence of quasidegeneracies. Here, we propose a strategy which combines ideas from simulated annealing and quantum annealing. In such a hybrid algorithm, the outcome of a quantum simulator is processed on a classical device. While the quantum simulator explores the configuration space by repeatedly applying quantum fluctuations and performing projective measurements, the classical computer evaluates each configuration and enforces a lowering of the energy. We have simulated this algorithm for small instances of the random energy model, showing that it potentially outperforms both simulated thermal annealing and adiabatic quantum annealing. It becomes most efficient for problems involving many quasidegenerate ground states.
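
    A hedged sketch of the hybrid loop on a random energy model (random bit flips standing in for quantum fluctuations followed by projective measurement; the classical side keeps only energy-lowering configurations):

    ```python
    import random

    def random_energy_model(n_states=256, seed=1):
        """Random energy model: i.i.d. Gaussian energy per configuration."""
        rng = random.Random(seed)
        return [rng.gauss(0.0, 1.0) for _ in range(n_states)]

    def hybrid_anneal(E, n_qubits=8, steps=200, flip_p=0.3, seed=2):
        """Hybrid loop: 'fluctuations' propose configurations; the classical
        computer evaluates each one and enforces a lowering of the energy."""
        rng = random.Random(seed)
        state = rng.randrange(len(E))
        for _ in range(steps):
            # stand-in for applying fluctuations + projective measurement
            proposal = state
            for b in range(n_qubits):
                if rng.random() < flip_p:
                    proposal ^= 1 << b
            if E[proposal] < E[state]:      # classical descent step
                state = proposal
        return state, E[state]

    E = random_energy_model()
    print("found:", hybrid_anneal(E), " true minimum:", min(E))
    ```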

  4. Implementation of a blade element UH-60 helicopter simulation on a parallel computer architecture in real-time

    Science.gov (United States)

    Moxon, Bruce C.; Green, John A.

    1990-01-01

    A high-performance platform for the development of real-time helicopter flight simulations, combining a parallel simulation development and analysis environment with a scalable multiprocessor computer system, is described. Simulation functional decomposition is covered, including the sequencing and data dependency of simulation modules and simulation functional mapping to multiple processors. The multiprocessor-based implementation of a blade-element simulation of the UH-60 helicopter is presented, and a prototype developed for a TC2000 computer is generalized in order to arrive at a portable multiprocessor software architecture. It is pointed out that the proposed approach coupled with a pilot's station creates a setting in which simulation engineers, computer scientists, and pilots can work together in the design and evaluation of advanced real-time helicopter simulations.

  5. Advanced Computer Simulations of Military Incinerators

    Science.gov (United States)

    2004-12-01

    REI analysis has identified the phosphorus in the ash of the coconut shell charcoal as the main cause of the slagging problems. Reacting, two-phase... consists of munitions, including mines, rockets, artillery shells, and bombs containing warfare agents stored at eight sites in the continental United... Furnace (MPF) used to decontaminate drained shells, bulk containers, and self-generated wastes; and • The Deactivation Furnace System (DFS) used to

  6. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, enable the study of advanced nuclear weapons design and manufacturing processes and the analysis of accident scenarios and weapons aging, and provide the tools needed for stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  7. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. The simulators must handle multiprocessing using advanced technologies and distributed applications, including remote ship scenarios and the automation of ship operations.

  8. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S) users. Performing large-scale, massively...

  9. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  
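
    As a taste of the first family of methods treated in the book, the following is a hedged sketch of a finite-difference time-domain update for the one-dimensional scalar wave equation (room-acoustic FDTD is of course three-dimensional; the grid, Gaussian pulse, and pressure-release boundaries here are illustrative assumptions):

      import numpy as np

      c, dx = 343.0, 0.01            # speed of sound (m/s), grid spacing (m)
      dt = 0.5 * dx / c              # time step chosen to satisfy the CFL condition
      n, steps = 400, 600

      x = np.arange(n) * dx
      p = np.exp(-((x - x[n // 2]) / (5 * dx)) ** 2)   # initial Gaussian pulse
      p_prev = p.copy()                                # zero initial velocity

      coef = (c * dt / dx) ** 2
      for _ in range(steps):
          p_next = np.zeros(n)
          # second-order centered update of the scalar wave equation
          p_next[1:-1] = (2 * p[1:-1] - p_prev[1:-1]
                          + coef * (p[2:] - 2 * p[1:-1] + p[:-2]))
          # pressure-release boundaries: p = 0 at both ends of the domain
          p_prev, p = p, p_next

      print("peak pressure after", steps, "steps:", p.max())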

  10. Towards accurate quantum simulations of large systems with small computers.

    Science.gov (United States)

    Yang, Yonggang

    2017-01-24

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations beyond what is otherwise attainable with conventional methods. The method is easily implementable and general for many systems.

  11. Improved Pyrolysis Micro reactor Design via Computational Fluid Dynamics Simulations

    Science.gov (United States)

    2017-05-23

    Briefing charts (25 April 2017 - 23 May 2017): Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations, Ghanshyam L. Vaghjiani. DISTRIBUTION A: Approved for public release. The charts cover the history of the micro-reactor (Chen source, T ≤ 1800 K); cf. S.D. Chambreau et al., International Journal of Mass Spectrometry 2000, 199, 17-27.

  12. Computer simulations for thorium doped tungsten crystals

    Energy Technology Data Exchange (ETDEWEB)

    Eberhard, Bernd

    2009-07-17

    set of Langevin equations, i.e. stochastic differential equations including properly chosen "noise" terms. A new integration scheme is derived for integrating the equations of motion, which closely resembles the well-known Velocity Verlet algorithm. As a first application of the EAM potentials, we calculate the phonon dispersion for tungsten and thorium. Furthermore, the potentials are used to derive the excess volumes of point defects, i.e. for vacancies and Th-impurities in tungsten, grain boundary structures and energies. Additionally, we take a closer look at various stacking fault energies and link the results to the potential splitting of screw dislocations in tungsten into partials. We also compare the energetic stability of screw, edge and mixed-type dislocations. Besides this, we are interested in free enthalpy differences, for which we make use of the Overlapping Distribution Method (ODM), an efficient, albeit computationally demanding, method to calculate free enthalpy differences, with which we address the question of lattice formation, vacancy formation and impurity formation at varying temperatures. (orig.)
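
    The thesis's integration scheme is not reproduced in the abstract; as a hedged stand-in, the sketch below shows a standard velocity-Verlet-like Langevin integrator of the same family, applied to a harmonic test potential (the splitting and all parameters are illustrative assumptions, not the scheme derived in the work):

      import numpy as np

      rng = np.random.default_rng(1)

      m, gamma, kT = 1.0, 0.5, 0.1         # mass, friction coefficient, k_B T
      k, dt, steps = 1.0, 0.01, 100_000    # spring constant, time step, steps

      def force(x):                        # harmonic test potential U = k x^2 / 2
          return -k * x

      x, v = 1.0, 0.0
      noise = np.sqrt(gamma * kT * dt / m) # noise amplitude per half-kick
      v2_sum = 0.0

      for _ in range(steps):
          # velocity-Verlet-like splitting: two damped, noisy half-kicks
          # around a full position update
          v += 0.5 * dt * (force(x) / m - gamma * v) + noise * rng.normal()
          x += dt * v
          v += 0.5 * dt * (force(x) / m - gamma * v) + noise * rng.normal()
          v2_sum += v * v

      print("kinetic temperature m<v^2> =", m * v2_sum / steps, "target:", kT)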

  13. Petascale computation of multi-physics seismic simulations

    Science.gov (United States)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.

    2017-04-01

    Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we show simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high frequency ground motion. The simulations combine a multitude of representations of model complexity: non-linear fault friction, thermal and fluid effects, heterogeneous initial fault stress and fault strength conditions, fault curvature and roughness, and on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes and direct interfaces to community standard data formats. All these factors aim to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential

  14. Computer code analysis of steam generator in thermal-hydraulic test facility simulating nuclear power plant; Ydinvoimalaitosta kuvaavan koelaitteiston hoeyrystimien analysointi tietokoneohjelmilla

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.

    1995-12-31

    In this study, three loss-of-feedwater type experiments performed with the PACTEL facility have been calculated with two computer codes. The purpose of the experiments was to gain information about the behaviour of a horizontal steam generator in a situation where the water level on the secondary side of the steam generator is decreasing. At the same time, data that can be used in the assessment of thermal-hydraulic computer codes were assembled. The purpose of the work was to study the capabilities of two computer codes, APROS version 2.11 and RELAP5/MOD3.1, to calculate the phenomena in a horizontal steam generator. In order to make the comparison of the calculation results easier, the same kind of steam generator model was made for both codes. Only the steam generator was modelled; the rest of the facility was given to the codes as boundary conditions. (23 refs.).

  15. Quantum computer simulation using the CUDA programming model

    Science.gov (United States)

    Gutiérrez, Eladio; Romero, Sergio; Trenas, María A.; Zapata, Emilio L.

    2010-02-01

    Quantum computing emerges as a field that captures great theoretical interest. Its simulation represents a problem with high memory and computational requirements, which makes the use of parallel platforms advisable. In this work we deal with the simulation of an ideal quantum computer on the Compute Unified Device Architecture (CUDA), as such a problem can benefit from the high computational capacities of Graphics Processing Units (GPU). After all, modern GPUs are becoming very powerful computational architectures, which is causing growing interest in their application to general-purpose computation. CUDA provides an execution model oriented towards a more general exploitation of the GPU, allowing its use as a massively parallel SIMT (Single-Instruction Multiple-Thread) multiprocessor. A simulator that takes into account memory reference locality issues is proposed, showing that the challenge of achieving high performance depends strongly on the explicit exploitation of the memory hierarchy. Several strategies have been experimentally evaluated, obtaining good performance results in comparison with conventional platforms.
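
    A minimal sketch of the core operation such a simulator performs, applying a single-qubit gate to an ideal state vector; it is written in NumPy for clarity, whereas the paper maps the same index/locality structure onto CUDA threads and the GPU memory hierarchy:

      import numpy as np

      n = 3                                         # number of qubits
      state = np.zeros(2**n, dtype=complex)
      state[0] = 1.0                                # start in |000>

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

      def apply_1q(state, gate, target, n):
          """Apply a one-qubit gate by reshaping the state vector so the
          target qubit becomes its own tensor axis."""
          psi = state.reshape([2] * n)
          psi = np.moveaxis(psi, target, 0)
          psi = np.tensordot(gate, psi, axes=([1], [0]))
          psi = np.moveaxis(psi, 0, target)
          return psi.reshape(2**n)

      for q in range(n):                            # uniform superposition
          state = apply_1q(state, H, q, n)

      print(np.abs(state) ** 2)                     # 1/8 for every basis state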

  16. Computer simulation of backscattering spectra from paint

    Science.gov (United States)

    Mayer, M.; Silva, T. F.

    2017-09-01

    To study the role of lateral non-homogeneity in backscattering analysis of paintings, a simplified model of paint consisting of randomly distributed spherical pigment particles embedded in oil/binder has been developed. Backscattering spectra for lead white pigment particles in linseed oil have been calculated for 3 MeV H+ at a scattering angle of 165° for pigment volume concentrations ranging from 30 vol.% to 70 vol.% using the program STRUCTNRA. For identical pigment volume concentrations, the heights and shapes of the backscattering spectra depend on the diameter of the pigment particles: this is a structural ambiguity for identical mean atomic concentrations but different lateral arrangements of materials. Only for very small pigment particles are the resulting spectra close to spectra calculated supposing atomic mixing and assuming identical concentrations of all elements. Generally, a good fit can be achieved when evaluating spectra from structured materials assuming atomic mixing of all elements and laterally homogeneous depth distributions. However, the derived depth profiles are inaccurate by a factor of up to 3. The depth range affected by this structural ambiguity extends from the surface to a depth of roughly 0.5-1 pigment particle diameters. Accurate quantitative evaluation of backscattering spectra from paintings therefore requires taking the correct microstructure of the paint layer into account.

  17. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
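
    The symbolic workflow is not tied to MAPLE; the hedged SymPy sketch below derives second-order stochastic perturbation estimates for a response function of one random parameter (the cantilever-deflection response u(b) and the truncation order are illustrative assumptions):

      import sympy as sp

      b, b0, var = sp.symbols('b b_0 sigma2', positive=True)
      P, L = sp.symbols('P L', positive=True)

      # response function of the random parameter b, e.g. the tip deflection
      # of a cantilever with random bending stiffness b: u(b) = P L^3 / (3 b)
      u = P * L**3 / (3 * b)

      # second-order perturbation: Taylor-expand u about the mean b0,
      # E[u] ~ u(b0) + (1/2) u''(b0) Var[b]
      mean_u = u.subs(b, b0) + sp.Rational(1, 2) * sp.diff(u, b, 2).subs(b, b0) * var
      # first-order estimate of the variance: Var[u] ~ (u'(b0))^2 Var[b]
      var_u = sp.diff(u, b).subs(b, b0)**2 * var

      print(sp.simplify(mean_u))
      print(sp.simplify(var_u))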

  18. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  19. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
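
    A minimal example of the sampling-based approach the book introduces: a Monte Carlo estimate of an expectation together with its standard error (the integrand and sample size are arbitrary illustrations):

      import numpy as np

      rng = np.random.default_rng(42)

      # Monte Carlo estimate of E[g(X)] with X ~ N(0, 1) and g(x) = x^4;
      # the exact answer is 3, the fourth moment of the standard normal
      n = 100_000
      g = rng.normal(size=n) ** 4
      estimate = g.mean()
      stderr = g.std(ddof=1) / np.sqrt(n)

      print(f"estimate = {estimate:.3f} +/- {stderr:.3f} (exact: 3)")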

  20. Simulation of Turing Machine with uEAC-Computable Functions

    Directory of Open Access Journals (Sweden)

    Yilin Zhu

    2015-01-01

    Full Text Available The micro-Extended Analog Computer (uEAC) is an electronic implementation inspired by Rubel's EAC model. In this study, a fully connected uEAC array is proposed to overcome the limitations of a single uEAC, within which each uEAC unit is connected to all the other units by some weights. Then its computational capabilities are investigated by proving that a Turing machine M can be simulated with uEAC-computable functions, even in the presence of bounded noise.

  1. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  2. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  3. Multi-threaded, discrete event simulation of distributed computing systems

    Science.gov (United States)

    Legrand, Iosif; MONARC Collaboration

    2001-10-01

    The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and modeling of data access patterns, and of many jobs running concurrently on large scale distributed systems and exchanging very large amounts of data. A process oriented approach for discrete event simulation is well suited to describe various activities running concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects or "Active Objects" can provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java (TM) technology which provides adequate tools for developing a flexible and distributed process oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large scale computing systems in a wide range of possible architectures, from centralized to highly distributed. Comparison between queuing theory and realistic client-server measurements is also presented.
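
    The comparison against queuing theory can be illustrated with a deliberately simplified sketch: a single-server FIFO queue simulated job by job and checked against the analytic M/M/1 mean waiting time (MONARC itself is a process-oriented, Java-based tool; this fragment only mirrors the validation idea):

      import random

      random.seed(3)
      lam, mu, n_jobs = 0.8, 1.0, 200_000       # arrival rate, service rate, jobs

      t, arrivals = 0.0, []
      for _ in range(n_jobs):                   # Poisson arrival process
          t += random.expovariate(lam)
          arrivals.append(t)

      free_at, wait_sum = 0.0, 0.0
      for a in arrivals:                        # FIFO single server
          start = max(a, free_at)               # wait until the server frees up
          wait_sum += start - a                 # time spent queueing
          free_at = start + random.expovariate(mu)

      rho = lam / mu
      print("simulated mean wait:", wait_sum / n_jobs)
      print("M/M/1 theory Wq    :", rho / (mu - lam))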

  4. Improving a Computer Networks Course Using the Partov Simulation Engine

    Science.gov (United States)

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  5. Atomic Force Microscopy and Real Atomic Resolution. Simple Computer Simulations

    NARCIS (Netherlands)

    Koutsos, V.; Manias, E.; Brinke, G. ten; Hadziioannou, G.

    1994-01-01

    Using a simple computer simulation for AFM imaging in the contact mode, pictures with true and false atomic resolution are demonstrated. The surface probed consists of two f.c.c. (111) planes and an atomic vacancy is introduced in the upper layer. Changing the size of the effective tip and its

  6. Using computer simulations to improve concept formation in chemistry

    African Journals Online (AJOL)

    By incorporating more visual material into a chemistry lecture, the lecturer may succeed in limiting the overloading of the students' short-term memory, often the major factor leading to misconceptions. The goal of this research project was to investigate whether computer simulations used as a visually-supporting ...

  7. Computer Simulation of the Impact of Cigarette Smoking On Humans

    African Journals Online (AJOL)

    In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the next 50 years, if no government intervention is exercised to control the behaviour of smokers. The statistical indices derived from the previous article (WAJIAR Volume 4) in the series ...

  8. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.

  9. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on our experience thus far and the directions we wish to pursue further in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  10. Development of computer simulation models for pedestrian subsystem impact tests

    NARCIS (Netherlands)

    Kant, R.; Konosu, A.; Ishikawa, H.

    2000-01-01

    The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a

  11. Computer simulation of cytoskeleton-induced blebbing in lipid membranes

    DEFF Research Database (Denmark)

    Spangler, E. J.; Harvey, C. W.; Revalee, J. D.

    2011-01-01

    Blebs are balloon-shaped membrane protrusions that form during many physiological processes. Using computer simulation of a particle-based model for self-assembled lipid bilayers coupled to an elastic meshwork, we investigated the phase behavior and kinetics of blebbing. We found that blebs form...

  12. Learner Perceptions of Realism and Magic in Computer Simulations.

    Science.gov (United States)

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…

  13. Scaffolding learners in designing investigation assignments for a computer simulation

    NARCIS (Netherlands)

    Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.

    2006-01-01

    This study examined the effect of scaffolding students who learned by designing assignments for a computer simulation on the physics topic of alternating circuits. We compared the students' assignments and the knowledge acquired in a scaffolded group (N=23) and a non-scaffolded group (N=19). The

  14. Biology Students Building Computer Simulations Using StarLogo TNG

    Science.gov (United States)

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  15. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  16. The acoustical history of Hagia Sophia revived through computer simulations

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Weitze, C.A.; Christensen, Claus Lynge

    2002-01-01

    The present paper deals with acoustic computer simulations of Hagia Sophia, which is characterized not only by being one of the largest worship buildings in the world, but also by – in its 1500 year history – having served three purposes: as a church, as a mosque and today as a museum...

  17. Computer simulation study of water using a fluctuating charge model

    Indian Academy of Sciences (India)

    Unknown

    Abstract. Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to ...

  18. COMPUTER SIMULATION OF DISPERSED MATERIALS MOTION IN ROTARY TILTING FURNACES

    Directory of Open Access Journals (Sweden)

    S. L. Rovin

    2016-01-01

    Full Text Available The article presents the results of computer simulation of the motion of dispersed materials in rotary furnaces with an inclined axis of rotation. New data have been obtained on the dynamics of the moving layer, enhancing the understanding of the heat and mass transfer processes occurring in the layer.

  19. Monte Carlo simulation by computer for life-cycle costing

    Science.gov (United States)

    Gralow, F. H.; Larson, W. J.

    1969-01-01

    Monte Carlo simulation by computer enables accurate cost estimates by predicting the behavior and support requirements of a system over its entire life cycle. The approach reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
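
    A hedged sketch of the idea in modern terms: sample uncertain operating and maintenance costs over the life cycle and report the distribution of total cost (every cost figure and distribution below is invented for illustration):

      import numpy as np

      rng = np.random.default_rng(7)
      n, years = 50_000, 15                     # simulated life cycles, years each

      procurement = 1.2e6                       # fixed initial procurement cost ($)
      # uncertain annual operating cost, lognormal around ~$80k
      operating = rng.lognormal(np.log(8e4), 0.2, size=(n, years)).sum(axis=1)
      # unscheduled maintenance: ~2 Poisson events per year at ~$15k each
      repairs = rng.poisson(2.0, size=(n, years)).sum(axis=1) * 1.5e4

      total = procurement + operating + repairs
      print(f"mean life-cycle cost: ${total.mean():,.0f}")
      print(f"90th percentile     : ${np.percentile(total, 90):,.0f}")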

  20. Computational Simulation of a Water-Cooled Heat Pump

    Science.gov (United States)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  1. Numerical analysis mathematics of scientific computing

    CERN Document Server

    Kincaid, David

    2009-01-01

    This book introduces students with diverse backgrounds to various types of mathematical analysis that are commonly needed in scientific computing. The subject of numerical analysis is treated from a mathematical point of view, offering a complete analysis of methods for scientific computing with appropriate motivations and careful proofs. In an engaging and informal style, the authors demonstrate that many computational procedures and intriguing questions of computer science arise from theorems and proofs. Algorithms are presented in pseudocode, so that students can immediately write computer

  2. Simulating the immune response on a distributed parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Castiglione, F. [Univ. of Catania (Italy); Bernaschi, M. [Via Shanghai, Rome (Italy); Succi, S. [IAC/CNR, Rome (Italy)

    1997-06-01

    The application of ideas and methods of statistical mechanics to problems of biological relevance is one of the most promising frontiers of theoretical and computational mathematical physics. Among others, the computer simulation of immune system dynamics stands out as one of the prominent candidates for this type of investigation. In recent years, immunological research has been drawing increasing benefit from advanced mathematical modeling on modern computers. In particular, Cellular Automata (CA), i.e., fully discrete dynamical systems evolving according to boolean laws, appear extremely well suited to the computer simulation of biological systems. A prominent example of immunological CA is the Celada-Seiden automaton, which has proven capable of providing several new insights into the dynamics of the immune system response. Until now, the Celada-Seiden automaton has not been in a position to exploit the impressive advances of computer technology, notably parallel processing, simply because no parallel version of this automaton had been developed. In this paper we fill this gap and describe a parallel version of the Celada-Seiden cellular automaton aimed at simulating the dynamic response of the immune system. Details on the parallel implementation as well as performance data on the IBM SP2 parallel platform are presented and commented on.

  3. Computer-simulated development process of Chinese characters font cognition

    Science.gov (United States)

    Chen, Jing; Mu, Zhichun; Sun, Dehui; Hu, Dunli

    2008-10-01

    The study of Chinese character cognition is an important research topic in cognitive science and computer science, especially artificial intelligence. In this paper, according to the traits of Chinese characters, a database of Chinese character font representations and a model for computer simulation of Chinese character font cognition are constructed from the perspective of cognitive science. The font cognition of Chinese characters is in fact a gradual process involving the accumulation of knowledge. Using the method of computer simulation, a development model of Chinese character cognition was constructed; this is an important research topic within Chinese character cognition. The model is based on a self-organizing neural network and an adaptive resonance theory (ART) neural network. By combining the SOFM and ART2 networks, two sets of input were trained, and the development process of Chinese character font cognition was simulated through training and testing. The results from this model could then be compared with the results obtained using SOFM alone. The analysis suggests that the model is able to account for some empirical results and can, to a degree, simulate the development process of Chinese character cognition.

  4. Computer simulation and image guidance for individualised dynamic spinal stabilization.

    Science.gov (United States)

    Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A

    2015-08-01

    Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.

  5. ORGANIZATION OF INTERACTIVE LEARNING WITH MS EXCEL AS A TOOL FOR COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Natalya Vladislavovna Manyukova

    2017-06-01

    Full Text Available Purpose. The article addresses the requirement of the third-generation Federal State Educational Standards for the use of interactive forms of learning. The subject of analysis is the ways of activating the cognitive activity of students. The authors set themselves the goal of selecting the best forms and means of organizing the educational process at the university. Methodology. The study draws on a literature review, simulation, observation and experiment. Results. The authors describe the characteristics and components of the computer simulation method. Based on an analysis of the practical application of this method, the stages necessary for its implementation in a training session were formulated. As one of the instruments for implementing the method of computer simulation, the authors consider the MS Excel program. The article provides concrete examples of using spreadsheets for computer modeling. Practical implications. The results of the study can be used in teaching different disciplines.

  6. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    Science.gov (United States)

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem-solving approach to teaching computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special-purpose language, DYNAMO. (ML)
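
    A hedged sketch of what a DYNAMO-style continuous-system simulation reduces to: level (stock) and rate equations advanced with a fixed time step DT (the population model and its coefficients are illustrative, not from the article):

      # DYNAMO-style stock-and-flow model integrated with Euler's method:
      # a population with a birth inflow and a death outflow
      dt, t_end = 0.25, 50.0
      birth_rate, death_rate = 0.04, 0.025

      population = 1000.0                       # the "level" (stock)
      t = 0.0
      while t < t_end:
          births = birth_rate * population      # "rate" equations
          deaths = death_rate * population
          population += dt * (births - deaths)  # level: L = L + DT*(in - out)
          t += dt

      print(f"population after {t_end:.0f} years: {population:.0f}")
      # closed form for comparison: 1000 * exp((0.04 - 0.025) * 50) ~ 2117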

  7. Computational cell biology: spatiotemporal simulation of cellular events.

    Science.gov (United States)

    Slepchenko, Boris M; Schaff, James C; Carson, John H; Loew, Leslie M

    2002-01-01

    The field of computational cell biology has emerged within the past 5 years because of the need to apply disciplined computational approaches to build and test complex hypotheses on the interacting structural, physical, and chemical features that underlie intracellular processes. To meet this need, newly developed software tools allow cell biologists and biophysicists to build models and generate simulations from them. The construction of general-purpose computational approaches is especially challenging if the spatial complexity of cellular systems is to be explicitly treated. This review surveys some of the existing efforts in this field with special emphasis on a system being developed in the authors' laboratory, Virtual Cell. The theories behind both stochastic and deterministic simulations are discussed. Examples of respective applications to cell biological problems in RNA trafficking and neuronal calcium dynamics are provided to illustrate these ideas.
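
    As a minimal illustration of the stochastic side, the sketch below runs Gillespie's stochastic simulation algorithm for a simple birth-death process (Virtual Cell's engines are far more general; the rate constants here are arbitrary):

      import random

      random.seed(5)

      # birth-death process: production at rate k_on, degradation at k_off * n
      k_on, k_off = 10.0, 0.5
      n, t, t_end = 0, 0.0, 100.0

      while t < t_end:
          a_prod, a_deg = k_on, k_off * n       # reaction propensities
          a_total = a_prod + a_deg
          t += random.expovariate(a_total)      # waiting time to the next event
          if random.random() < a_prod / a_total:
              n += 1                            # production event
          else:
              n -= 1                            # degradation event

      print("copy number near t =", t_end, ":", n,
            "(deterministic steady state: k_on/k_off = 20)")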

  8. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  9. AFFECTIVE COMPUTING AND AUGMENTED REALITY FOR CAR DRIVING SIMULATORS

    Directory of Open Access Journals (Sweden)

    Dragoș Datcu

    2017-12-01

    Full Text Available Car simulators are essential for training and for analyzing the behavior, the responses and the performance of the driver. Augmented Reality (AR) is the technology that enables virtual images to be overlaid on views of the real world. Affective Computing (AC) is the technology that helps read emotions by means of computer systems, by analyzing body gestures, facial expressions, speech and physiological signals. The key aspect of the research lies in investigating novel interfaces that help build situational awareness and emotional awareness, to enable affect-driven remote collaboration in AR for car driving simulators. The problem addressed is how to build situational awareness (using AR technology) and emotional awareness (using AC technology), and how to integrate these two distinct technologies [4] into a unified affective framework for training in a car driving simulator.

  10. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, which was a significantly higher rate than the 73% concordance rate (concordance in 19 of 26 patients) obtained by review of 2D images only. The 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.

  11. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; et al.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task and for general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  12. Computational Fluid Dynamics Analysis of Freeze Drying Process and Equipment

    OpenAIRE

    Varma, Nikhil P.

    2014-01-01

    Freeze drying is an important, but expensive, inefficient and time consuming process in the pharmaceutical, chemical and food processing industries. Computational techniques could be a very effective tool in predictive design and analysis of both freeze drying process and equipment. This work is an attempt at using Computational Fluid Dynamics (CFD) and numerical simulations as a tool for freeze drying process and equipment design. Pressure control is critical in freeze dryers, keeping in v...

  13. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  14. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. The location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
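
    A hedged sketch of the remodeling update described above, with the finite element solve replaced by a fixed strain-energy-density field (the rate constant, reference stimulus, and density bounds are illustrative assumptions, not the study's values):

      import numpy as np

      rng = np.random.default_rng(11)

      n_elem = 1000
      rho = np.full(n_elem, 0.8)             # uniform starting density (g/cm^3)
      k_ref = 0.25                           # reference stimulus per unit mass
      B, dt = 1.0, 1.0                       # remodeling rate constant, time step

      # stand-in for the FE solve: element strain-energy density under load
      sed = rng.gamma(shape=2.0, scale=0.1, size=n_elem)

      for _ in range(10):                    # ten loading iterations
          stimulus = sed / rho               # remodeling stimulus per unit mass
          rho += dt * B * (stimulus - k_ref) # adapt density toward equilibrium
          rho = np.clip(rho, 0.01, 1.8)      # keep densities physiological

      print("density range after remodeling:", rho.min(), "-", rho.max())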

  15. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style. It contains clear, well-presented mathematical descriptions of many of the most important algorithms and techniques used in computational physics. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed including not only the standard Euler and Runge Kutta method but also multistep methods and the class of Verlet methods which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  16. Two-dimensional computer simulation of high intensity proton beams

    CERN Document Server

    Lapostolle, Pierre M

    1972-01-01

    A computer program has been developed which simulates the two- dimensional transverse behaviour of a proton beam in a focusing channel. The model is represented by an assembly of a few thousand 'superparticles' acted upon by their own self-consistent electric field and an external focusing force. The evolution of the system is computed stepwise in time by successively solving Poisson's equation and Newton's law of motion. Fast Fourier transform techniques are used for speed in the solution of Poisson's equation, while extensive area weighting is utilized for the accurate evaluation of electric field components. A computer experiment has been performed on the CERN CDC 6600 computer to study the nonlinear behaviour of an intense beam in phase space, showing under certain circumstances a filamentation due to space charge and an apparent emittance growth. (14 refs).
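
    A hedged modern sketch of the same particle-in-cell cycle in two dimensions, with an FFT Poisson solve and periodic boundaries; nearest-grid-point deposition is used for brevity where the CERN code used the more accurate area weighting, and all quantities are in normalized units:

      import numpy as np

      rng = np.random.default_rng(2)

      ng, L, npart = 64, 1.0, 20_000        # grid points per side, box size, particles
      dx, dt, steps, qm = L / ng, 0.05, 50, 1.0

      pos = rng.random((npart, 2)) * L      # uniform positions
      vel = 0.01 * rng.normal(size=(npart, 2))

      kvec = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
      KX, KY = np.meshgrid(kvec, kvec, indexing="ij")
      K2 = KX**2 + KY**2
      K2[0, 0] = 1.0                        # avoid dividing by zero at k = 0

      for _ in range(steps):
          # deposit charge on the grid (nearest grid point)
          idx = (pos / dx).astype(int) % ng
          rho = np.zeros((ng, ng))
          np.add.at(rho, (idx[:, 0], idx[:, 1]), 1.0)
          rho = rho / rho.mean() - 1.0      # neutralizing uniform background

          # solve Poisson's equation in Fourier space: laplacian(phi) = -rho
          phi = np.real(np.fft.ifft2(np.fft.fft2(rho) / K2))
          Ex = -(np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / (2 * dx)
          Ey = -(np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / (2 * dx)

          # gather the field at each particle and advance with Newton's law
          vel[:, 0] += qm * Ex[idx[:, 0], idx[:, 1]] * dt
          vel[:, 1] += qm * Ey[idx[:, 0], idx[:, 1]] * dt
          pos = (pos + vel * dt) % L        # periodic boundaries

      print("rms velocity after", steps, "steps:", np.sqrt((vel**2).mean()))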

  17. Simulating an aerospace multiprocessor. [for space guidance computers

    Science.gov (United States)

    Mallach, E. G.

    1976-01-01

    The paper describes a simulator which was used to evaluate the architecture of an aerospace multiprocessor. The simulator models interactions among the processors, memories, the central data bus, and a possible 'job stack'. Special features of the simulator are discussed, including the use of explicitly coded and individually distinguishable 'job models' instead of a statistically defined 'job mix' and a specialized Job Model Definition Language to automate the detailed coding of the models. Some results are presented which show that when the simulator was employed in conjunction with queuing theory and Markov-process analysis, more insight into system behavior was obtained than would have been with any one technique alone.

  18. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics (CFD) program for room air distribution is introduced to improve the predictions of both the energy consumption and the indoor environment. The building energy performance...... simulation program requires a detailed description of the energy flow in the air movement which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three...... program and a building energy performance simulation program will improve both the energy consumption data and the prediction of thermal comfort and air quality in a selected area of the building....

  19. Simulation and Analysis of Temperature Distribution and Material Properties Change of a Thermal Heat sink Undergoing Thermal Loading in a Mobile Computer

    Science.gov (United States)

    Xavier, A.; Lim, C. S.

    2015-09-01

    This paper is aimed at studying the thermal distribution and its associated effects on a thermal heat sink of a mobile computer (laptop). Possible thermal effects are investigated using the Finite-Element Method with the help of FEM software (Ansys Workbench 14). Physical changes of the structure, such as temperature change and deformation, are measured and used as the basis for comparison between heat sink models. The paper also studies the effect of thermal loading on the materials in the heat sink hardware, in terms of stresses that may arise due to physical restraints in the hardware, and provides an optimized solution that reduces the form factor to be comparable to an Ultrabook-class heat sink. The optimized solution is based on a cylindrical fin concept.

  20. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Directory of Open Access Journals (Sweden)

    Waltemath Dagmar

    2011-12-01

    Full Text Available Abstract Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used
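
    A hedged sketch of the skeleton of such a document, generated here with Python's xml.etree; the model source file is hypothetical, and pieces required by the full specification (e.g. the KiSAO algorithm annotation, data generators and outputs) are omitted:

      import xml.etree.ElementTree as ET

      # minimal SED-ML Level 1 Version 1 skeleton: one model, one uniform
      # time-course simulation, and one task binding them together
      root = ET.Element("sedML", xmlns="http://sed-ml.org/", level="1", version="1")

      models = ET.SubElement(root, "listOfModels")
      ET.SubElement(models, "model", id="model1",
                    language="urn:sedml:language:sbml",
                    source="oscillator.xml")        # hypothetical SBML file

      sims = ET.SubElement(root, "listOfSimulations")
      ET.SubElement(sims, "uniformTimeCourse", id="sim1",
                    initialTime="0", outputStartTime="0",
                    outputEndTime="100", numberOfPoints="1000")

      tasks = ET.SubElement(root, "listOfTasks")
      ET.SubElement(tasks, "task", id="task1",
                    modelReference="model1", simulationReference="sim1")

      ET.ElementTree(root).write("experiment.sedml",
                                 xml_declaration=True, encoding="utf-8")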

  1. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research
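
    As a concrete illustration of the document structure just described (a sketch only, not taken from the paper), the following Python snippet assembles a minimal SED-ML Level 1 Version 1 skeleton with one model, one uniform time-course simulation, and one task. The model source file, element identifiers, and the CVODE algorithm annotation are illustrative placeholders.

        import xml.etree.ElementTree as ET

        # Root element of a SED-ML Level 1 Version 1 document
        root = ET.Element("sedML", {"xmlns": "http://sed-ml.org/",
                                    "level": "1", "version": "1"})

        # Which model to use (id and source file are hypothetical)
        models = ET.SubElement(root, "listOfModels")
        ET.SubElement(models, "model", {"id": "model1",
                                        "language": "urn:sedml:language:sbml",
                                        "source": "model.xml"})

        # Which simulation procedure to run: a uniform time course
        sims = ET.SubElement(root, "listOfSimulations")
        sim = ET.SubElement(sims, "uniformTimeCourse",
                            {"id": "sim1", "initialTime": "0",
                             "outputStartTime": "0", "outputEndTime": "100",
                             "numberOfPoints": "1000"})
        ET.SubElement(sim, "algorithm", {"kisaoID": "KISAO:0000019"})  # CVODE

        # Bind the model to the simulation procedure
        tasks = ET.SubElement(root, "listOfTasks")
        ET.SubElement(tasks, "task", {"id": "task1",
                                      "modelReference": "model1",
                                      "simulationReference": "sim1"})

        print(ET.tostring(root, encoding="unicode"))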

  2. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  3. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.
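
    To make the computational pattern concrete, here is a minimal serial sketch (assumed details, not the authors' code) of the SPH density summation that such benchmarks parallelize. The particle cloud, mass, and smoothing length are illustrative; a production code would replace the all-pairs distance matrix with cell lists, which is exactly the data-access pattern the paper tunes per architecture.

        import numpy as np

        def cubic_spline_w(r, h):
            """Standard 3D cubic-spline SPH kernel."""
            q = r / h
            sigma = 1.0 / (np.pi * h**3)
            return sigma * np.where(q < 1.0, 1.0 - 1.5*q**2 + 0.75*q**3,
                                    np.where(q < 2.0, 0.25*(2.0 - q)**3, 0.0))

        rng = np.random.default_rng(0)
        pos = rng.random((1000, 3))     # illustrative particle positions
        mass, h = 1.0, 0.05

        # All-pairs distances: the hot loop that MIC, GPU, and CPU versions
        # reorganize around their respective memory hierarchies
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        rho = (mass * cubic_spline_w(d, h)).sum(axis=1)
        print(rho.mean())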

  4. Computational analysis of unmanned aerial vehicle (UAV)

    Science.gov (United States)

    Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran

    2017-01-01

    A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST was designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium-range and medium-endurance UAV. A commercial CFD solver is used to simulate the steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the Lift Coefficient (CL), Drag Coefficient (CD), Pitching Moment Coefficient (CM) and Yawing Moment Coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters show very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, the visualization of the velocity field and static pressure contours indicates satisfactory agreement with the proposed design. Turbulence is predicted with the k-ω SST turbulence model within the computational fluid dynamics code.

  5. Scheduling of a computer integrated manufacturing system: A simulation study

    Directory of Open Access Journals (Sweden)

    Nadia Bhuiyan

    2011-12-01

    Full Text Available Purpose: The purpose of this paper is to study the effect of selected scheduling dispatching rules on the performance of an actual CIM system using different performance measures and to compare the results with the literature. Design/methodology/approach: To achieve this objective, a computer simulation model of the existing CIM system is developed to test the performance of different scheduling rules with respect to mean flow time, machine efficiency and total run time as performance measures. Findings: Results suggest that the system performs much better with respect to machine efficiency when the initial number of parts released is at its maximum and the buffer size is at its minimum. Furthermore, considering the average flow time, the system performs much better when the selected dispatching rule is either Earliest Due Date (EDD) or Shortest Process Time (SPT) with a buffer size of five and an initial release of eight parts. Research limitations/implications: In this research, a limited number of factors and levels were considered for the experiment set-up; however, the flexibility of the model allows experimenting with additional factors and levels. In the simulation experiments of this research, three scheduling dispatching rules (First In/First Out (FIFO), EDD, SPT) were used. In future research, the effect of other dispatching rules on system performance can be compared, and some assumptions can be relaxed. Practical implications: This research helps to identify the potential effect of a selected number of dispatching rules and two other factors, the buffer size and the initial number of parts released, on the performance of existing CIM systems with different part types where the machines are the major resource constraints. Originality/value: This research is among the few to study the effect of dispatching rules on the performance of CIM systems with the use of terminating simulation analysis.
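
    For readers unfamiliar with the three rules compared above, the following sketch (illustrative job data, not from the study) shows how each rule reduces to a different sort key when the machine picks the next job from its queue.

        from dataclasses import dataclass

        @dataclass
        class Job:
            arrival: float     # time the job entered the queue
            due_date: float    # promised completion time
            proc_time: float   # processing time on the machine

        RULES = {
            "FIFO": lambda j: j.arrival,    # First In / First Out
            "EDD":  lambda j: j.due_date,   # Earliest Due Date
            "SPT":  lambda j: j.proc_time,  # Shortest Process Time
        }

        def next_job(queue, rule):
            """Job the machine should process next under the given rule."""
            return min(queue, key=RULES[rule])

        queue = [Job(0.0, 10.0, 4.0), Job(1.0, 6.0, 2.0), Job(2.0, 8.0, 1.0)]
        for rule in RULES:
            print(rule, next_job(queue, rule))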

  6. Event and effectiveness models for simulating computer security. [SECSIM code, in FORTRAN]

    Energy Technology Data Exchange (ETDEWEB)

    Schelonka, E.P.

    1978-01-01

    The development and application of a series of simulation codes (designated SECSIM) that are used for computer security analysis and design are described. Individual barrier characteristics are incorporated into generalized architectural reduction algorithms providing numerical indices in selected subcategories and for the system as a whole. 7 figures, 13 tables.

  7. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  8. Numerical Analysis of Multiscale Computations

    CERN Document Server

    Engquist, Björn; Tsai, Yen-Hsi R

    2012-01-01

    This book is a snapshot of current research in multiscale modeling, computations and applications. It covers fundamental mathematical theory, numerical algorithms and practical computational advice for analysing single and multiphysics models containing a variety of scales in time and space. Complex fluids, porous media flow and oscillatory dynamical systems are treated in some extra depth, as are tools like analytical and numerical homogenization and the fast multipole method.

  9. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging...... constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations...

  10. Computational Physics Simulation of Classical and Quantum Systems

    CERN Document Server

    Scherer, Philipp O. J

    2010-01-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills.

  11. OSL sensitivity changes during single aliquot procedures: Computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Agersnap Larsen, N.; Bøtter-Jensen, L.

    1997-01-01

    We present computer simulations of sensitivity changes obtained during single aliquot, regeneration procedures. The simulations indicate that the sensitivity changes are the combined result of shallow trap and deep trap effects. Four separate processes have been identified. Although procedures can...... be suggested to eliminate the shallow trap effects, it appears that the deep trap effects cannot be removed. The character of the sensitivity changes which result from these effects is seen to be dependent upon several external parameters, including the extent of bleaching of the OSL signal, the laboratory...

  12. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  13. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  14. Computer methods in electric network analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saver, P.; Hajj, I.; Pai, M.; Trick, T.

    1983-06-01

    The computational algorithms utilized in power system analysis have more than just a minor overlap with those used in electronic circuit computer aided design. This paper describes the computer methods that are common to both areas and highlights the differences in application through brief examples. Recognizing this commonality has stimulated the exchange of useful techniques in both areas and has the potential of fostering new approaches to electric network analysis through the interchange of ideas.

  15. Conference “Computational Analysis and Optimization” (CAO 2011)

    CERN Document Server

    Tiihonen, Timo; Tuovinen, Tero; Numerical Methods for Differential Equations, Optimization, and Technological Problems : Dedicated to Professor P. Neittaanmäki on His 60th Birthday

    2013-01-01

    This book contains the results in numerical analysis and optimization presented at the ECCOMAS thematic conference “Computational Analysis and Optimization” (CAO 2011) held in Jyväskylä, Finland, June 9–11, 2011. Both the conference and this volume are dedicated to Professor Pekka Neittaanmäki on the occasion of his sixtieth birthday. It consists of five parts that are closely related to his scientific activities and interests: Numerical Methods for Nonlinear Problems; Reliable Methods for Computer Simulation; Analysis of Noised and Uncertain Data; Optimization Methods; Mathematical Models Generated by Modern Technological Problems. The book also includes a short biography of Professor Neittaanmäki.

  16. Computationally efficient analysis procedure for frames with ...

    African Journals Online (AJOL)

    A computationally efficient analytical procedure that provides high-quality analysis results for two-dimensional skeletal structures with segmented (stepped) and linearly-tapered non-prismatic flexural members has been developed based on the stiffness method of structural analysis. A computer program coded in FORTRAN ...

  17. Molecular level properties of the free water surface and different organic liquid/water interfaces, as seen from ITIM analysis of computer simulation results.

    Science.gov (United States)

    Hantal, György; Darvas, Mária; Pártay, Lívia B; Horvai, George; Jedlovszky, Pál

    2010-07-21

    Molecular dynamics simulations of the interface of water with four different apolar phases, namely water vapour, liquid carbon tetrachloride, liquid dichloromethane (DCM) and liquid dichloroethane (DCE) are performed on the canonical ensemble at 298 K. The resulting configurations are analysed using the novel method of identification of the truly interfacial molecules (ITIM). Properties of the first three molecular layers of the liquid phases (e.g. width, spacing, roughness, extent of the in-layer hydrogen bonding network) as well as of the molecules constituting these layers (e.g., dynamics, orientation) are investigated in detail. In the analyses, particular attention is paid to the effect of the polarity of the non-aqueous phase and to the length scale of the effect of the vicinity of the interface on the various properties of the molecules. The obtained results show that increasing polarity of the non-aqueous phase leads to the narrowing of the interface, in spite of the fact that, at the same time, the truly interfacial layer of water gets somewhat broader. The influence of the nearby interface is found to extend only to the first molecular layer in many respects. This result is attributed to the larger space available for the truly interfacial than for the non-interfacial molecules (as the shapes of the two liquid surfaces are largely independent of each other, resulting in the presence of voids between the two phases), and to the fact that the hydrogen bonding interaction of the truly interfacial water molecules with other waters is hindered in the direction of the interface.

  18. Stochastic simulation algorithms and analysis

    CERN Document Server

    Asmussen, Soren

    2007-01-01

    Sampling-based computational methods have become a fundamental part of the numerical toolset of practitioners and researchers across an enormous number of different applied domains and academic disciplines. This book provides a broad treatment of such sampling-based methods, as well as accompanying mathematical analysis of the convergence properties of the methods discussed. The reach of the ideas is illustrated by discussing a wide range of applications and the models that have found wide usage. The first half of the book focuses on general methods; the second half discusses model-specific algorithms. Given the wide range of examples, exercises and applications, students, practitioners and researchers in probability, statistics, operations research, economics, finance, engineering as well as biology, chemistry and physics will find the book of value.

  19. How to simulate a universal quantum computer using negative probabilities

    Science.gov (United States)

    Hofmann, Holger F.

    2009-07-01

    The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
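
    A toy Monte Carlo, sketched below under the sampling scheme the abstract implies (the three gate operations themselves are omitted; only the sign bookkeeping is shown), makes the exponential cost concrete: each gate draws one of three equally likely branches with quasi-probabilities (1, 1, -1), so every sample carries a weight of magnitude 3 per gate, the signed average converges to 1, and the statistical spread grows as 3^n.

        import random

        def sample_weight(n_gates):
            """Signed weight of one randomly decomposed gate sequence."""
            w = 1.0
            for _ in range(n_gates):
                # quasi-probabilities (1, 1, -1): pick a branch uniformly,
                # reweight by the total absolute mass (3) and the branch sign
                sign = random.choice([+1, +1, -1])
                w *= 3 * sign
            return w

        n_samples, n_gates = 100_000, 5
        weights = [sample_weight(n_gates) for _ in range(n_samples)]
        # every sample has |w| = 3**n_gates = 243; the signed mean stays
        # near 1, so the relative statistical error grows with circuit size
        print(sum(weights) / n_samples)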

  20. Active adaptive sound control in a duct - A computer simulation

    Science.gov (United States)

    Burgess, J. C.

    1981-09-01

    A digital computer simulation of adaptive closed-loop control for a specific application (sound cancellation in a duct) is discussed. The principal element is an extension of Sondhi's adaptive echo canceler and Widrow's adaptive noise canceler from signal processing to control. Thus, the adaptive algorithm is based on the LMS gradient search method. The simulation demonstrates that one or more pure tones can be canceled down to the computer bit noise level (-120 dB). When additive white noise is present, pure tones can be canceled to at least 10 dB below the noise spectrum level for SNRs down to at least 0 dB. The underlying theory suggests that the algorithm allows tracking tones with amplitudes and frequencies that change more slowly with time than the adaptive filter adaptation rate. It also implies that the method can cancel narrow-band sound in the presence of spectrally overlapping broadband sound.
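
    The core of such a canceler is the LMS weight update. The sketch below (filter length, step size and signals are illustrative, and no acoustic path model is included) adapts an FIR filter so that its output cancels a delayed, scaled copy of a reference tone, mirroring the pure-tone cancellation result reported above.

        import numpy as np

        n, taps, mu = 5000, 32, 0.01
        ref = np.sin(2 * np.pi * 0.05 * np.arange(n))   # reference tone
        primary = 0.8 * np.roll(ref, 5)                 # tone at the error mic

        w = np.zeros(taps)                              # adaptive FIR weights
        err = np.zeros(n)
        for k in range(taps, n):
            x = ref[k - taps:k][::-1]     # most recent samples first
            y = w @ x                     # filter output (anti-sound)
            err[k] = primary[k] - y       # residual after cancellation
            w += 2 * mu * err[k] * x      # LMS gradient step

        print(np.abs(err[-500:]).max())   # residual tone driven toward zero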

  1. Molecular Dynamics Computer Simulations of Multidrug RND Efflux Pumps

    OpenAIRE

    Ruggerone, Paolo; Vargiu, Attilio V.; Collu, Francesca; Fischer, Nadine; Kandt, Christian

    2013-01-01

    Over-expression of multidrug efflux pumps of the Resistance Nodulation Division (RND) protein super family counts among the main causes for microbial resistance against pharmaceuticals. Understanding the molecular basis of this process is one of the major challenges of modern biomedical research, involving a broad range of experimental and computational techniques. Here we review the current state of RND transporter investigation employing molecular dynamics simulations providing conformation...

  2. Carburizer particle dissolution in liquid cast iron – computer simulation

    Directory of Open Access Journals (Sweden)

    D. Bartocha

    2010-01-01

    Full Text Available In the paper, the dissolution of carburizing material (anthracite, petroleum coke and graphite) particles in liquid metal and its computer simulation are presented. The relative movement rate of particle and liquid metal and the thermophysical properties of the carburizing materials (thermal conductivity coefficient, specific heat, thermal diffusivity, density) are taken into consideration in the calculations. Calculations have been carried out with regard to metal bath carburization in metallurgical furnaces.

  3. A computer-simulated Stern-Gerlach laboratory

    CERN Document Server

    Schroeder, Daniel V

    2015-01-01

    We describe an interactive computer program that simulates Stern-Gerlach measurements on spin-1/2 and spin-1 particles. The user can design and run experiments involving successive spin measurements, illustrating incompatible observables, interference, and time evolution. The program can be used by students at a variety of levels, from non-science majors in a general interest course to physics majors in an upper-level quantum mechanics course. We give suggested homework exercises using the program at various levels.
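
    A few lines of Python reproduce the statistics of the simplest experiment such a program presents (the angle and sample count below are illustrative): a beam prepared spin-up along z enters an analyzer tilted by theta, and each particle exits +1 with probability cos^2(theta/2).

        import numpy as np

        rng = np.random.default_rng(7)

        def measure(theta, n):
            """+1/-1 outcomes for n spin-1/2 particles prepared up along z,
            measured along an axis tilted by theta from z."""
            p_up = np.cos(theta / 2.0) ** 2
            return np.where(rng.random(n) < p_up, +1, -1)

        outcomes = measure(np.pi / 2, 10_000)   # z-up beam into an x analyzer
        print((outcomes == +1).mean())          # ~0.5, as quantum mechanics predicts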

  4. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations of 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation to be loaded using the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of the original size) after image processing by ParaView and increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Controlling the magnification, transfer, rotation, and selection of translucence in 10 levels of each element was smoothly and easily performed using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average by 12 medical students and 6 neurosurgical residents. Using an iPad to handle the results of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.

  5. Computer Simulation of Intergranular Stress Corrosion Cracking via Hydrogen Embrittlement

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.W.

    2000-04-01

    Computer simulation has been applied to the investigation of intergranular stress corrosion cracking in Ni-based alloys based on a hydrogen embrittlement mechanism. The simulation employs computational modules that address (a) transport and reactions of aqueous species giving rise to hydrogen generation at the liquid-metal interface, (b) solid state transport of hydrogen via intergranular and transgranular diffusion pathways, and (c) fracture due to the embrittlement of metallic bonds by hydrogen. A key focus of the computational model development has been the role of materials microstructure (precipitate particles and grain boundaries) on hydrogen transport and embrittlement. Simulation results reveal that intergranular fracture is enhanced as grain boundaries are weakened and that microstructures with grains elongated perpendicular to the stress axis are more susceptible to cracking. The presence of intergranular precipitates may be expected to either enhance or impede cracking depending on the relative distribution of hydrogen between the grain boundaries and the precipitate-matrix interfaces. Calculations of hydrogen outgassing and ingassing demonstrate a strong effect of charging method on the fracture behavior.

  6. Adjustment computations spatial data analysis

    CERN Document Server

    Ghilani, Charles D

    2011-01-01

    The complete guide to adjusting for measurement error, expanded and updated. No measurement is ever exact. Adjustment Computations updates a classic, definitive text on surveying with the latest methodologies and tools for analyzing and adjusting errors, with a focus on least squares adjustments, the most rigorous methodology available and the one on which accuracy standards for surveys are based. This extensively updated Fifth Edition shares new information on advances in modern software and GNSS-acquired data. Expanded sections offer a greater number of computable problems and their worked solutions.

  7. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
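
    A usage sketch for the radial distribution function mentioned above, assuming the freud 2.x Python API (freud.box.Box, freud.density.RDF); the random point cloud is only a stand-in for real trajectory data.

        import freud
        import numpy as np

        L = 10.0
        box = freud.box.Box.cube(L)
        rng = np.random.default_rng(1)
        points = rng.uniform(-L / 2, L / 2, size=(5000, 3)).astype(np.float32)

        rdf = freud.density.RDF(bins=100, r_max=4.0)
        rdf.compute(system=(box, points))
        print(rdf.bin_centers[:5], rdf.rdf[:5])   # g(r) ~ 1 for an ideal gas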

  8. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    Full Text Available This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders composed of 1D finite elements, using the computer-aided analysis applications Tower and Bridge Designer 2016 (2nd Edition). The computer simulations compare the effects of moving load obtained according to the recommendations of two standards, SRPS and AASHTO. A variant of the bridge structure modelling available in Bridge Designer 2016 (2nd Edition) was modelled identically in the Tower environment. An important point for the selection of a computer application is that Bridge Designer 2016 (2nd Edition) is not able to treat the moving-load model required by the national standard V600.

  9. COMPUTER SIMULATION THE MECHANICAL MOVEMENT BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    Full Text Available This article considers a technique for using the computer mathematics system MathCAD for the computer implementation of a mathematical model of the mechanical motion of a physical body thrown at an angle to the horizon, and its use in educational computer simulation experiments when teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models in the second stage of higher education are noted. The article describes the creation of a computer simulation model that allows a comprehensive analysis of the mechanical movement of the body as the input parameters of the model are varied: the acceleration of gravity, the initial and final position of the body, the initial velocity and angle, and the geometric dimensions of the body and target. The technique is aimed at the effective assimilation of basic knowledge and skills on the fundamentals of mathematical modeling; it helps students better master the basic theoretical principles of mathematical modeling and related disciplines, develops their logical thinking, motivates them to learn the discipline, improves cognitive interest, and builds research skills, thereby creating conditions for the effective formation of the professional competence of future specialists.
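
    The same model is equally easy to sketch outside MathCAD; the short Python version below (parameter values are illustrative) computes the trajectory of a body thrown at an angle to the horizon from the standard constant-gravity equations.

        import numpy as np

        g = 9.81                            # gravitational acceleration, m/s^2
        v0, theta = 20.0, np.radians(45.0)  # initial speed and launch angle

        t_flight = 2 * v0 * np.sin(theta) / g
        t = np.linspace(0.0, t_flight, 101)
        x = v0 * np.cos(theta) * t
        y = v0 * np.sin(theta) * t - 0.5 * g * t**2

        print(f"range = {x[-1]:.2f} m, apex = {y.max():.2f} m")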

  10. An FPGA computing demo core for space charge simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. The temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
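
    A software sketch of the table-based evaluation described above: the factor 1/r^3 is read from a coarse lookup table indexed by the leading bits of the squared distance, as on the FPGA. The table size, scaling, and test vector are illustrative, not the core's actual fixed-point layout.

        import numpy as np

        TABLE_BITS = 10
        R2_MAX = 1.0                       # covered range of squared distance
        idx = np.arange(1, 2**TABLE_BITS + 1)
        table = (idx / 2**TABLE_BITS * R2_MAX) ** -1.5   # precomputed 1/r^3

        def pair_force(dr):
            """Coulomb-like force on one particle of a pair via table lookup."""
            r2 = float(np.dot(dr, dr))
            i = min(int(r2 / R2_MAX * 2**TABLE_BITS), 2**TABLE_BITS - 1)
            return dr * table[i]

        print(pair_force(np.array([0.3, 0.4, 0.0])))   # exact answer: dr * 8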

  11. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces to the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for University courses of different level as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  12. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  13. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Tryggvason, T.

    1998-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance simulation program requires a detailed description of the energy flow in the air movement, which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three zones connected by open areas with pressure- and buoyancy-driven air flow. The two programs are interconnected in an iterative procedure. The paper also shows an evaluation of the air quality in the main area of the building based on CFD predictions. It is shown that an interconnection between a CFD program and a building energy performance simulation program will improve both the energy consumption data and the prediction of thermal comfort and air quality in a selected area of the building.

  14. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2017-01-01

    This textbook presents basic numerical methods and applies them to a large variety of physical models in multiple computer experiments. Classical algorithms and more recent methods are explained. Partial differential equations are treated generally comparing important methods, and equations of motion are solved by a large number of simple as well as more sophisticated methods. Several modern algorithms for quantum wavepacket motion are compared. The first part of the book discusses the basic numerical methods, while the second part simulates classical and quantum systems. Simple but non-trivial examples from a broad range of physical topics offer readers insights into the numerical treatment but also the simulated problems. Rotational motion is studied in detail, as are simple quantum systems. A two-level system in an external field demonstrates elementary principles from quantum optics and simulation of a quantum bit. Principles of molecular dynamics are shown. Modern boundary element methods are presented ...

  15. Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers

    Science.gov (United States)

    Morgan, Philip E.

    2004-01-01

    This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications." The discussion of Scalable High Performance Computing reports on three objectives: validate, assess scalability, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhancement of an electromagnetics code (CHARGE) to effectively model antenna problems; application of lessons learned in the high-order/spectral solution of swirling 3D jets to the electromagnetics project; transition of a high-order fluids code, FDL3DI, to solve Maxwell's equations using compact differencing; development and demonstration of improved radiation-absorbing boundary conditions for high-order CEM; and extension of the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.

  16. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the use of Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first is retrospective: the costs of poor quality and of the production process are calculated from historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and provides a prospective view of the costs of poor quality. Simulation output in the form of tornado and sensitivity charts complements the risk analysis.
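
    A minimal sketch of the prospective approach (all distributions and parameters below are illustrative, not taken from the case study): random defect rates and unit repair costs are propagated to a distribution of poor-quality cost, from which summary statistics for risk analysis follow.

        import numpy as np

        rng = np.random.default_rng(42)
        n_trials = 100_000
        volume = 10_000                      # units produced per period

        defect_rate = rng.beta(2, 98, n_trials)            # ~2% defective
        repair_cost = rng.triangular(8, 10, 15, n_trials)  # cost per defect

        cost = volume * defect_rate * repair_cost          # cost of poor quality
        print(f"mean = {cost.mean():.0f}, "
              f"95th percentile = {np.percentile(cost, 95):.0f}")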

  17. Computer simulation for risk management: Hydrogen refueling stations and water supply of a large region

    DEFF Research Database (Denmark)

    Markert, Frank; Kozine, Igor

    2012-01-01

    Risk management of complex environments needs the supportive tools provided by computer models and simulation. Over time, various tools have been developed and applied with different degrees of success. The still-lasting increase in computer power and the associated development potentials stimulate and promote their application within risk management. Today, computer-supported models such as fault trees, event trees and Bayesian networks are commonly regarded and applied as standard tools for reliability and risk practitioners. There are, though, some important features that can hardly be captured by the conventional reliability analysis models and systems analysis methods. An improvement and alternative to the conventional approach is seen in using Discrete Event Simulation (DES) models that can better account for the dynamic dimensions of the systems. The paper will describe the authors' experience......

  18. Computer simulation of the fire-tube boiler hydrodynamics

    Directory of Open Access Journals (Sweden)

    Khaustov Sergei A.

    2015-01-01

    Full Text Available The finite element method was used to simulate the hydrodynamics of a fire-tube boiler with the ANSYS Fluent 12.1.4 engineering simulation software. The hydrodynamic structure and volumetric temperature distribution were calculated. The results are presented in graphical form. A complete geometric model of the fire-tube boiler based on boiler drawings was considered. The obtained results are suitable for qualitative analysis of the hydrodynamics and for the identification of singularities in the fire-tube boiler water shell.

  19. The Application of Simulation in Large Energy System Analysis

    Directory of Open Access Journals (Sweden)

    S.M. Divakaruni

    1985-10-01

    Full Text Available The Modular Modeling System (MMS developed by the Electric Power Research Institute (EPRI provides an efficient, economical, and user friendly computer code to engineers involved in the analysis of nuclear and fossil power plants. MMS will complement existing codes in the areas of nuclear and fossil power plant systems simulation. This paper provides a synopsis of MMS code features, development objectives, usage and results of fossil and nuclear plant simulation.

  20. The Application of Simulation in Large Energy System Analysis

    OpenAIRE

    S.M. Divakaruni

    1985-01-01

    The Modular Modeling System (MMS) developed by the Electric Power Research Institute (EPRI) provides an efficient, economical, and user friendly computer code to engineers involved in the analysis of nuclear and fossil power plants. MMS will complement existing codes in the areas of nuclear and fossil power plant systems simulation. This paper provides a synopsis of MMS code features, development objectives, usage and results of fossil and nuclear plant simulation.

  1. Explicit contact modeling for surgical computer guidance and simulation

    Science.gov (United States)

    Johnsen, S. F.; Taylor, Z. A.; Clarkson, M.; Thompson, S.; Hu, M.; Gurusamy, K.; Davidson, B.; Hawkes, D. J.; Ourselin, S.

    2012-02-01

    Realistic modelling of mechanical interactions between tissues is an important part of surgical simulation, and may become a valuable asset in surgical computer guidance. Unfortunately, it is also computationally very demanding. Explicit matrix-free FEM solvers have been shown to be a good choice for fast tissue simulation, however little work has been done on contact algorithms for such FEM solvers. This work introduces such an algorithm that is capable of handling both deformable-deformable (soft-tissue interacting with soft-tissue) and deformable-rigid (e.g. soft-tissue interacting with surgical instruments) contacts. The proposed algorithm employs responses computed with a fully matrix-free, virtual node-based version of the model first used by Taylor and Flanagan in PRONTO3D. For contact detection, a bounding-volume hierarchy (BVH) capable of identifying self collisions is introduced. The proposed BVH generation and update strategies comprise novel heuristics to minimise the number of bounding volumes visited in hierarchy update and collision detection. Aside from speed, stability was a major objective in the development of the algorithm, hence a novel method for computation of response forces from C0-continuous normals, and a gradual application of response forces from rate constraints has been devised and incorporated in the scheme. The continuity of the surface normals has advantages particularly in applications such as sliding over irregular surfaces, which occurs, e.g., in simulated breathing. The effectiveness of the scheme is demonstrated on a number of meshes derived from medical image data and artificial test cases.
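
    Collision detection in such a scheme ultimately rests on cheap bounding-volume tests. The sketch below (axis-aligned boxes for simplicity; the paper's BVH and self-collision heuristics are more involved) shows the overlap predicate a traversal would call at every node pair.

        import numpy as np

        def aabb_overlap(a_min, a_max, b_min, b_max):
            """True if two axis-aligned bounding boxes intersect."""
            return bool(np.all(a_min <= b_max) and np.all(b_min <= a_max))

        a = (np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 1.0]))
        b = (np.array([0.5, 0.5, 0.5]), np.array([2.0, 2.0, 2.0]))
        print(aabb_overlap(*a, *b))   # True: the boxes share a corner region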

  2. Improved three-dimensional nonlinear computer simulation for TWTs

    CERN Document Server

    Xu, Lin; Mo Yuan Long

    1999-01-01

    The paper covers 3D nonlinear analysis for TWTs. Based on a macro-particle model, the electron beam can be subdivided into 3D macro-particles to calculate space-charge forces using Green's function methods, and 3D large-signal working equations are obtained. The numerical results for a uniform magnetic focusing field indicate that, in 3D numerical analysis, 3D space-charge forces can be substituted by 2D forces with little influence on the numerical results, which greatly decreases computing time so that a 3D computer program can be easily used. (7 refs).

  3. Modelling and simulation of information systems on computer: methodological advantages.

    Science.gov (United States)

    Huet, B; Martin, J

    1980-01-01

    Modelling and simulation of information systems by means of miniatures on computer aim at two general objectives: (a) an aid to the design and realization of information systems; and (b) a tool to improve the dialogue between the designer and the users. An operational information system has two components bound by a dynamic relationship, an information system and a behavioural system. Thanks to the behavioural system, modelling and simulation allow the designer to integrate into the projects a large proportion of the system's implicit specification. The advantages of modelling to the information system relate to: (a) The conceptual phase: initial objectives are compared with the results of simulation and sometimes modified. (b) The external specifications: simulation is particularly useful for personalising man-machine relationships in each application. (c) The internal specifications: if the miniatures are built on the concept of process, the global design and the software are tested, and the simulation also refines the configuration and directs the choice of hardware. (d) The implementation: simulation reduces costs and time and allows testing. Progress in modelling techniques will undoubtedly lead to better information systems.

  4. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    Science.gov (United States)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture-mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing their functions, algorithms, the HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.

  5. Computer applications for engineering/structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

    Analysts and organizations have a tendency to lock themselves into specific codes, with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil-structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid-structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconducting supercollider, which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  6. Cluster analysis for computer workload evaluation

    CERN Document Server

    Landau, K

    1976-01-01

    An introduction to computer workload analysis is given, showing its range of application in computer centre management, system and application programming. Cluster methods are discussed which can be used in conjunction with workload data, and cluster algorithms are adapted to the specific problem. Several samples of CDC 7600 accounting data, collected at CERN, the European Organization for Nuclear Research, underwent a cluster analysis to determine job groups. The conclusions from the resource usage of typical job groups in relation to computer workload analysis are discussed. (17 refs).

  7. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  8. Numerical simulation of NQR/NMR: Applications in quantum computing.

    Science.gov (United States)

    Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C

    2011-04-01

    A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package, aiming especially applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, being useful in a wide range of experimental situations, going from NQR (at zero or under small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of use of elliptically polarized radiofrequency and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with the proposal of experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php. Copyright © 2011 Elsevier Inc. All rights reserved.
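
    As a flavor of the propagator arithmetic such a simulator performs, the two-level Python sketch below (pulse parameters are illustrative, and none of the Mathematica package's machinery is reproduced) applies a resonant RF pulse to a spin-1/2 and recovers the expected pi/2-pulse populations.

        import numpy as np
        from scipy.linalg import expm

        sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)   # spin-1/2 Sx
        omega1, t = 2 * np.pi * 25e3, 10e-6    # 25 kHz nutation, 10 us pulse

        U = expm(-1j * omega1 * t * sx)        # propagator of the RF pulse
        psi = U @ np.array([1, 0], dtype=complex)
        print(np.abs(psi) ** 2)                # populations ~ [0.5, 0.5]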

  9. Computer simulation of methanol exchange dynamics around cations and anions

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Santanu; Dang, Liem X.

    2016-03-03

    In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties than water, so its ion solvation process is different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing the transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and the Grote-Hynes theory. We observed that methanol exchange occurs at a nanosecond time scale for Na+ and at a picosecond time scale for other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.

  10. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

    2004-04-28

    This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

  11. Reduction of artifacts in computer simulation of breast Cooper's ligaments

    Science.gov (United States)

    Pokrajac, David D.; Kuperavage, Adam; Maidment, Andrew D. A.; Bakic, Predrag R.

    2016-03-01

    Anthropomorphic software breast phantoms have been introduced as a tool for quantitative validation of breast imaging systems. The efficacy of the validation results depends on the realism of the phantom images. The recursive partitioning algorithm based upon octree simulation has been demonstrated to be versatile and capable of efficiently generating large numbers of phantoms to support virtual clinical trials of breast imaging. Previously, we have observed specific artifacts (here labeled "dents") on the boundaries of simulated Cooper's ligaments. In this work, we demonstrate that these dents result from the approximate determination of the closest simulated ligament to an examined subvolume (i.e., octree node) of the phantom. We propose a modification of the algorithm that determines the closest ligament by considering a pre-specified number of neighboring ligaments, selected based upon the functions that govern the shape of the ligaments simulated in the subvolume. We have qualitatively and quantitatively demonstrated that the modified algorithm can eliminate or reduce dent artifacts in software phantoms. In a proof-of-concept example, we simulated a 450 ml phantom with 333 compartments at 100 micrometer resolution. After the proposed modification, we corrected 148,105 dents, with an average size of 5.27 voxels (5.27 nl). We have also qualitatively analyzed the corresponding improvement in the appearance of simulated mammographic images. The proposed algorithm leads to a reduction of the linear and star-like artifacts in simulated phantom projections that can be attributed to dents. Analysis of a larger number of phantoms is ongoing.
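
    The essence of the proposed fix can be conveyed with a toy calculation: label the voxels of an octree node not by the single seed closest to the node centre, but by re-testing each voxel against several neighboring candidates. In the sketch below, plain Euclidean distances to random seed points stand in for the ligament shape functions of the actual algorithm; all names and numbers are assumptions.

    ```python
    # Schematic illustration of the dent fix: compare single-candidate labeling
    # (the old shortcut) against labeling with k nearest candidate seeds.
    import numpy as np

    rng = np.random.default_rng(1)
    seeds = rng.uniform(0, 1, size=(30, 3))       # compartment "seed" points

    def label_voxels(voxels, node_center, k):
        d_center = np.linalg.norm(seeds - node_center, axis=1)
        cand = np.argsort(d_center)[:k]           # k candidate compartments
        d = np.linalg.norm(voxels[:, None, :] - seeds[cand][None, :, :], axis=2)
        return cand[np.argmin(d, axis=1)]

    # a small octree node and its voxels
    g = np.linspace(0.45, 0.55, 5)
    voxels = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
    center = voxels.mean(axis=0)

    naive = label_voxels(voxels, center, k=1)     # one candidate -> "dents"
    fixed = label_voxels(voxels, center, k=5)     # several neighbors considered
    print("voxels relabeled by the fix:", int((naive != fixed).sum()))
    ```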

  12. Computer simulation of rapid crystal growth under microgravity

    Science.gov (United States)

    Hisada, Yasuhiro; Saito, Osami; Mitachi, Koshi; Nishinaga, Tatau

    We are planning to grow a Ge single crystal under microgravity on the TR-IA rocket in 1992. The furnace temperature should be controlled so as to finish the crystal growth in a quite short time interval (about 6 min). This study deals with the computer simulation of rapid crystal growth in space to find the proper conditions for the experiment. The crystal growth process is influenced by various physical phenomena such as heat conduction, natural and Marangoni convection, phase change, and radiation from the furnace. In this study, a 2D simulation with axial symmetry is carried out, taking into account the radiation field with a specific temperature distribution on the furnace wall. The simulation program consists of four modules. The first module calculates the parabolic partial differential equations using the control-volume method. The second evaluates the phase change implicitly by the enthalpy method. The third computes the heat flux from the surface by radiation. The last calculates, with the Monte Carlo method, the view factors that are necessary to obtain the radiative heat flux.
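
    The Monte Carlo view-factor module mentioned last is the most self-contained piece, and the idea fits in a few lines. As a hedged illustration (geometry and sample count chosen arbitrarily, unrelated to the furnace in the study), the following sketch estimates the view factor between two parallel, coaxial unit squares by sampling the double-area integral:

    ```python
    # Monte Carlo estimate of F12 = (1/A1) * integral of cos1*cos2/(pi*s^2) dA1 dA2
    # for two parallel, coaxial unit squares a distance h apart.
    import numpy as np

    rng = np.random.default_rng(2)
    h, n = 1.0, 200_000
    p1 = np.column_stack([rng.uniform(-0.5, 0.5, n),
                          rng.uniform(-0.5, 0.5, n), np.zeros(n)])
    p2 = np.column_stack([rng.uniform(-0.5, 0.5, n),
                          rng.uniform(-0.5, 0.5, n), np.full(n, h)])

    r = p2 - p1
    s2 = (r ** 2).sum(axis=1)
    cos1 = r[:, 2] / np.sqrt(s2)            # angle to surface-1 normal (+z)
    cos2 = cos1                             # surfaces are parallel, normals opposed
    A2 = 1.0
    F12 = np.mean(cos1 * cos2 / (np.pi * s2)) * A2
    print(f"estimated F12 = {F12:.4f}")     # analytic value at h = 1 is ~0.1998
    ```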

  13. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-04-25

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained-flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane-based water-gas shift reactor; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software, which supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease of use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE-sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

  14. Computable Analysis with Applications to Dynamic Systems

    NARCIS (Netherlands)

    P.J. Collins (Pieter)

    2010-01-01

    In this article we develop a theory of computation for continuous mathematics. The theory is based on earlier developments of computable analysis, especially that of the school of Weihrauch, and is presented as a model of intuitionistic type theory. Every effort has been made to keep the

  15. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (the diagnostic track), while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation with respect to access time, overtime and idle time of the CT; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic-track duration was 12.6 days, with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44%, while utilization remained constant at 82%; idle time increased by 11% and overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased awareness that optimizing capacity allocation can reduce access times.
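
    The study's simulation model is not reproduced here, but the flavor of such a discrete-event analysis can be conveyed with a toy model built on the SimPy library: urgent and regular CT requests compete for one scanner and access times are tracked. All rates and durations below are invented, not taken from the study.

    ```python
    # Toy discrete-event model of CT access times (SimPy; all numbers invented).
    import simpy, random

    random.seed(3)
    WAITS = {"urgent": [], "regular": []}

    def patient(env, kind, scanner):
        arrival = env.now
        with scanner.request(priority=0 if kind == "urgent" else 1) as req:
            yield req
            WAITS[kind].append(env.now - arrival)   # access time in days
            yield env.timeout(1 / 20)               # scan occupies 1/20 of a day

    def arrivals(env, scanner):
        while True:
            yield env.timeout(random.expovariate(18))          # ~18 requests/day
            kind = "urgent" if random.random() < 0.3 else "regular"
            env.process(patient(env, kind, scanner))

    env = simpy.Environment()
    scanner = simpy.PriorityResource(env, capacity=1)
    env.process(arrivals(env, scanner))
    env.run(until=200)                                         # simulate 200 days
    for k, w in WAITS.items():
        print(f"{k}: mean access time {sum(w)/len(w):.2f} days ({len(w)} patients)")
    ```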

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. Computational analysis of heat flow in computer casing

    Science.gov (United States)

    Nor Azwadi, C. S.; Goh, C. K.; Afiq Witri, M. Y.

    2012-06-01

    The reliability of a computer system is directly related to its thermal management, because poor thermal management leads to high temperatures throughout the hardware components, resulting in poor performance and a reduced fatigue life of the package. Therefore, good cooling solutions (heat sink, fan) and a proper form factor design (expandability, interchangeability of parts) are necessary to provide good thermal management in a computer system. The performance of Advanced Technology Extended (ATX) and its proposed successor, Balanced Technology Extended (BTX), were compared to investigate the aforementioned factors. Simulations were conducted using ANSYS software. Results obtained from the simulations were compared with datasheet values obtained from manufacturers for validation purposes, and it was discovered that there are more chaotic regions in the flow profile for the ATX form factor. In contrast, the BTX form factor yields a straighter flow profile. Based on these results, we conclude that the BTX form factor has better cooling capability than its predecessor ATX, owing to the improvements made in the BTX layout. This change enables the BTX form factor to be used with more advanced components that dissipate larger amounts of heat, and also improves the acoustic performance of BTX by reducing the number of fans needed to just one unit.

  18. ANALYSIS AND COMPUTER SIMULATION OF A NATURAL ...

    African Journals Online (AJOL)

    A model has been developed to predict the outlet air temperature and air flow rate from a solar collector based on the theory of thermal buoyancy. A high capacitance solar collector directly coupled to an animal building absorbs solar radiation, which heats up air and forces entry into the building by convection. In order to ...

  19. Computer-simulated fluid dynamics of arterial perfusion in extracorporeal circulation: From reality to virtual simulation.

    Science.gov (United States)

    Fukuda, Ikuo; Osanai, Satoshi; Shirota, Minori; Inamura, Takao; Yanaoka, Hideki; Minakawa, Masahito; Fukui, Kozo

    2009-06-01

    Atheroembolism due to aortic manipulation remains an unsolved problem in surgery for thoracic aortic aneurysm. The goal of the present study is to create a computer simulation (CS) model with which to analyze blood flow in the diseased aorta. A three-dimensional glass model of the aortic arch was constructed from CT images of a normal, healthy person and of a patient with a transverse aortic arch aneurysm. Separately, a CS model of the curved end-hole cannula was created, and flow from the aortic cannula was recreated using numerical simulation. Comparison with the data obtained by the glass-model analyses revealed that the flow velocity and the vector of the flow around the exit of the cannula were similar to those in the CS model. A high-velocity area was observed around the cannula exit in both the glass model and the CS model. The maximum flow velocity was as large as 1.0 m/s at 20 mm from the cannula exit and remained as large as 0.5 to 0.6 m/s within 50 mm of the exit. In the aortic arch aneurysm models, the rapid jet flow from the cannula moved straight toward the lesser curvature of the transverse aortic arch. The locations and intensities of the calculated vortices were slightly different from those obtained for the glass model. The proposed CS method for the analysis of blood flow from aortic cannulae during extracorporeal circulation can reproduce the flow velocity and flow pattern in the proximal and transverse aortic arches.

  20. Computational methods for corpus annotation and analysis

    CERN Document Server

    Lu, Xiaofei

    2014-01-01

    This book reviews computational tools for lexical, syntactic, semantic, pragmatic and discourse analysis, with instructions on how to obtain, install and use each tool. Covers studies using Natural Language Processing, and offers ideas for better integration.

  1. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  2. Computational framework for simulating fluorescence microscope images with cell populations.

    Science.gov (United States)

    Lehmussola, Antti; Ruusuvuori, Pekka; Selinummi, Jyrki; Huttunen, Heikki; Yli-Harja, Olli

    2007-07-01

    Fluorescence microscopy combined with digital imaging provides a basic platform for numerous biomedical studies in the field of cellular imaging. As studies relying on the analysis of digital images have become popular, the validation of image processing methods used in automated image cytometry has become an important topic. In particular, the need for efficient validation has arisen from emerging high-throughput microscopy systems where manual validation is impractical. We present a simulation platform for generating synthetic images of fluorescence-stained cell populations with realistic properties. Moreover, we show that the synthetic images enable the validation of analysis methods for automated image cytometry and comparison of their performance. Finally, we suggest additional usage scenarios for the simulator. The presented simulation framework, with several user-controllable parameters, forms a versatile tool for many kinds of validation tasks, and is freely available at http://www.cs.tut.fi/sgn/csb/simcep.
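
    The SIMCEP tool itself is available at the link above; the following is merely a minimal sketch of the general idea (not the authors' code): render Gaussian-blob "cells" at known positions, add a background term, and corrupt the image with photon noise, so segmentation output can later be checked against known ground truth.

    ```python
    # Minimal synthetic fluorescence image: Gaussian "cells" + background + shot noise.
    import numpy as np

    rng = np.random.default_rng(4)
    size, ncells = 256, 25
    yy, xx = np.mgrid[0:size, 0:size]
    img = np.zeros((size, size))
    centers = []
    for _ in range(ncells):
        cx, cy = rng.uniform(20, size - 20, 2)
        centers.append((cx, cy))                    # ground truth for validation
        sigma = rng.uniform(3, 7)                   # cell radius (px)
        img += rng.uniform(0.5, 1.0) * np.exp(
            -((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

    img = img + 0.05                                # uneven-illumination/background term
    noisy = rng.poisson(img * 500) / 500.0          # photon (shot) noise
    print(noisy.shape, f"max intensity {noisy.max():.2f}, {len(centers)} true cells")
    ```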

  3. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  4. Electron wave collimation by conical horns : computer simulation

    NARCIS (Netherlands)

    Michielsen, K.; de Raedt, H.

    1991-01-01

    Results are presented of extensive numerical simulations of electron wave packets transmitted by horns. A detailed quantitative analysis is given of the collimation of the electron wave by horn-like devices. It is demonstrated that the electron wave collimation effect cannot be described in terms of

  5. Computer simulation of confined and flexoelectric liquid crystalline systems

    CERN Document Server

    Barmes, F

    2003-01-01

    In this Thesis, systems of confined and flexoelectric liquid crystal systems have been studied using molecular computer simulations. The aim of this work was to provide a molecular model of a bistable display cell in which switching is induced through the application of directional electric field pulses. In the first part of this Thesis, the study of confined systems of liquid crystalline particles has been addressed. Computation of the anchoring phase diagrams for three different surface interaction models showed that the hard needle wall and rod-surface potentials induce both planar and homeotropic alignment separated by a bistability region, this being stronger and wider for the rod-surface variant. The results obtained using the rod-sphere surface model, in contrast, showed that tilted surface arrangements can be induced by surface absorption mechanisms. Equivalent studies of hybrid anchored systems showed that a bend director structure can be obtained in a slab with monostable homeotropic anchoring at the...

  6. Subglacial sediment mechanics investigated by computer simulation of granular material

    DEFF Research Database (Denmark)

    Damsgaard, Anders; Egholm, David Lundbek; Tulaczyk, Slawek

    [...] to the mechanical nonlinearity of the sediment, internal porosity changes during deformation, and associated structural and kinematic phase transitions. In this presentation, we introduce the Discrete Element Method (DEM) for particle-scale granular simulation. The DEM is fully coupled with fluid dynamics. [...] The numerical method is applied to better understand the mechanical properties of the subglacial sediment and its interaction with meltwater. The computational approach allows full experimental control and offers insights into the internal kinematics, stress distribution, and mechanical stability. During [...] by linear-viscous sediment movement. We demonstrate how channel flanks are stabilized by the sediment frictional strength. Additionally, sediment liquefaction proves to be a possible mechanism for causing large and episodic sediment transport by water flow. Though computationally intense, our coupled [...]
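
    As a minimal, self-contained illustration of the DEM ingredient described above (not the authors' coupled code), the sketch below integrates a linear spring-dashpot normal contact between two grains with velocity-Verlet; all parameter values are invented.

    ```python
    # Linear spring-dashpot normal contact between two grains along one axis.
    import numpy as np

    kn, gamma_n = 1e4, 5.0          # normal stiffness (N/m) and damping (kg/s)
    r, m = 0.01, 0.05               # grain radius (m) and mass (kg)
    x = np.array([0.0, 0.025])      # centre positions (m), initially out of contact
    v = np.array([0.5, -0.5])       # approaching velocities (m/s)
    dt = 1e-5

    def contact_force(x, v):
        overlap = 2 * r - (x[1] - x[0])
        if overlap <= 0.0:
            return 0.0                             # grains not touching
        vn = v[1] - v[0]                           # relative normal velocity
        return kn * overlap - gamma_n * vn         # elastic repulsion + dissipation

    for step in range(4000):                       # velocity-Verlet integration
        f = contact_force(x, v)
        a = np.array([-f, f]) / m                  # equal and opposite on the pair
        v_half = v + 0.5 * dt * a
        x = x + dt * v_half
        f_new = contact_force(x, v_half)
        v = v_half + 0.5 * dt * np.array([-f_new, f_new]) / m
    print("post-collision velocities:", v)         # |v| < 0.5 m/s: energy dissipated
    ```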

  7. Computational strategies in the dynamic simulation of constrained flexible MBS

    Science.gov (United States)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    This research focuses on the computational dynamics of flexible constrained multibody systems. First, a recursive mapping formulation of the kinematical expressions in a minimum dimension, as well as the matrix representation of the equations of motion, is presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high-temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time-variant constraint relations for rolling/contact conditions between two flexible bodies are also studied, as are the constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems, making use of time-variant boundary conditions. The methodologies and computational procedures developed are being implemented in a program called DYAMUS.

  8. Approximate Bayesian computation methods for daily spatiotemporal precipitation occurrence simulation

    Science.gov (United States)

    Olson, Branden; Kleiber, William

    2017-04-01

    Stochastic precipitation generators (SPGs) produce synthetic precipitation data and are frequently used to generate inputs for physical models throughout many scientific disciplines. Especially for large data sets, statistical parameter estimation is difficult due to the high dimensionality of the likelihood function. We propose techniques to estimate SPG parameters for spatiotemporal precipitation occurrence based on an emerging set of methods called Approximate Bayesian computation (ABC), which bypass the evaluation of a likelihood function. Our statistical model employs a thresholded Gaussian process that reduces to a probit regression at single sites. We identify appropriate ABC penalization metrics for our model parameters to produce simulations whose statistical characteristics closely resemble those of the observations. Spell length metrics are appropriate for single sites, while a variogram-based metric is proposed for spatial simulations. We present numerical case studies at sites in Colorado and Iowa where the estimated statistical model adequately reproduces local and domain statistics.
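
    A hedged single-site toy version of ABC rejection conveys the idea: draw the wet-day probability from its prior, run the stochastic generator, and accept the draw only if a spell-length summary statistic lands close to the observed one. Everything below (record length, tolerance, statistic) is an invented simplification of the paper's spatiotemporal setup.

    ```python
    # ABC rejection for a Bernoulli wet/dry occurrence model with a
    # mean-wet-spell summary statistic in place of a likelihood.
    import numpy as np

    rng = np.random.default_rng(5)

    def mean_wet_spell(wet):
        """Average length of consecutive wet-day runs in a boolean sequence."""
        padded = np.concatenate(([0], wet.astype(np.int8), [0]))
        starts = np.flatnonzero(np.diff(padded) == 1)
        ends = np.flatnonzero(np.diff(padded) == -1)
        return (ends - starts).mean() if starts.size else 0.0

    obs = rng.random(1000) < 0.3                     # "observed" record, true p = 0.3
    s_obs = mean_wet_spell(obs)

    accepted = []
    for _ in range(5000):
        p = rng.uniform(0, 1)                        # draw a candidate from the prior
        sim = rng.random(1000) < p                   # run the stochastic generator
        if abs(mean_wet_spell(sim) - s_obs) < 0.05:  # penalization metric + tolerance
            accepted.append(p)
    print(f"posterior mean p = {np.mean(accepted):.3f} ({len(accepted)} accepted)")
    ```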

  9. Computer simulation of aqueous Na-Cl electrolytes

    Energy Technology Data Exchange (ETDEWEB)

    Hummer, G. [Los Alamos National Lab., NM (United States); Soumpasis, D.M. [Max-Planck-Institut fuer Biophysikalische Chemie (Karl-Friedrich-Bonhoeffer-Institut), Goettingen (Germany); Neumann, M. [Vienna Univ. (Austria). Inst. fuer Experimentalphysik

    1993-11-01

    Equilibrium structure of aqueous Na-Cl electrolytes between 1 and 5 mol/l is studied by means of molecular dynamics computer simulation using interaction site descriptions of water and ionic components. Electrostatic interactions are treated both with the newly developed charged-clouds scheme and with Ewald summation. In the case of a 5 mol/l electrolyte, the results for pair correlations obtained by the two methods are in excellent agreement. However, the charged-clouds technique is much faster than Ewald summation and makes simulations at lower salt concentrations feasible. It is found that both ion-water and ion-ion correlation functions depend only weakly on the ionic concentration. Sodium and chloride ions exhibit only a negligible tendency to form contact pairs. In particular, no chloride ion pairs in contact are observed.
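
    The pair correlations discussed above are obtained in post-processing. As an illustration of that step only (random coordinates stand in for real trajectory frames; box size and particle counts are invented), a minimum-image radial distribution function can be computed as follows:

    ```python
    # Radial distribution function g(r) between two particle sets in a
    # periodic cubic box (single synthetic frame; real analyses average frames).
    import numpy as np

    rng = np.random.default_rng(6)
    L, nA, nB = 3.0, 50, 500                      # box edge (nm) and particle counts
    a = rng.uniform(0, L, (nA, 3))                # e.g. Na+ positions
    b = rng.uniform(0, L, (nB, 3))                # e.g. water oxygen positions

    d = a[:, None, :] - b[None, :, :]
    d -= L * np.round(d / L)                      # minimum-image convention
    r = np.linalg.norm(d, axis=2).ravel()

    edges = np.linspace(0.01, L / 2, 60)
    hist, edges = np.histogram(r, bins=edges)
    shell = 4 / 3 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    rho = nB / L ** 3                             # ideal-gas pair density
    g = hist / (nA * shell * rho)
    print(g[:5])                                  # ~1 everywhere for random coordinates
    ```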

  10. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  11. Simulation of computed radiography with imaging plate detectors

    Science.gov (United States)

    Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.

    2014-02-01

    Computed radiography (CR) using phosphor imaging plate detectors is taking an increasingly important place in radiographic testing. CR uses equipment similar to that of conventional radiography, except that the classical X-ray film is replaced by a digital detector, called an imaging plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, lower source energies and thus a smaller radiation protection area, besides being a solution without chemical effluents. This paper presents a model for the simulation of radiography with imaging plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross-comparison between experimental and simulated results obtained on a step wedge with a classical X-ray tube. Results are presented in particular for wire image quality indicators (IQI) and duplex IQIs.

  12. Insights from molecular dynamics simulations for computational protein design.

    Science.gov (United States)

    Childers, Matthew Carter; Daggett, Valerie

    2017-02-01

    A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard, simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic-level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations were used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures.

  14. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  15. Time-partitioning simulation models for calculation on parallel computers

    Science.gov (United States)

    Milner, Edward J.; Blech, Richard A.; Chima, Rodrick V.

    1987-01-01

    A technique allowing time-staggered solution of partial differential equations is presented in this report. Using this technique, called time-partitioning, simulation execution speedup is proportional to the number of processors used because all processors operate simultaneously, with each updating of the solution grid at a different time point. The technique is limited by neither the number of processors available nor by the dimension of the solution grid. Time-partitioning was used to obtain the flow pattern through a cascade of airfoils, modeled by the Euler partial differential equations. An execution speedup factor of 1.77 was achieved using a two processor Cray X-MP/24 computer.

  17. Dilbert-Peter model of organization effectiveness: computer simulations

    CERN Document Server

    Sobkowicz, Pawel

    2010-01-01

    We provide a technical report on a computer simulation of the general effectiveness of a hierarchical organization, depending on two main aspects: the effects of promotion to managerial levels, and the efforts of individual employees to self-promote, which reduce their actual productivity. The combination of judgment by appearance in promotion to higher levels of the hierarchy and the Peter Principle (which states that people are promoted to their level of incompetence) results in fast declines in the effectiveness of the organization. The model uses a few synthetic parameters aimed at reproducing realistic conditions in typical multilayer organizations.
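
    A schematic re-implementation of the model's two ingredients (all parameters invented, not the paper's code) fits in a short script: promotion goes to the most visible employee in a layer, and competence is re-drawn at random after each promotion, so organizational effectiveness decays:

    ```python
    # Toy hierarchy: judgment by appearance + the Peter Principle.
    import random

    random.seed(7)
    LEVELS, WIDTH = 4, 20

    def employee():
        # "comp" is real competence, "show" the fraction of time spent self-promoting
        return {"comp": random.random(), "show": random.random()}

    org = [[employee() for _ in range(WIDTH)] for _ in range(LEVELS)]

    def effectiveness(org):
        # higher layers weigh more; time spent on appearances is unproductive
        return sum((lvl + 1) * e["comp"] * (1 - e["show"])
                   for lvl, layer in enumerate(org) for e in layer)

    print(f"initial effectiveness: {effectiveness(org):.1f}")
    for _ in range(200):
        lvl = random.randrange(LEVELS - 1)
        star = max(org[lvl], key=lambda e: e["show"])   # judged by appearance
        org[lvl].remove(star)
        org[lvl].append(employee())                     # outside hire backfills
        star["comp"] = random.random()                  # Peter Principle: re-drawn
        org[lvl + 1][random.randrange(WIDTH)] = star    # replaces a leaver above
    print(f"after 200 promotions: {effectiveness(org):.1f}")
    ```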

  18. Computer simulations for biological aging and sexual reproduction

    Directory of Open Access Journals (Sweden)

    STAUFFER DIETRICH

    2001-01-01

    The sexual version of the Penna model of biological aging, simulated since 1996, is compared here with alternative forms of reproduction as well as with models not involving aging. In particular, we want to check how sexual forms of life could have evolved and won over earlier asexual forms hundreds of millions of years ago. This computer model is based on the mutation-accumulation theory of aging, using bit-strings to represent the genome. Its population dynamics is studied by Monte Carlo methods.
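
    For concreteness, here is a bare-bones sketch of the asexual Penna model (the paper's sexual variant adds diploid genomes and recombination): bit i of the genome is a deleterious mutation expressed from age i on, T active mutations kill, and a Verhulst factor limits growth. All constants are illustrative.

    ```python
    # Minimal asexual Penna bit-string aging model.
    import random

    random.seed(8)
    BITS, T, REPRO_AGE, NMAX = 32, 3, 8, 5000
    pop = [{"age": 0, "genome": 0} for _ in range(500)]

    for year in range(200):
        next_pop = []
        for ind in pop:
            ind["age"] += 1
            # deleterious mutations expressed up to the current age
            active = bin(ind["genome"] & ((1 << ind["age"]) - 1)).count("1")
            if active >= T or ind["age"] >= BITS:
                continue                              # death from mutations or old age
            if random.random() < len(pop) / NMAX:
                continue                              # Verhulst factor: crowding death
            next_pop.append(ind)
            if ind["age"] >= REPRO_AGE:               # one offspring, one new mutation
                child_genome = ind["genome"] | (1 << random.randrange(BITS))
                next_pop.append({"age": 0, "genome": child_genome})
        pop = next_pop
    if pop:
        mean_age = sum(i["age"] for i in pop) / len(pop)
        print(f"year 200: {len(pop)} alive, mean age {mean_age:.1f}")
    ```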

  19. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Marko Hadjina

    2015-06-01

    In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed, and its shortcomings and problems are emphasized. Next, the discrete-event simulation modelling method, the basis of the suggested methodology, is investigated and described with regard to its special characteristics, advantages and reasons for application, especially in shipbuilding production processes. Furthermore, the basics of simulation modelling are described, as well as the suggested methodology for the production process design procedure. A case study applying the suggested methodology to the design of a robotized profile fabrication production line is demonstrated. The selected design solution, acquired with the suggested methodology, was evaluated through comparison with the installation of a robotized profile cutting production line in a specific shipyard's production process. Based on data obtained from real production, the simulation model was further enhanced. Finally, on the grounds of this research, its results and the conclusions drawn, directions for further research are suggested.

  20. Computer simulations of the atmospheric composition climate of Bulgaria

    Energy Technology Data Exchange (ETDEWEB)

    Gadzhev, G.; Ganev, K.; Syrakov, D.; Prodanova, M.; Georgieva, I.; Georgiev, G.

    2015-07-01

    Some extensive numerical simulations of the atmospheric composition fields in Bulgaria have recently been performed. The US EPA Models-3 system was chosen as the modelling tool. As the NCEP Global Analysis Data with 1 degree resolution were used as the meteorological background, the MM5 and CMAQ nesting capabilities were applied to downscale the simulations to a 3 km resolution over Bulgaria. The TNO emission inventory was used as emission input. Special pre-processing procedures were created for introducing temporal profiles and speciation of the emissions. The biogenic VOC emissions are estimated by the model SMOKE. The simulations were carried out for the years 2000-2007. The numerical experiments were carried out for different emission scenarios, which makes it possible to evaluate the contribution of emissions from different source categories. The Models-3 Integrated Process Rate Analysis option is applied to discriminate the role of different dynamic and chemical processes in air pollution formation. The obtained ensemble of numerical simulation results is extensive enough to allow statistical treatment, calculating not only the mean concentrations and the mean fields of the different source categories' contributions, but also standard deviations, skewness, etc., with their dominant temporal modes (seasonal and/or diurnal variations). Thus some basic facts about the atmospheric composition climate of Bulgaria can be retrieved from the simulation ensemble. (Author)

  1. Visualization of computer architecture simulation data for system-level design space exploration

    NARCIS (Netherlands)

    Taghavi, T.; Thompson, M.; Pimentel, A.D.

    2009-01-01

    System-level computer architecture simulations create large volumes of simulation data to explore alternative architectural solutions. Interpreting and drawing conclusions from this amount of simulation results can be extremely cumbersome. In other domains that also struggle with interpreting large

  2. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger [...] To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  3. Computer simulation of electronic excitation in atomic collision cascades

    Energy Technology Data Exchange (ETDEWEB)

    Duvenbeck, A.

    2007-04-05

    The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface - a process usually called sputtering. In modern surface analysis, the so-called SIMS technology uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, the particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Because the ionized fraction of the total flux of sputtered atoms often amounts to only a few percent or even less, the detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elemental species, but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample. In the literature, this phenomenon is known as the matrix effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation

  4. Matched experimental and computational simulations of paintball eye impacts.

    Science.gov (United States)

    Kennedy, Eric A; Stitzel, Joel D; Duma, Stefan M

    2008-01-01

    Over 1200 paintball-related eye injuries are treated every year in US emergency departments. These injuries range from irritation caused by paint splatter in the eye to catastrophic rupture of the globe. Using the Virginia Tech - Wake Forest University Eye Model, experimental paintball impacts were replicated and the experimental and computational results compared. A total of 10 paintball impacts were conducted at velocities from 71.1 m/s to 112.5 m/s. All experimental tests resulted in rupture of the globe. The matched computational simulations also predicted near-failure or failure in each case, with a maximum principal stress greater than 22.8 MPa in all scenarios and over 23 MPa for velocities above 73 m/s. The failure stress for the VT-WFU Eye Model is defined as 23 MPa. The current regulation velocity for paintballs of 91 m/s exceeds the tolerance of the eye to globe rupture and underscores the importance of eyewear in this sport.

  5. COMPUTER SIMULATION IN MECHANICS TEACHING AND LEARNING: A CASE STUDY ON STUDENTS’ UNDERSTANDING OF FORCE AND MOTION

    Directory of Open Access Journals (Sweden)

    Dyah Permata Sari

    2015-12-01

    The objective of this research was to develop a force and motion simulation based on the open-source Easy Java Simulation. The computer simulation was developed following the ADDIE model. Based on the Analysis and Design phases, the Development phase used the open-source Easy Java Simulation (EJS) to develop a computer simulation with physics content relevant to the subtopic. Computing and communication technology continues to make an increasing impact on all aspects of education. EJS is a powerful didactic resource that gives us the ability to focus our students' attention on the principles of physics. Using EJS, a computer simulation was created through which the motion of a particle under the action of a specific force can be studied. In the implementation phase, the computer simulation was used in the teaching and learning process. To describe the improvements in the students' understanding of the force and motion concepts, we used a t-test to evaluate each of the four phases. The results indicated that the use of the computer simulation could improve students' conceptual competence in force and motion, specifically Newton's second law of motion.

  6. Optimization of suspension smelting technology by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lilius, K.; Jokilaakso, A.; Ahokainen, T.; Teppo, O.; Yang Yongxiang [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Materials Processing and Powder Metallurgy

    1996-12-31

    An industrial-scale flash smelting furnace and waste-heat boilers have been modelled using commercial computational fluid dynamics software. The work has proceeded from cold gas flow to heat transfer, combustion, and two-phase flow simulations. In the present study, the modelling task has been divided into three sub-models: (1) the concentrate burner, (2) the flash smelting furnace (reaction shaft and uptake shaft), and (3) the waste-heat boiler. For the concentrate burner, the flow of the process gas and distribution air, together with the concentrate or a feed mixture, was simulated. An Eulerian-Eulerian approach was used for the carrier gas phase and the dispersed particle phase. A large parametric study was carried out by simulating a laboratory-scale burner with varying turbulence intensities and then extending the simulations to the industrial-scale model. For the flash smelting furnace, the simulation work concentrated on gas and gas-particle two-phase flows, as well as on the development of a combustion model for sulphide concentrate particles. Both Eulerian and Lagrangian approaches have been utilised to describe the particle phase, and the spreading of the concentrate in the reaction shaft as well as the particle tracks have been obtained. Combustion of sulphides was first approximated as gaseous combustion by using a built-in combustion model of the software. The real oxidation reactions of the concentrate particles were then coded as a user-defined subroutine, which was tested with industrial flash smelting cases. For the waste-heat boiler, both flow and heat transfer calculations have been carried out for an old boiler and a modified boiler. (SULA 2 Research Programme; 23 refs.)

  7. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the

  8. Coupled Monte Carlo simulation and Copula theory for uncertainty analysis of multiphase flow simulation models.

    Science.gov (United States)

    Jiang, Xue; Na, Jin; Lu, Wenxi; Zhang, Yu

    2017-11-01

    Simulation-optimization techniques are effective in identifying an optimal remediation strategy. Simulation models with uncertainty, primarily in the form of parameter uncertainty with different degrees of correlation, influence the reliability of the optimal remediation strategy. In this study, coupled Monte Carlo simulation and Copula theory is proposed for uncertainty analysis of a simulation model when parameters are correlated. Using the self-adaptive weight particle swarm optimization Kriging method, a surrogate model was constructed to replace the simulation model and reduce the computational burden and time consumption resulting from repeated and multiple Monte Carlo simulations. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were employed to identify whether the t Copula function or the Gaussian Copula function optimally matches the dependence structure of the parameters. The results show that both the AIC and BIC values of the t Copula function are less than those of the Gaussian Copula function, indicating that the t Copula function is the optimal function for matching the dependence structure of the parameters. The outputs of the simulation model when parameter correlation was considered and when it was ignored were compared. The results show that the amplitude of the fluctuation interval when parameter correlation was considered is less than the corresponding amplitude when parameter correlation was ignored. Moreover, it was demonstrated that considering the correlation among parameters is essential for uncertainty analysis of a simulation model, and the results of uncertainty analysis should be incorporated into the remediation strategy optimization process.
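
    The copula step itself is compact. As a hedged sketch (a Gaussian copula is used here for brevity, although the study selects the t copula; the marginals and correlation below are invented), correlated parameter sets for the Monte Carlo runs can be generated like this:

    ```python
    # Gaussian-copula sampling of correlated model parameters for Monte Carlo runs.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    corr = np.array([[1.0, 0.7],
                     [0.7, 1.0]])               # assumed correlation structure
    L = np.linalg.cholesky(corr)

    z = rng.standard_normal((5000, 2)) @ L.T    # correlated standard normals
    u = stats.norm.cdf(z)                       # correlated uniforms in [0, 1]^2

    # map to physical marginals, e.g. hydraulic conductivity ~ lognormal,
    # porosity ~ beta (purely illustrative choices)
    k = stats.lognorm.ppf(u[:, 0], s=0.5, scale=1e-4)
    phi = stats.beta.ppf(u[:, 1], a=5, b=15)

    print(f"sample rank correlation: {stats.spearmanr(k, phi)[0]:.3f}")
    # each (k, phi) pair would drive one Monte Carlo run of the flow model
    ```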

  9. Computer Simulation of Hydraulic Systems with Typical Nonlinear Characteristics

    Directory of Open Access Journals (Sweden)

    D. N. Popov

    2017-01-01

    The task was to synthesise an adjustable hydraulic system structure whose mathematical model takes into account its inherent nonlinearity. The suggested solution uses successive computer simulations, starting with the structure of a linearized stable hydraulic system, which is then complicated by including the essentially nonlinear elements. The hydraulic system thus obtained may fail to meet the Lyapunov stability criterion and be unstable. This can be eliminated through correcting elements. The results of correction are checked against the form of the transition processes caused by stepwise variation of the control signal. Computer simulation of a throttle-controlled electrohydraulic servo drive with a rotary output element illustrates the application of the proposed method. A constant-pressure power source provides the fluid feed for the drive. For the drive simulation, the following models were used: the linear model; the model taking into consideration the nonlinearity of the flow-dynamic characteristics of a spool-type valve; and the nonlinear models that take into account the dry friction in the spool-type valve and the backlash in the steering angle sensor of the motor shaft. The paper shows the possibility of damping oscillations caused by variable hydrodynamic forces through introducing a correction device. The list of references attached contains 16 sources, which were used to justify and explain certain factors of automatic control theory and the fluid mechanics of unsteady flows. The article presents 6 block diagrams of the electrohydraulic servo drive and the corresponding transition processes that have been studied.

  11. Simultaneous computation within a sequential process simulation tool

    Directory of Open Access Journals (Sweden)

    G. Endrestøl

    1989-10-01

    The paper describes an equation-solver superstructure developed for a sequential modular dynamic process simulation system as part of a Eureka project with Norwegian and British participation. The purpose of the development was to combine some of the advantages of equation-based and purely sequential systems, enabling implicit treatment of key variables independent of module boundaries, and the use of numerical integration techniques suitable for each individual type of variable. For training-simulator applications the main advantages are gains in speed, due to increased stability limits on time steps, and improved consistency of simulation results. The system is split into an off-line analysis phase and an on-line equation solver. The off-line processing consists of automatic determination of the topological structure of the system connectivity from standard process description files, and derivation of an optimized sparse-matrix solution procedure for the resulting set of equations. The on-line routine collects equation coefficients from the involved modules, solves the combined sets of structured equations, and stores the results appropriately. This method minimizes the processing cost during the actual simulation. The solver has been applied in the Veslefrikk training simulator project.

  12. Computer Simulation of Embryonic Systems: What can a ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative pr

  13. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box," you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white. This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  14. Workbench for the computer simulation of underwater gated viewing systems

    Science.gov (United States)

    Braesicke, K.; Wegner, D.; Repasi, E.

    2017-05-01

    In this paper we introduce a software tool for image-based computer simulation of an underwater gated viewing system. This development provides a helpful tool for discussing the possible deployment of a gated viewing camera for underwater imagery. We show the modular structure of the implemented input parameter sets for camera, laser and environment description, and give application examples of the software tool. The whole simulation includes the scene illumination by a laser pulse, with its energy, pulse form and length, as well as the propagation of the light through open water, taking into account the complex optical properties of the environment. The scene is modeled as a geometric shape with diverse reflective areas and optical surface properties, submerged in open water. The software is based on a camera model including image degradation due to diffraction, lens transmission, detector efficiency and image enhancement by digital signal processing. We show simulation results for some example configurations. Finally, we discuss the limits of our method and give an outlook on future developments.

  15. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    Full Text Available The decision-making process has been extensively studied by researchers and executives. This paper aims to use the Value Stream Mapping (VSM) methodology in an integrated manner with a computer simulation model, in order to expand managers' decision-making vision. The object of study is a production system involving an automatic product-packaging process, where changes had to be implemented to accommodate new products, making it necessary to detect bottlenecks and visualize the impacts of future modifications. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs by anticipating the system's behavior, together with the use of Value Stream Mapping to identify activities that do or do not add value to the process. The simulation model is validated against the current map of the system and with the inclusion of Kaizen events, so that waste in future maps can be found in a practical and reliable way, supporting decision-making.

  16. Protein adsorption on nanoparticles: model development using computer simulation.

    Science.gov (United States)

    Shao, Qing; Hall, Carol K

    2016-10-19

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter  =  5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
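
    As a hedged illustration of the isotherm-fitting step mentioned above, the snippet below fits a Langmuir model to synthetic adsorption data with SciPy; the numbers are invented stand-ins, not the authors' simulation results:

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c, q_max, K):
            # Langmuir isotherm: adsorbed amount versus bulk concentration c.
            return q_max * K * c / (1.0 + K * c)

        c = np.array([0.5, 1.0, 2.0, 3.0, 5.0])      # protein concentration (mM)
        q = np.array([8.1, 13.9, 21.0, 25.2, 29.8])  # adsorbed amount (arbitrary units)

        (q_max, K), _ = curve_fit(langmuir, c, q, p0=(30.0, 0.5))
        print(f"q_max = {q_max:.1f}, K = {K:.2f} per mM")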

  18. Computational simulation of bone fracture healing under inverse dynamisation.

    Science.gov (United States)

    Wilson, Cameron J; Schütz, Michael A; Epari, Devakara R

    2017-02-01

    Adaptive finite element models have allowed researchers to test hypothetical relationships between the local mechanical environment and the healing of bone fractures. However, their predictive power has not yet been demonstrated by testing hypotheses ahead of experimental testing. In this study, an established mechano-biological scheme was used in an iterative finite element simulation of sheep tibial osteotomy healing under a hypothetical fixation regime, "inverse dynamisation". Tissue distributions, interfragmentary movement and stiffness across the fracture site were compared between stiff and flexible fixation conditions and scenarios in which fixation stiffness was increased at a discrete time-point. The modelling work was conducted blind to the experimental study to be published subsequently. The simulations predicted the fastest and most direct healing under constant stiff fixation, and the slowest healing under flexible fixation. Although low fixation stiffness promoted more callus formation prior to bridging, this conferred little additional stiffness to the fracture in the first 5 weeks. Thus, while switching to stiffer fixation facilitated rapid subsequent bridging of the fracture, no advantage of inverse dynamisation could be demonstrated. In vivo data remains necessary to conclusively test this treatment protocol and this will, in turn, provide an evaluation of the model's performance. The publication of both hypotheses and their computational simulation, prior to experimental testing, offers an appealing means to test the predictive power of mechano-biological models.

  19. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-04-30

    This is the sixth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of our IGCC workbench. Preliminary CFD simulations for single stage and two stage "generic" gasifiers using firing conditions based on the Vision 21 reference configuration have been performed. Work is continuing on implementing an advanced slagging model into the CFD based gasifier model. An investigation into published gasification kinetics has highlighted a wide variance in predicted performance due to the choice of kinetic parameters. A plan has been outlined for developing the reactor models required to simulate the heat transfer and gas clean up equipment downstream of the gasifier. Three models that utilize the CCA software protocol have been integrated into a version of the IGCC workbench. Tests of a CCA implementation of our CFD code into the workbench demonstrated that the CCA CFD module can execute on a geographically remote PC (linked via the Internet) in a manner that is transparent to the user. Software tools to create "walk-through" visualizations of the flow field within a gasifier have been demonstrated.

  20. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-01-31

    This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memorandum of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three.

  1. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
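
    The bootstrap construction of a pseudo-predator signature can be sketched as follows; the data layout and simple mixing rule are assumptions for illustration, not the exact QFASA estimator:

        import numpy as np

        rng = np.random.default_rng(1)

        def pseudo_predator(prey_sigs, diet, n_boot):
            # prey_sigs: dict prey type -> (n_animals, n_fatty_acids) signature array;
            # diet: dict prey type -> proportion; n_boot: bootstrap sample size.
            sig = 0.0
            for prey, p in diet.items():
                pool = prey_sigs[prey]
                idx = rng.integers(0, len(pool), size=n_boot)  # resample with replacement
                sig = sig + p * pool[idx].mean(axis=0)         # diet-weighted bootstrap mean
            return sig / sig.sum()                             # renormalize to proportions

        prey_sigs = {"seal": rng.dirichlet(np.ones(8), 30),
                     "whale": rng.dirichlet(np.ones(8), 25)}
        print(pseudo_predator(prey_sigs, {"seal": 0.7, "whale": 0.3}, n_boot=20))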

  2. Simulation of skill acquisition in sequential learning of a computer game

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Nielsen, Finn Ravnsbjerg; Rasmussen, Jens

    1995-01-01

    The paper presents some theoretical assumptions about the cognitive control mechanisms of subjects learning to play a computer game. A simulation model has been developed to investigate these assumptions. The model is an automaton, reacting to instruction-like cue action rules. The prototypical performances of 23 experimental subjects at succeeding levels of training are compared to the performance of the model. The findings are interpreted in terms of a general taxonomy for cognitive task analysis.

  3. A digital computer program for the dynamic interaction simulation of controls and structure (DISCOS), volume 1

    Science.gov (United States)

    Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.

    1978-01-01

    A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.

  4. Quantification of remodeling parameter sensitivity - assessed by a computer simulation model

    DEFF Research Database (Denmark)

    Thomsen, J.S.; Mosekilde, Li.; Mosekilde, Erik

    1996-01-01

    We have used a computer simulation model to evaluate the effect of several bone remodeling parameters on vertebral cancellous bone. The menopause was chosen as the base case scenario, and the sensitivity of the model to the following parameters was investigated: activation frequency, formation balance, resorption depth, and critical trabecular thickness. Simulations were performed for a period of 20 years starting at age 48. The analysis showed that the number of perforations and the perforation-related mass loss both exhibited large sensitivity toward variations in final resorption depth.

  5. Path-length distribution of ions reflected from a solid: Theory and computer simulation

    Science.gov (United States)

    Tolmachev, A. I.; Forlano, L.

    2017-07-01

    Theoretical methods and Monte Carlo procedure are used to study path-length distributions of ions reflected from a solid. The theoretical analysis is based on the solution of the integral Chandrasekhar equation for the Laplace transform of the distribution function. A family of curves is obtained for path-length distributions at several ion energies and mass ratios of ions and target atoms. A computer code for simulation is based on the approximation of pair collisions and a gas model of solid. The simulated results are compared with the theoretical results and published data.
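
    A toy Monte Carlo in the spirit of the gas model mentioned above fits in a few lines: particles random-walk in a half-space, and the path lengths of those returning to the surface are recorded. All parameters are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)

        def reflected_path_lengths(n_ions=10_000, mfp=1.0, max_steps=200):
            lengths = []
            for _ in range(n_ions):
                depth = rng.exponential(mfp)       # first free flight, into the solid
                path = depth
                for _ in range(max_steps):
                    step = rng.exponential(mfp)    # free flight between collisions
                    mu = rng.uniform(-1.0, 1.0)    # cosine of the scattering angle
                    depth += mu * step
                    path += step
                    if depth <= 0.0:               # back across the surface: reflected
                        lengths.append(path)
                        break
            return np.array(lengths)

        print(reflected_path_lengths().mean())     # mean path length of reflected ions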

  6. Interactive computer analysis of nuclear backscattering spectra

    Science.gov (United States)

    Saunders, Philip A.; Ziegler, J. F.

    1983-12-01

    A review will be made of a computer-based interactive nuclear backscattering analysis system. Users without computer experience can develop moderate competence with the system after only brief instruction because of the menu-driven organization. Publishable-quality figures can be obtained without any computer expertise. Among the quantities which can be displayed over the data are depth scales for any element, element identification, relative concentrations and theoretical spectra. Captions and titling can be made from a selection of 30 font styles. Lettering is placed on the graphs under joystick control so that placement is exact without needing complicated commands.

  7. Multiscale methodology for bone remodelling simulation using coupled finite element and neural network computation.

    Science.gov (United States)

    Hambli, Ridha; Katerchi, Houda; Benhamou, Claude-Laurent

    2011-02-01

    The aim of this paper is to develop a multiscale hierarchical hybrid model based on finite element analysis and neural network computation to link mesoscopic scale (trabecular network level) and macroscopic (whole bone level) to simulate the process of bone remodelling. As whole bone simulation, including the 3D reconstruction of trabecular level bone, is time consuming, finite element calculation is only performed at the macroscopic level, whilst trained neural networks are employed as numerical substitutes for the finite element code needed for the mesoscale prediction. The bone mechanical properties are updated at the macroscopic scale depending on the morphological and mechanical adaptation at the mesoscopic scale computed by the trained neural network. The digital image-based modelling technique using μ-CT and voxel finite element analysis is used to capture volume elements representative of 2 mm³ at the mesoscale level of the femoral head. The input data for the artificial neural network are a set of bone material parameters, boundary conditions and the applied stress. The output data are the updated bone properties and some trabecular bone factors. The current approach is the first model, to our knowledge, that incorporates both finite element analysis and neural network computation to rapidly simulate multilevel bone adaptation.
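
    The role of the trained network as a fast stand-in for the mesoscale finite element call can be sketched as below; the data shapes and the mock response are assumptions, not the paper's actual setup:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(500, 4))   # [material parameters, boundary conditions, stress]
        y = X @ [0.4, 0.3, 0.2, 0.1] + 0.05 * rng.standard_normal(500)  # mock FE output

        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, y)
        print(surrogate.predict(X[:3]))  # fast replacement for the finite element call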

  8. Mathematical analysis and simulation of crop micrometeorology

    NARCIS (Netherlands)

    Chen, J.

    1984-01-01

    In crop micrometeorology the transfer of radiation, momentum, heat and mass to or from a crop canopy is studied. Simulation models for these processes do exist but are not easy to handle because of their complexity and the long computing time they need. Moreover, up to now such models can…

  9. Geometry Modeling and Grid Generation for Computational Aerodynamic Simulations Around Iced Airfoils and Wings

    Science.gov (United States)

    Choo, Yung K.; Slater, John W.; Vickerman, Mary B.; VanZante, Judith F.; Wadel, Mary F. (Technical Monitor)

    2002-01-01

    Issues associated with analysis of 'icing effects' on airfoil and wing performances are discussed, along with accomplishments and efforts to overcome difficulties with ice. Because of infinite variations of ice shapes and their high degree of complexity, computational 'icing effects' studies using available software tools must address many difficulties in geometry acquisition and modeling, grid generation, and flow simulation. The value of each technology component needs to be weighed from the perspective of the entire analysis process, from geometry to flow simulation. Even though CFD codes are yet to be validated for flows over iced airfoils and wings, numerical simulation, when considered together with wind tunnel tests, can provide valuable insights into 'icing effects' and advance our understanding of the relationship between ice characteristics and their effects on performance degradation.

  10. Integrating surrogate models into subsurface simulation framework allows computation of complex reactive transport scenarios

    Science.gov (United States)

    De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael

    2017-04-01

    Reactive transport simulations - where geochemical reactions are coupled with hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed-up such coupled simulations is achievable employing statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained on a set of pre-calculated "full physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at price of a precision loss; however, this appears justified in presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulations ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full physics" chemistry in scenarios pertaining the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M
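
    A data-driven surrogate of the kind described can be sketched with a Gaussian-process regressor trained on an ensemble of pre-calculated runs; the training data below are synthetic stand-ins for "full physics" geochemical simulations (the authors work in R, so this Python sketch is only an analogy):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)
        inputs = rng.uniform(size=(200, 3))          # e.g. T, pCO2, ionic strength
        outputs = np.sin(inputs @ [2.0, 1.0, 0.5])   # mock equilibrium concentrations

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(inputs, outputs)
        mean, std = gp.predict(inputs[:5], return_std=True)  # fast prediction with uncertainty
        print(mean, std)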

  11. Computer Simulations Imply Forelimb-Dominated Underwater Flight in Plesiosaurs.

    Science.gov (United States)

    Liu, Shiqiu; Smith, Adam S; Gu, Yuting; Tan, Jie; Liu, C Karen; Turk, Greg

    2015-12-01

    Plesiosaurians are an extinct group of highly derived Mesozoic marine reptiles with a global distribution that spans 135 million years from the Early Jurassic to the Late Cretaceous. During their long evolutionary history they maintained a unique body plan with two pairs of large wing-like flippers, but their locomotion has been a topic of debate for almost 200 years. Key areas of controversy have concerned the most efficient biologically possible limb stroke, e.g. whether it consisted of rowing, underwater flight, or modified underwater flight, and how the four limbs moved in relation to each other: did they move in or out of phase? Previous studies have investigated plesiosaur swimming using a variety of methods, including skeletal analysis, human swimmers, and robotics. We adopt a novel approach using a digital, three-dimensional, articulated, free-swimming plesiosaur in a simulated fluid. We generated a large number of simulations under various joint degrees of freedom to investigate how the locomotory repertoire changes under different parameters. Within the biologically possible range of limb motion, the simulated plesiosaur swims primarily with its forelimbs using an unmodified underwater flight stroke, essentially the same as turtles and penguins. In contrast, the hindlimbs provide relatively weak thrust in all simulations. We conclude that plesiosaurs were forelimb-dominated swimmers that used their hind limbs mainly for maneuverability and stability.

  13. Simulation of branching blood flows on parallel computers.

    Science.gov (United States)

    Yue, Xue; Hwang, Feng-Nan; Shandas, Robin; Cai, Xiao-Chuan

    2004-01-01

    We present a fully parallel nonlinearly implicit algorithm for the numerical simulation of some branching blood flow problems, which require efficient and robust solver technologies in order to handle the high nonlinearity and the complex geometry. Parallel processing is necessary because of the large number of mesh points needed to accurately discretize the system of differential equations. In this paper we introduce a parallel Newton-Krylov-Schwarz based implicit method, and software for distributed memory parallel computers, for solving the nonlinear algebraic systems arising from a Q2-Q1 finite element discretization of the incompressible Navier-Stokes equations that we use to model the blood flow in the left anterior descending coronary artery.
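
    The Newton-Krylov idea can be demonstrated on a tiny nonlinear boundary-value problem (a 1-D Burgers-like residual, not the paper's Navier-Stokes system); SciPy's matrix-free newton_krylov plays the role of the implicit solver:

        import numpy as np
        from scipy.optimize import newton_krylov

        n = 50
        h = 1.0 / (n - 1)

        def residual(u):
            # Discrete residual of -u'' + u*u' = 1 with u(0) = u(1) = 0.
            r = np.empty_like(u)
            r[0], r[-1] = u[0], u[-1]                        # Dirichlet ends
            d2 = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2     # second difference
            d1 = (u[2:] - u[:-2]) / (2.0 * h)                # first difference
            r[1:-1] = -d2 + u[1:-1] * d1 - 1.0
            return r

        u = newton_krylov(residual, np.zeros(n), method="lgmres")
        print(u.max())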

  14. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    Full Text Available The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting, providing more complex MAP observations that will extend the model to better represent the complexity of MAP.

  15. Computer simulation of an industrial wastewater treatment process

    Energy Technology Data Exchange (ETDEWEB)

    Jenke, D.R.; Diebold, F.E.

    1985-01-01

    The computer program REDEQL.EPAK has been modified to allow for the prediction and simulation of the chemical effects of mixing 2 or more aqueous solutions and one or more solid phases. In this form the program is capable of modelling the lime neutralisation treatment process for acid mine waters. The program calculates the speciation of all influent solutions, evaluates the equilibrium composition of any mixed solution and provides the stoichiometry of the liquid and solid phases produced as a result of the mixing. The program is used to predict the optimum treatment effluent composition, to determine the amount of neutralising agent (lime) required to produce this optimum composition and to provide information which defines the mechanism controlling the treatment process.

  16. Agent-based computer simulations of language choice dynamics.

    Science.gov (United States)

    Hadzibeganovic, Tarik; Stauffer, Dietrich; Schulze, Christian

    2009-06-01

    We use agent-based Monte Carlo simulations to address the problem of language choice dynamics in a tripartite community that is linguistically homogeneous but politically divided. We observe the process of nonlocal pattern formation that causes populations to self-organize into stable antagonistic groups as a result of the local dynamics of attraction and influence between individual computational agents. Our findings uncover some of the unique properties of opinion formation in social groups when the process is affected by asymmetric noise distribution, unstable intergroup boundaries, and different migratory behaviors. Although we focus on one particular study, the proposed stochastic dynamic models can be easily generalized and applied to investigate the evolution of other complex and nonlinear features of human collective behavior.
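
    A stripped-down version of such an agent-based Monte Carlo model fits in a few lines; the three groups, update rule, and asymmetric noise levels below are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(7)
        n, steps = 300, 10_000
        group = np.repeat([0, 1, 2], n // 3)     # tripartite, politically divided community
        lang = rng.integers(0, 3, size=n)        # current language choice of each agent
        noise = np.array([0.02, 0.05, 0.08])     # asymmetric noise, one level per group

        for _ in range(steps):
            i, j = rng.integers(0, n, size=2)    # pick two agents at random
            if group[i] == group[j]:             # in-group attraction and influence
                lang[i] = lang[j]
            if rng.random() < noise[group[i]]:   # random switching (noise)
                lang[i] = rng.integers(0, 3)

        print(np.bincount(lang, minlength=3))    # emergent language shares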

  17. Computer simulation of randomly cross-linked polymer networks

    CERN Document Server

    Williams, T P

    2002-01-01

    In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task parallel implementations of the lattice Monte Carlo Bond Fluctuation model and Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends were qualitatively compared to recent published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneiti...

  18. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Full Text Available Many tall halls of big space volume have been built, or are to be built, in construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle, and consequently smoke exhaust systems are specified in the fire codes of those areas. An update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design is presented in this paper. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.

  19. COMPUTER EMULATORS AND SIMULATORS OF MEASURING INSTRUMENTS IN PHYSICS LESSONS

    Directory of Open Access Journals (Sweden)

    Yaroslav Yu. Dyma

    2010-10-01

    Full Text Available A prominent feature of educational physics experiments at the present stage is the application of computer equipment and special software – virtual measuring instruments. The purpose of this article is to explain when virtual instruments can be used to carry out real experiments (emulators) and when virtual ones (simulators). For better understanding, the implementation of one laboratory experiment using software of both types is given. Since, in learning physics, preference should be given to carrying out natural experiments – studying real phenomena and measuring real physical quantities – the most promising direction is the examination of programs emulating measuring instruments for their further introduction into the educational process.

  20. Computer simulation of cluster impact induced electronic excitation of solids

    Energy Technology Data Exchange (ETDEWEB)

    Weidtmann, B.; Hanke, S.; Duvenbeck, A. [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany); Wucher, A., E-mail: andreas.wucher@uni-deu.de [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany)

    2013-05-15

    We present a computational study of electronic excitation upon bombardment of a metal surface with cluster projectiles. Our model employs a molecular dynamics (MD) simulation to calculate the particle dynamics following the projectile impact. Kinetic excitation is implemented via two mechanisms describing the electronic energy loss of moving particles: autoionization in close binary collisions and a velocity proportional friction force resulting from direct atom–electron collisions. Two different friction models are compared with respect to the predicted sputter yields after single atom and cluster bombardment. We find that a density dependent friction coefficient leads to a significant reduction of the total energy transferred to the electronic sub-system as compared to the Lindhard friction model, thereby strongly enhancing the predicted sputter yield under cluster bombardment conditions. In contrast, the yield predicted for monoatomic projectile bombardment remains practically unchanged.

  1. Mixed-Language High-Performance Computing for Plasma Simulations

    Directory of Open Access Journals (Sweden)

    Quanming Lu

    2003-01-01

    Full Text Available Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java native interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.

  2. Holistic Nursing Simulation: A Concept Analysis.

    Science.gov (United States)

    Cohen, Bonni S; Boni, Rebecca

    2018-03-01

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and the recognition of the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on holistic patient care, drawing on knowledge, theories, and expertise to administer that care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in the design and development of holistic nursing simulation.

  3. Trends in Social Science: The Impact of Computational and Simulative Models

    Science.gov (United States)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  4. MD Simulations of Viruslike Particles with Supra CG Solvation Affordable to Desktop Computers.

    Science.gov (United States)

    Machado, Matías R.; González, Humberto C.; Pantano, Sergio

    2017-10-10

    Viruses are tremendously efficient molecular devices that optimize the packing of genetic material using a minimalistic number of proteins to form a capsid or envelope that protects them from external threats, while also taking part in cell recognition, fusion, and budding machineries. Progress in experimental techniques has provided a large number of high-resolution structures of viruses and viruslike particles (VLP), while molecular dynamics simulations may furnish lively and complementary insights into the fundamental forces ruling viral assembly, stability, and dynamics. However, the large size and complexity of these macromolecular assemblies pose significant computational challenges. Alternatively, Coarse-Grained (CG) methods, which sacrifice atomistic resolution in favor of computational efficiency, can be used to characterize the dynamics of VLPs. Still, the massive amount of solvent present in empty capsids or envelopes suggests that hybrid schemes keeping a higher resolution on regions of interest (i.e., the viral proteins and their surroundings) and a progressively coarser description of the bulk may further improve efficiency. Here we introduce a mesoscale explicit water model to be used in double- or triple-scale simulations in combination with popular atomistic parameters and the CG water used by the SIRAH force field. Simulations performed on VLPs of different sizes, along with a comprehensive analysis of the PDB, indicate that most of the VLPs so far reported are amenable to be handled on a GPU-accelerated desktop computer using this simulation scheme.

  5. Space radiator simulation system analysis

    Science.gov (United States)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A transient heat transfer analysis was carried out on a space radiator heat rejection system exposed to an arbitrarily prescribed combination of aerodynamic heating, solar, albedo, and planetary radiation. A rigorous analysis was carried out for the radiation panel and tubes lying in one plane and an approximate analysis was used to extend the rigorous analysis to the case of a curved panel. The analysis permits the consideration of both gaseous and liquid coolant fluids, including liquid metals, under prescribed, time dependent inlet conditions. The analysis provided a method for predicting: (1) transient and steady-state, two dimensional temperature profiles, (2) local and total heat rejection rates, (3) coolant flow pressure in the flow channel, and (4) total system weight and protection layer thickness.

  6. Computer simulation of chemical reactions in porous materials

    Science.gov (United States)

    Turner, Christoffer Heath

    Understanding reactions in nanoporous materials from a purely experimental perspective is a difficult task. Measuring the chemical composition of a reacting system within a catalytic material is usually only accomplished through indirect methods, and it is usually impossible to distinguish between true chemical equilibrium and metastable states. In addition, measuring molecular orientation or distribution profiles within porous systems is not easily accomplished. However, molecular simulation techniques are well-suited to these challenges. With appropriate simulation techniques and realistic molecular models, it is possible to validate the dominant physical and chemical forces controlling nanoscale reactivity. Novel nanostructured catalysts and supports can be designed, optimized, and tested using high-performance computing and advanced modeling techniques in order to guide the search for next-generation catalysts---setting new targets for the materials synthesis community. We have simulated the conversion of several different equilibrium-limited reactions within microporous carbons and we find that the pore size, pore geometry, and surface chemistry are important factors for determining the reaction yield. The equilibrium-limited reactions that we have modeled include nitric oxide dimerization, ammonia synthesis, and the esterification of acetic acid, all of which show yield enhancements within microporous carbons. In conjunction with a yield enhancement of the esterification reaction, selective adsorption of ethyl acetate within carbon micropores demonstrates an efficient method for product recovery. Additionally, a new method has been developed for simulating reaction kinetics within porous materials and other heterogeneous environments. The validity of this technique is first demonstrated by reproducing the kinetics of hydrogen iodide decomposition in the gas phase, and then predictions are made within slit-shaped carbon pores and carbon nanotubes. The rate

  7. Computer simulations for minds-on learning with "Project Spectra!"

    Science.gov (United States)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so that it can be manipulated is the easiest way to make such abstractions concrete for students. Using simulations and computer interactives (games) in which students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media solidifies understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students, “Project Spectra!” gives them a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  8. Computer Monte Carlo simulation in quantitative resource estimation

    Science.gov (United States)

    Root, D.H.; Menzie, W.D.; Scott, W.A.

    1992-01-01

    The method of making quantitative assessments of mineral resources sufficiently detailed for economic analysis is outlined in three steps. The steps are (1) determination of types of deposits that may be present in an area, (2) estimation of the numbers of deposits of the permissible deposit types, and (3) combination by Monte Carlo simulation of the estimated numbers of deposits with the historical grades and tonnages of these deposits to produce a probability distribution of the quantities of contained metal. Two examples of the estimation of the number of deposits (step 2) are given. The first example is for mercury deposits in southwestern Alaska and the second is for lode tin deposits in the Seward Peninsula. The flow of the Monte Carlo simulation program is presented with particular attention to the dependencies between grades and tonnages of deposits and between grades of different metals in the same deposit. © 1992 Oxford University Press.
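
    Step 3 above is easy to sketch: draw a number of deposits, then draw a grade and tonnage for each from historical distributions, and accumulate the contained metal. The lognormal parameters below are invented placeholders, and for simplicity the sketch ignores the grade-tonnage dependencies the actual program handles:

        import numpy as np

        rng = np.random.default_rng(42)

        def contained_metal(n_trials=10_000):
            totals = np.empty(n_trials)
            for k in range(n_trials):
                n_dep = rng.poisson(3.0)                        # estimated deposit count
                tonnage = rng.lognormal(13.0, 1.5, size=n_dep)  # tonnes of ore per deposit
                grade = rng.lognormal(-5.5, 0.6, size=n_dep)    # metal fraction per deposit
                totals[k] = (tonnage * grade).sum()             # tonnes of contained metal
            return totals

        m = contained_metal()
        print(np.percentile(m, [10, 50, 90]))  # probability distribution of contained metal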

  9. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  10. Filter wheel equalization for chest radiography: a computer simulation.

    Science.gov (United States)

    Boone, J M; Duryea, J; Steiner, R M

    1995-07-01

    A chest radiographic equalization system using lung-shaped templates mounted on filter wheels is under development. Using this technique, 25 lung templates for each lung are available on two computer controlled wheels which are located in close proximity to the x-ray tube. The large magnification factor (> 10X) of the templates assures low-frequency equalization due to the blurring of the focal spot. A low-dose image is acquired without templates using a (generic) digital receptor, the image is analyzed, and the left and right lung fields are automatically identified using software developed for this purpose. The most appropriate left and right lung templates are independently selected and are positioned into the field of view at the proper location under computer control. Once the templates are positioned, acquisition of the equalized radiographic image onto film commences at clinical exposure levels. The templates reduce the exposure to the lung fields by attenuating a fraction of the incident x-ray fluence so that the exposure to the mediastinum and diaphragm areas can be increased without overexposing the lungs. A data base of 824 digitized chest radiographs was used to determine the shape of the specific lung templates, for both left and right lung fields. A second independent data base of 208 images was used to test the performance of the templates using computer simulations. The template shape characteristics derived from the clinical image data base are demonstrated. The detected exposure in the lung fields on conventional chest radiographs was found to be, on average, three times the detected exposure behind the diaphragm and mediastinum.(ABSTRACT TRUNCATED AT 250 WORDS)

  11. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim

    2004-01-28

    This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.

  12. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    Science.gov (United States)

    Dunn, H. J.

    1981-01-01

    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
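
    A modern equivalent of the report's PDS computation can be sketched with SciPy standing in for ACSL: integrate a Van der Pol oscillator, then estimate the power density spectrum with Welch's FFT-based method:

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.signal import welch

        def vdp(t, y, mu=1.0):
            # Van der Pol oscillator: x'' - mu*(1 - x^2)*x' + x = 0.
            x, v = y
            return [v, mu * (1.0 - x**2) * v - x]

        fs = 100.0                                  # sampling rate of the time history
        t = np.arange(0.0, 200.0, 1.0 / fs)
        sol = solve_ivp(vdp, (t[0], t[-1]), [1.0, 0.0], t_eval=t, rtol=1e-8)
        f, pds = welch(sol.y[0], fs=fs, nperseg=2048)
        print(f[np.argmax(pds)])                    # dominant frequency of the limit cycle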

  13. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques for conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified by: (A) hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data, and (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
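
    The RSS combination step is simple enough to show directly; the deviation values below are invented placeholders for single-error-source simulation results:

        import numpy as np

        # Rows: single error sources; columns: trajectory deviations at MECO
        # (e.g. altitude in m, velocity in m/s). Values are illustrative only.
        dev = np.array([[120.0,  1.5],
                        [ 80.0, -0.8],
                        [ 45.0,  2.1]])

        rss = np.sqrt((dev**2).sum(axis=0))  # root-sum-square per deviation component
        cov = dev.T @ dev                    # covariance matrix of the combined deviations
        print(rss)
        print(cov)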

  14. Ethical sensitivity intervention in science teacher education: Using computer simulations and professional codes of ethics

    Science.gov (United States)

    Holmes, Shawn Yvette

    A simulation was created to emulate two Racial Ethical Sensitivity Test (REST) videos (Brabeck et al., 2000). The REST is a reliable assessment of ethical sensitivity to racial and gender intolerant behaviors in educational settings. Quantitative and qualitative analysis of the REST was performed using the Quick-REST survey and an interview protocol. The purpose of this study was to affect science educators' ability to recognize instances of racial and gender intolerant behaviors by leveraging the immersive qualities of simulations. The fictitious Hazelton High School virtual environment was created by the researcher and compared with the traditional REST. The study investigated whether computer simulations can influence the ethical sensitivity of preservice and inservice science teachers to racial and gender intolerant behaviors in school settings. The post-test-only research design involved 32 third-year science education students enrolled in science education classes at several southeastern universities and 31 science teachers from the same locale, some of whom were part of an NSF project. Participant samples were assigned to the video control group or the simulation experimental group, resulting in four comparison groups: preservice video, preservice simulation, inservice video and inservice simulation. Participants experienced two REST scenarios in the appropriate format and then responded to Quick-REST survey questions for both scenarios. Additionally, the simulation groups answered in-simulation and post-simulation questions. Nonparametric analysis of the Quick-REST ascertained differences between comparison groups. Cronbach's alpha was calculated for internal consistency. The REST interview protocol was used to analyze recognition of intolerant behaviors in the in-simulation prompts. Post-simulation prompts were analyzed for emergent themes concerning the effect of the simulation on responses. The preservice video group had a significantly higher mean rank score than

  15. Work in process level definition: a method based on computer simulation and electre tri

    Directory of Open Access Journals (Sweden)

    Isaac Pergher

    2014-09-01

    Full Text Available This paper proposes a method for defining the levels of work in process (WIP) in productive environments managed by constant work in process (CONWIP) policies. The proposed method combines the approaches of Computer Simulation and Electre TRI to support estimation of the adequate WIP level and is presented in eighteen steps. The paper also presents an application example, performed in a metalworking company. The research method is based on Computer Simulation, supported by quantitative data analysis. The main contribution of the paper is its provision of a structured way to define inventories according to demand. With this method, the authors hope to contribute to the establishment of better capacity plans in production environments.

  16. Finite element simulation of the mechanical impact of computer work on the carpal tunnel syndrome.

    Science.gov (United States)

    Mouzakis, Dionysios E; Rachiotis, George; Zaoutsos, Stefanos; Eleftheriou, Andreas; Malizos, Konstantinos N

    2014-09-22

    Carpal tunnel syndrome (CTS) is a clinical disorder resulting from the compression of the median nerve. The available evidence regarding the association between computer use and CTS is controversial. There is some evidence that computer mouse or keyboard work, or both are associated with the development of CTS. Despite the availability of pressure measurements in the carpal tunnel during computer work (exposure to keyboard or mouse) there are no available data to support a direct effect of the increased intracarpal canal pressure on the median nerve. This study presents an attempt to simulate the direct effects of computer work on the whole carpal area section using finite element analysis. A finite element mesh was produced from computerized tomography scans of the carpal area, involving all tissues present in the carpal tunnel. Two loading scenarios were applied on these models based on biomechanical data measured during computer work. It was found that mouse work can produce large deformation fields on the median nerve region. Also, the high stressing effect of the carpal ligament was verified. Keyboard work produced considerable and heterogeneous elongations along the longitudinal axis of the median nerve. Our study provides evidence that increased intracarpal canal pressures caused by awkward wrist postures imposed during computer work were associated directly with deformation of the median nerve. Despite the limitations of the present study the findings could be considered as a contribution to the understanding of the development of CTS due to exposure to computer work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Using a computer simulation for teaching communication skills: A blinded multisite mixed methods randomized controlled trial.

    Science.gov (United States)

    Kron, Frederick W; Fetters, Michael D; Scerbo, Mark W; White, Casey B; Lypson, Monica L; Padilla, Miguel A; Gliva-McConvey, Gayle A; Belfore, Lee A; West, Temple; Wallace, Amelia M; Guetterman, Timothy C; Schleicher, Lauren S; Kennedy, Rebecca A; Mangrulkar, Rajesh S; Cleary, James F; Marsella, Stacy C; Becker, Daniel M

    2017-04-01

    To assess advanced communication skills among second-year medical students exposed either to a computer simulation (MPathic-VR) featuring virtual humans, or to a multimedia computer-based learning module, and to understand each group's experiences and learning preferences. A single-blinded, mixed methods, randomized, multisite trial compared MPathic-VR (N=210) to computer-based learning (N=211). Primary outcomes: communication scores during repeat interactions with MPathic-VR's intercultural and interprofessional communication scenarios and scores on a subsequent advanced communication skills objective structured clinical examination (OSCE). Multivariate analysis of variance was used to compare outcomes. student attitude surveys and qualitative assessments of their experiences with MPathic-VR or computer-based learning. MPathic-VR-trained students improved their intercultural and interprofessional communication performance between their first and second interactions with each scenario. They also achieved significantly higher composite scores on the OSCE than computer-based learning-trained students. Attitudes and experiences were more positive among students trained with MPathic-VR, who valued its providing immediate feedback, teaching nonverbal communication skills, and preparing them for emotion-charged patient encounters. MPathic-VR was effective in training advanced communication skills and in enabling knowledge transfer into a more realistic clinical situation. MPathic-VR's virtual human simulation offers an effective and engaging means of advanced communication training. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Advanced computational simulation for design and manufacturing of lightweight material components for automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Simunovic, S.; Aramayo, G.A.; Zacharia, T. [Oak Ridge National Lab., TN (United States); Toridis, T.G. [George Washington Univ., Washington, DC (United States); Bandak, F.; Ragland, C.L. [Dept. of Transportation, Washington, DC (United States)

    1997-04-01

    Computational vehicle models for the analysis of lightweight material performance in automobiles have been developed through collaboration between Oak Ridge National Laboratory, the National Highway Transportation Safety Administration, and George Washington University. The vehicle models have been verified against experimental data obtained from vehicle collisions. The crashed vehicles were analyzed, and the main impact energy dissipation mechanisms were identified and characterized. Important structural parts were extracted and digitized and directly compared with simulation results. High-performance computing played a key role in the model development because it allowed for rapid computational simulations and model modifications. The deformation of the computational model shows a very good agreement with the experiments. This report documents the modifications made to the computational model and relates them to the observations and findings on the test vehicle. Procedural guidelines are also provided that the authors believe need to be followed to create realistic models of passenger vehicles that could be used to evaluate the performance of lightweight materials in automotive structural components.

  19. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    Science.gov (United States)

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  20. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...