WorldWideScience

Sample records for subject-specific computer simulation

  1. Subject-specific computational modeling of DBS in the PPTg area

    Science.gov (United States)

    Zitella, Laura M.; Teplitzky, Benjamin A.; Yager, Paul; Hudson, Heather M.; Brintz, Katelynn; Duchin, Yuval; Harel, Noam; Vitek, Jerrold L.; Baker, Kenneth B.; Johnson, Matthew D.

    2015-01-01

    Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulating fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 μs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (−1.0 to −1.4 mA). These current amplitudes followed closely with model predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS. PMID:26236229
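
    A minimal sketch (Python) of the modeling idea above, assuming an idealized point current source in an infinite anisotropic medium rather than the study's finite element lead model; the conductivity values, fiber position, and node spacing are illustrative assumptions:

      # Extracellular potential of a point current source in a homogeneous
      # anisotropic medium (Nicholson-type closed form), plus the discrete
      # "activating function" along a straight axon: positive peaks suggest
      # sites of stimulation-induced depolarization.
      import numpy as np

      I = -1.0e-3                    # cathodic stimulus current (A), cf. -1.0 mA
      sx, sy, sz = 0.6, 0.1, 0.1     # assumed anisotropic conductivities (S/m)

      def potential(x, y, z):
          return I / (4 * np.pi * np.sqrt(sy*sz*x**2 + sx*sz*y**2 + sx*sy*z**2))

      # Nodes of Ranvier along a straight fiber passing 2 mm from the contact.
      nodes_z = np.arange(-20, 21) * 0.5e-3      # 0.5 mm internodal spacing
      V = potential(2.0e-3, 0.0, nodes_z)

      f = V[:-2] - 2 * V[1:-1] + V[2:]           # discrete activating function
      print("peak activating function: %.3e V" % f.max())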

  2. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Full Text Available Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson’s disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulating fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes followed closely with model predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  3. Muscle Synergies Facilitate Computational Prediction of Subject-Specific Walking Motions

    Science.gov (United States)

    Meyer, Andrew J.; Eskinazi, Ilan; Jackson, Jennifer N.; Rao, Anil V.; Patten, Carolynn; Fregly, Benjamin J.

    2016-01-01

    Researchers have explored a variety of neurorehabilitation approaches to restore normal walking function following a stroke. However, there is currently no objective means for prescribing and implementing treatments that are likely to maximize recovery of walking function for any particular patient. As a first step toward optimizing neurorehabilitation effectiveness, this study develops and evaluates a patient-specific synergy-controlled neuromusculoskeletal simulation framework that can predict walking motions for an individual post-stroke. The main question we addressed was whether driving a subject-specific neuromusculoskeletal model with muscle synergy controls (5 per leg) facilitates generation of accurate walking predictions compared to a model driven by muscle activation controls (35 per leg) or joint torque controls (5 per leg). To explore this question, we developed a subject-specific neuromusculoskeletal model of a single high-functioning hemiparetic subject using instrumented treadmill walking data collected at the subject’s self-selected speed of 0.5 m/s. The model included subject-specific representations of lower-body kinematic structure, foot–ground contact behavior, electromyography-driven muscle force generation, and neural control limitations and remaining capabilities. Using direct collocation optimal control and the subject-specific model, we evaluated the ability of the three control approaches to predict the subject’s walking kinematics and kinetics at two speeds (0.5 and 0.8 m/s) for which experimental data were available from the subject. We also evaluated whether synergy controls could predict a physically realistic gait period at one speed (1.1 m/s) for which no experimental data were available. All three control approaches predicted the subject’s walking kinematics and kinetics (including ground reaction forces) well for the model calibration speed of 0.5 m/s. However, only activation and synergy controls could predict the
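
    As a sketch of how synergy controls are commonly derived, the snippet below factorizes a (here random, placeholder) activation matrix into 5 non-negative synergies with scikit-learn; the paper's framework embeds such controls in a direct collocation optimal control problem rather than this standalone decomposition:

      # Non-negative matrix factorization: 35 muscle activations ~= C @ W,
      # with 5 synergy time courses (C) and 5 x 35 synergy weights (W).
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      A = rng.random((500, 35))            # time samples x muscle activations

      model = NMF(n_components=5, init='nndsvda', max_iter=1000)
      C = model.fit_transform(A)           # synergy controls over time
      W = model.components_                # muscle weightings per synergy
      vaf = 1 - np.linalg.norm(A - C @ W)**2 / np.linalg.norm(A)**2
      print("variance accounted for: %.3f" % vaf)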

  4. Three-dimensional computational modeling of subject-specific cerebrospinal fluid flow in the subarachnoid space.

    Science.gov (United States)

    Gupta, Sumeet; Soellinger, Michaela; Boesiger, Peter; Poulikakos, Dimos; Kurtcuoglu, Vartan

    2009-02-01

    This study aims at investigating three-dimensional subject-specific cerebrospinal fluid (CSF) dynamics in the inferior cranial space, the superior spinal subarachnoid space (SAS), and the fourth cerebral ventricle using a combination of a finite-volume computational fluid dynamics (CFD) approach and magnetic resonance imaging (MRI) experiments. An anatomically accurate 3D model of the entire SAS of a healthy volunteer was reconstructed from high-resolution T2-weighted MRI data. Subject-specific pulsatile velocity boundary conditions were imposed at planes in the pontine cistern, cerebellomedullary cistern, and in the spinal subarachnoid space. Velocimetric MRI was used to measure the velocity field at these boundaries. A constant pressure boundary condition was imposed at the interface between the aqueduct of Sylvius and the fourth ventricle. The morphology of the SAS with its complex trabecula structures was taken into account through a novel porous media model with anisotropic permeability. The governing equations were solved using finite-volume CFD. We observed a total pressure variation from -42 Pa to 40 Pa within one cardiac cycle in the investigated domain. Maximum CSF velocities of about 15 cm/s occurred in the inferior section of the aqueduct, 14 cm/s in the left foramen of Luschka, and 9 cm/s in the foramen of Magendie. Flow velocities in the right foramen of Luschka were found to be significantly lower than in the left, indicating three-dimensional brain asymmetries. The flow in the cerebellomedullary cistern was found to be relatively diffusive with a peak Reynolds number (Re) = 72, while the flow in the pontine cistern was primarily convective with a peak Re = 386. The net volumetric flow rate in the spinal canal was found to be negligible despite CSF oscillation with substantial amplitude, with a maximum volumetric flow rate of 109 ml/min. The observed transient flow patterns indicate a compliant behavior of the cranial subarachnoid space. Still, the estimated
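
    For orientation, the flow-regime numbers quoted above follow from the standard definition Re = ρvD/μ; a back-of-envelope check (Python), with water-like CSF properties and an assumed hydraulic diameter not taken from the paper:

      rho = 1000.0     # kg/m^3, CSF density (approx. water)
      mu = 0.7e-3      # Pa*s, viscosity near body temperature (assumed)
      v = 0.09         # m/s, cf. the ~9 cm/s peak reported above
      D_h = 3.0e-3     # m, assumed hydraulic diameter of the passage

      Re = rho * v * D_h / mu
      print("Re = %.0f" % Re)   # order 10^2, the regime reported in the study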

  5. A Computational Framework to Optimize Subject-Specific Hemodialysis Blood Flow Rate to Prevent Intimal Hyperplasia

    Science.gov (United States)

    Mahmoudzadeh, Javid; Wlodarczyk, Marta; Cassel, Kevin

    2017-11-01

    Development of excessive intimal hyperplasia (IH) in the cephalic vein of renal failure patients who receive chronic hemodialysis treatment results in vascular access failure and multiple treatment complications. Specifically, cephalic arch stenosis (CAS) is known to exacerbate hypertensive blood pressure, thrombosis, and subsequent cardiovascular incidents that would necessitate costly interventional procedures with low success rates. It has been hypothesized that excessive blood flow rate post access maturation, which strongly violates venous homeostasis, is the main hemodynamic factor that orchestrates the onset and development of CAS. In this article, a computational framework based on a strong coupling of computational fluid dynamics (CFD) and shape optimization is proposed that aims to identify the effective blood flow rate on a patient-specific basis that avoids the onset of CAS while providing the adequate blood flow rate required to facilitate hemodialysis. This effective flow rate can be achieved through implementation of Miller's surgical banding method after the maturation of the arteriovenous fistula and is rooted in the relaxation of wall stresses back to a homeostatic target value. The results indicate that this optimized hemodialysis blood flow rate is, in fact, a subject-specific value that can be assessed post vascular access maturation and prior to the initiation of chronic hemodialysis treatment, as a mitigative action against CAS-related access failure. This computational technology can be employed for individualized dialysis treatment.
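
    The optimization target can be illustrated in closed form: for steady Poiseuille flow the wall shear stress is τ = 4μQ/(πR³), so the flow rate restoring a homeostatic target τ* is Q* = τ*πR³/(4μ). A sketch with assumed values (the study itself couples full CFD to shape optimization, so this is only the underlying idea):

      import math

      mu = 3.5e-3         # Pa*s, blood viscosity (assumed)
      R = 3.0e-3          # m, cephalic vein radius (assumed)
      tau_target = 2.5    # Pa, homeostatic wall-shear target (assumed)

      Q = tau_target * math.pi * R**3 / (4 * mu)    # m^3/s
      print("target access flow: %.0f ml/min" % (Q * 1e6 * 60))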

  6. Large-scale subject-specific cerebral arterial tree modeling using automated parametric mesh generation for blood flow simulation.

    Science.gov (United States)

    Ghaffari, Mahsa; Tangen, Kevin; Alaraj, Ali; Du, Xinjian; Charbel, Fady T; Linninger, Andreas A

    2017-12-01

    In this paper, we present a novel technique for automatic parametric mesh generation of subject-specific cerebral arterial trees. This technique generates high-quality and anatomically accurate computational meshes for fast blood flow simulations, extending the scope of 3D vascular modeling to a large portion of the cerebral arterial tree. For this purpose, a parametric meshing procedure was developed to automatically decompose the vascular skeleton, extract geometric features, and generate hexahedral meshes using a body-fitted coordinate system that optimally follows the vascular network topology. To validate the anatomical accuracy of the reconstructed vasculature, we performed statistical analysis to quantify the alignment between parametric meshes and raw vascular images using receiver operating characteristic (ROC) curves. Geometric accuracy evaluation showed agreement, with an area under the curve value of 0.87, between the constructed mesh and raw MRA data sets. Parametric meshing yielded, on average, 36.6% and 21.7% orthogonal and equiangular skew quality improvement over unstructured tetrahedral meshes. The parametric meshing and processing pipeline constitutes an automated technique to reconstruct and simulate blood flow throughout a large portion of the cerebral arterial tree down to the level of pial vessels. This study is the first step towards fast large-scale subject-specific hemodynamic analysis for clinical applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
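
    The validation step can be reproduced in miniature: treat each voxel's membership in the raw vessel mask as the label and the mesh occupancy as the score, then compute the area under the ROC curve. The data below are synthetic stand-ins:

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      y_true = rng.integers(0, 2, 10000)               # raw MRA vessel mask
      y_score = 0.35 * y_true + rng.random(10000)      # mesh occupancy score
      print("AUC = %.2f" % roc_auc_score(y_true, y_score))  # paper: 0.87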

  7. Classifying visuomotor workload in a driving simulator using subject specific spatial brain patterns

    NARCIS (Netherlands)

    Dijksterhuis, Chris; de Waard, Dick; Brookhuis, Karel; Mulder, Ben L. J. M.; de Jong, Ritske

    2013-01-01

    A passive Brain Computer Interface (BCI) is a system that responds to the spontaneously produced brain activity of its user and could be used to develop interactive task support. A human-machine system that could benefit from brain-based task support is the driver-car interaction system. To

  8. Evaluation of a subject-specific female gymnast model and simulation of an uneven parallel bar swing.

    Science.gov (United States)

    Sheets, Alison L; Hubbard, Mont

    2008-11-14

    A gymnast model and forward dynamics simulation of a dismount preparation swing on the uneven parallel bars were evaluated by comparing experimental and predicted joint positions throughout the maneuver. The bar model was a linearly elastic spring with a frictional bar/hand interface, and the gymnast model consisted of torso/head, arm and two leg segments. The hips were frictionless balls and sockets, and shoulder movement was planar with passive compliant structures approximated by a parallel spring and damper. Subject-specific body segment moments of inertia and shoulder compliance were estimated. Muscles crossing the shoulder and hip were represented as torque generators, and experiments quantified maximum instantaneous torques as functions of joint angle and angular velocity. Maximum torques were scaled by joint torque activations as functions of time to produce realistic motions. The downhill simplex method optimized activations and simulation initial conditions to minimize the difference between experimental and predicted bar-center, shoulder, hip, and ankle positions. Comparing experimental and simulated performances allowed evaluation of the bar, shoulder compliance, joint torque, and gymnast models. Errors in all except the gymnast model are random, zero mean, and uncorrelated, verifying that all essential system features are represented. Although the swing simulation using the gymnast model matched experimental joint positions with a 2.15 cm root-mean-squared error, errors are correlated. Correlated errors indicate that the gymnast model is not complex enough to exactly reproduce the experimental motion. Possible model improvements including a nonlinear shoulder model with active translational control and a two-segment torso would not have been identified if the objective function did not evaluate the entire system configuration throughout the motion. The model and parameters presented in this study can be effectively used to understand and improve an uneven
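
    A toy version (Python/SciPy) of the optimization loop described above: the downhill simplex method adjusts parameters of a stand-in forward model to minimize the RMS difference from "measured" positions. The real objective compares bar-center, shoulder, hip and ankle trajectories from the full dynamic model:

      import numpy as np
      from scipy.optimize import minimize

      t = np.linspace(0.0, 1.0, 50)

      def simulate(p):
          # Stand-in for the forward-dynamics swing simulation.
          return p[0] * np.sin(2 * np.pi * t) + p[1] * t

      rng = np.random.default_rng(2)
      measured = simulate([1.3, 0.4]) + rng.normal(0.0, 0.01, t.size)

      def rms_error(p):
          return np.sqrt(np.mean((simulate(p) - measured) ** 2))

      res = minimize(rms_error, x0=[1.0, 0.0], method='Nelder-Mead')
      print("fitted parameters:", res.x, " RMS = %.4f" % res.fun)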

  9. Sensitivity of subject-specific models to Hill muscle-tendon model parameters in simulations of gait

    NARCIS (Netherlands)

    Carbone, Vincenzo; van der Krogt, Marjolein; Koopman, Hubertus F.J.M.; Verdonschot, Nicolaas Jacobus Joseph

    2016-01-01

    Subject-specific musculoskeletal (MS) models of the lower extremity are essential for applications such as predicting the effects of orthopedic surgery. We performed an extensive sensitivity analysis to assess the effects of potential errors in Hill muscle–tendon (MT) model parameters for each of

  10. Sensitivity of subject-specific models to Hill muscle-tendon model parameters in simulations of gait

    NARCIS (Netherlands)

    Carbone, V.; Krogt, M.M. van der; Koopman, H.F.J.M.; Verdonschot, N.J.

    2016-01-01

    Subject-specific musculoskeletal (MS) models of the lower extremity are essential for applications such as predicting the effects of orthopedic surgery. We performed an extensive sensitivity analysis to assess the effects of potential errors in Hill muscle-tendon (MT) model parameters for each of

  11. A novel framework for fluid/structure interaction in rapid subject specific simulations of blood flow in coronary artery bifurcations

    Directory of Open Access Journals (Sweden)

    Blagojević Milan

    2014-01-01

    Full Text Available Background/Aim. Practical difficulties, particularly long model development time, have limited the types and applicability of computational fluid dynamics simulations in numerical modeling of blood flow in a serial manner. In these simulations, the most revealing flow parameters are the endothelial shear stress distribution and the oscillatory shear index. The aim of this study was to analyze their role in the diagnosis of the occurrence and prognosis of plaque development in coronary artery bifurcations. Methods. We developed a novel modeling technique for rapid cardiovascular hemodynamic simulations taking into account interactions between the fluid domain (blood) and the solid domain (artery wall). Two numerical models that represent the observed subdomains of an arbitrary patient-specific coronary artery bifurcation were created using multi-slice computed tomography (MSCT) coronarography and ultrasound measurements of blood velocity. Coronary flow was solved using the in-house finite element solver PAK-FS. Results. Overall behavior of the coronary artery bifurcation during one cardiac cycle is described by: velocity, pressure, endothelial shear stress, oscillatory shear index, stress in the arterial wall, and nodal displacements. The places where (a) endothelial shear stress is less than 1.5 and (b) oscillatory shear index is very small (close or equal to 0) are prone to plaque genesis. Conclusion. Finite element simulation of fluid-structure interaction was used to investigate patient-specific flow dynamics and wall mechanics at coronary artery bifurcations. The simulation model revealed that the lateral walls of the main branch and the lateral walls distal to the carina are exposed to low endothelial shear stress, which is a predilection site for development of atherosclerosis. This conclusion is confirmed by the low values of oscillatory shear index in those places.
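
    The two wall-shear indices named above have simple definitions: time-averaged endothelial shear stress ESS = (1/T)∫|τ|dt and oscillatory shear index OSI = 0.5(1 − |∫τ dt| / ∫|τ|dt). A sketch over a synthetic one-cycle shear trace standing in for the FSI solver output:

      import numpy as np

      t = np.linspace(0.0, 1.0, 200)              # one cardiac cycle (s)
      dt = t[1] - t[0]
      tau = 1.2 * np.sin(2 * np.pi * t) + 0.3     # synthetic wall shear (Pa)

      signed = np.sum(tau * dt)                   # integral of tau dt
      magnitude = np.sum(np.abs(tau) * dt)        # integral of |tau| dt
      osi = 0.5 * (1.0 - abs(signed) / magnitude)
      ess = magnitude / (t[-1] - t[0])            # time-averaged |tau| (Pa)
      print("ESS = %.2f Pa, OSI = %.2f" % (ess, osi))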

  12. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    Science.gov (United States)

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
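
    The forward-kinematics step is the familiar chaining of joint rotations from segment to segment. A 2D sketch for one finger (the paper's models are full 3D meshes; lengths and angles below are assumptions):

      import numpy as np

      def rot(theta):
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s], [s, c]])

      lengths = [0.045, 0.025, 0.020]           # proximal/middle/distal (m)
      angles = np.radians([30.0, 40.0, 20.0])   # flexion at each joint

      tip = np.zeros(2)
      R = np.eye(2)
      for L, a in zip(lengths, angles):
          R = R @ rot(-a)                       # accumulate rotation down the chain
          tip = tip + R @ np.array([L, 0.0])
      print("fingertip position (m):", tip)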

  13. Effect of blasts on subject-specific computational models of skin and bone sections at various locations on the human body

    Directory of Open Access Journals (Sweden)

    Arnab Chanda

    2015-11-01

    Full Text Available Blast injuries are very common among soldiers deployed in politically unstable regions such as Afghanistan and Iraq, and also on battlefields anywhere in the world. Understanding the mechanics of interaction of blasts with the skin and bone at various parts of the human body is the key to designing effective personal protective equipment (PPE) which can mitigate blast impacts. In the current work, subject-specific 3D computational models of the skin (with its three layers, namely the epidermis, dermis and hypodermis), muscles, and bone sections from various parts of the human body (such as the elbow, finger, wrist, cheek bone, forehead, shin etc.) have been developed to study the effect of blast loading. Non-linear material properties have been adopted for the skin, and stress impulses at the different skin layers and bone sections are estimated. To date, such an extensive study on the effect of blast loading on the human skin and bone has not been attempted. The results of this study would be indispensable for medical practitioners to understand the effect of blast trauma and plan effective post-traumatic surgical strategies, and also for developing better PPE designs for the military in the future.

  14. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  15. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  16. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray

  17. Reversible simulation of irreversible computation

    Science.gov (United States)

    Li, Ming; Tromp, John; Vitányi, Paul

    1998-09-01

    Computer computations are generally irreversible while the laws of physics are reversible. This mismatch is penalized by among other things generating excess thermic entropy in the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple ‘reversible’ pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
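
    The space-time tradeoff in Bennett's strategy is easy to tabulate: to pebble position 2^n the game pebbles the midpoint, recurses on the second half, then unpebbles the first half, giving 3^n segment simulations with only n+1 pebbles. A counting sketch:

      def bennett(n):
          """(segment_steps, max_pebbles) to reversibly reach segment 2**n."""
          if n == 0:
              return 1, 1                    # simulate one segment directly
          t_half, p_half = bennett(n - 1)
          # forward first half, recurse on second half, undo first half
          return 3 * t_half, p_half + 1

      for n in range(1, 6):
          t, p = bennett(n)
          print("2^%d = %3d segments: %4d steps, %d pebbles" % (n, 2**n, t, p))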

  18. Fel simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method of accelerating and parallelizing code

  19. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  20. Medical Image Processing for Fully Integrated Subject Specific Whole Brain Mesh Generation

    Directory of Open Access Journals (Sweden)

    Chih-Yang Hsu

    2015-05-01

    control in virtual reality. Subject-specific computational meshes are also a prerequisite for computer simulations of cerebral hemodynamics and the effects of traumatic brain injury.

  1. Identification of the Subject-Specific Parameters of a Hill-type Muscle-tendon Model for Simulations of Human Motion (Identificatie van subject-specifieke parameters van een Hill-type spier-pees model voor de simulatie van menselijke beweging)

    OpenAIRE

    Van Campen, Anke

    2014-01-01

    This thesis contributes to subject-specific modeling in biomechanical analysis by (i) designing an experimental setup to obtain a more accurate subject-specific angle-moment relationship of the knee joint (chapters 4 and 5), (ii) developing an algorithm for the estimation of the muscle-tendon parameters of the actuators of the knee joint in a simulation environment and comparing its performance to the performance of the algorithm of Garner and Pandy (2003), and (iii) validating the outcom...

  2. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  3. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  4. Airflow in a Multiscale Subject-Specific Breathing Human Lung Model

    CERN Document Server

    Choi, Jiwoong; Hoffman, Eric A; Tawhai, Merryn H; Lin, Ching-Long

    2013-01-01

    The airflow in a subject-specific breathing human lung is simulated with a multiscale computational fluid dynamics (CFD) lung model. The three-dimensional (3D) airway geometry beginning from the mouth to about 7 generations of airways is reconstructed from the multi-detector row computed tomography (MDCT) image at the total lung capacity (TLC). Along with the segmented lobe surfaces, we can build an anatomically-consistent one-dimensional (1D) airway tree spanning over more than 20 generations down to the terminal bronchioles, which is specific to the CT resolved airways and lobes (J Biomech 43(11): 2159-2163, 2010). We then register two lung images at TLC and the functional residual capacity (FRC) to specify subject-specific CFD flow boundary conditions and deform the airway surface mesh for a breathing lung simulation (J Comput Phys 244:168-192, 2013). The 1D airway tree bridges the 3D CT-resolved airways and the registration-derived regional ventilation in the lung parenchyma, thus a multiscale model. Larg...

  5. Biomass Gasifier for Computer Simulation; Biomassaförgasare för Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easily accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user can easily implement the data in their specific model. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifiers within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve the simulations performed. There have been fewer answers to the survey than hoped for, which would have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.

  6. Evolutionary Games and Computer Simulations

    CERN Document Server

    Huberman, B A; Huberman, Bernardo A.; Glance, Natalie S.

    1993-01-01

    Abstract: The prisoner's dilemma has long been considered the paradigm for studying the emergence of cooperation among selfish individuals. Because of its importance, it has been studied through computer experiments as well as in the laboratory and by analytical means. However, there are important differences between the way a system composed of many interacting elements is simulated by a digital machine and the manner in which it behaves when studied in real experiments. In some instances, these disparities can be marked enough so as to cast doubt on the implications of cellular automata type simulations for the study of cooperation in social systems. In particular, if such a simulation imposes space-time granularity, then its ability to describe the real world may be compromised. Indeed, we show that the results of digital simulations regarding territoriality and cooperation differ greatly when time is discrete as opposed to continuous.
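
    The granularity issue can be made concrete with the classic Nowak-May spatial prisoner's dilemma, where each site copies the strategy of its best-scoring neighbor. The sketch below switches between synchronous (discrete-time) and random-sequential updating; the payoff b and lattice size are assumed values:

      import numpy as np

      rng = np.random.default_rng(3)
      N, b = 30, 1.7                 # lattice size, defector's temptation payoff
      S = (rng.random((N, N)) < 0.6).astype(int)   # 1 = cooperate, 0 = defect

      def payoff(S):
          P = np.zeros_like(S, dtype=float)
          for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
              nb = np.roll(S, (dx, dy), axis=(0, 1))
              P += np.where(S == 1, nb * 1.0, nb * b)  # C earns 1, D earns b vs C
          return P

      def best(S, P, i, j):
          score, strat = P[i, j], S[i, j]
          for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
              x, y = (i + dx) % N, (j + dy) % N
              if P[x, y] > score:
                  score, strat = P[x, y], S[x, y]
          return strat

      def step(S, synchronous=True):
          if synchronous:                    # all sites update simultaneously
              P = payoff(S)
              return np.array([[best(S, P, i, j) for j in range(N)]
                               for i in range(N)])
          for _ in range(N * N):             # random-sequential approximation
              i, j = rng.integers(N, size=2)
              S[i, j] = best(S, payoff(S), i, j)
          return S

      for _ in range(20):
          S = step(S, synchronous=True)
      print("cooperator fraction:", S.mean())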

  7. Computer simulation of martensitic transformations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ping [Univ. of California, Berkeley, CA (United States)

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.

  8. Computer Simulations of Space Plasmas

    Science.gov (United States)

    Goertz, C. K.

    Even a superficial scanning of the latest issues of the Journal of Geophysical Research reveals that numerical simulation of space plasma processes is an active and growing field. The complexity and sophistication of numerically produced “data” rivals that of the real stuff. Sometimes numerical results need interpretation in terms of a simple “theory,” very much as the results of real experiments and observations do. Numerical simulation has indeed become a third independent tool of space physics, somewhere between observations and analytic theory. There is thus a strong need for textbooks and monographs that report the latest techniques and results in an easily accessible form. This book is an attempt to satisfy this need. The editors want it not only to be “proceedings of selected lectures (given) at the first ISSS (International School of Space Simulations in Kyoto, Japan, November 1-2, 1982) but rather…a form of textbook of computer simulations of space plasmas.” This is, of course, a difficult task when many authors are involved. Unavoidable redundancies and differences in notation may confuse the beginner. Some important questions, like numerical stability, are not discussed in sufficient detail. The recent book by C.K. Birdsall and A.B. Langdon (Plasma Physics via Computer Simulations, McGraw-Hill, New York, 1985) is more complete and detailed and seems more suitable as a textbook for simulations. Nevertheless, this book is useful to the beginner and the specialist because it contains not only descriptions of various numerical techniques but also many applications of simulations to space physics phenomena.

  9. Computer simulation of electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Sabchevski, S.P.; Mladenov, G.M. (Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. po Elektronika)

    1994-04-14

    Self-fields and forces as well as the local degree of space-charge neutralization in overcompensated electron beams are considered. The radial variation of the local degree of space-charge neutralization is analysed. A novel model which describes the equilibrium potential distribution in overcompensated beams is proposed and a method for computer simulation of the beam propagation is described. Results from numerical experiments which illustrate the propagation of finite emittance overneutralized beams are presented. (Author).

  10. Computer simulation of nonequilibrium processes

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  11. Computer simulation of superionic fluorides

    CERN Document Server

    Castiglione, M

    2000-01-01

    In this thesis the nature of ion mobility in cryolite and lead fluoride based compounds is investigated by computer simulation. The phase transition of cryolite is characterised in terms of rotation of AlF₆ octahedra, and the conductive properties are shown to result from diffusion of the sodium ions. The two processes appear to be unrelated. Very good agreement with NMR experimental results is found. The Pb²⁺ ion has a very high polarisability, yet treatment of this property in previous simulations has been problematic. In this thesis a mor… experimentally gives an indication of the correlations between nearby defects is well-reproduced. The most stringent test of simulation model transferability is presented in the studies of lead tin fluoride, in which significant 'covalent' effects are apparent. Other similarly-structured compounds are also investigated, and the reasons behind the adoption of such an unusual layered structure, and the mobility and site occupation of the anions, are quantified.

  12. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulators of computer systems are software-based, running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches of using FPGAs to accelerate software-implemented simulation of computer systems and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  13. Priority Queues for Computer Simulations

    Science.gov (United States)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, the highest priority item is removed, and the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
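
    A minimal rendering of the Qheap behavior described above: pushes land on an unsorted pending list, and only when an item is needed is that list sorted, its head returned, and the remainder folded into the main structure (Python lists and heapq stand in for the patent's linked lists):

      import heapq

      class QHeap:
          def __init__(self):
              self._main = []        # heap-ordered main structure
              self._pending = []     # temporary unsorted insert list

          def push(self, priority, item):
              self._pending.append((priority, item))   # O(1), defer all sorting

          def pop(self):
              if self._pending:
                  self._pending.sort()         # sort only when an item is needed
                  for entry in self._pending:  # fold remainder into the main heap
                      heapq.heappush(self._main, entry)
                  self._pending.clear()
              return heapq.heappop(self._main)

      q = QHeap()
      for time, event in [(5.0, "arrive"), (1.0, "start"), (3.0, "depart")]:
          q.push(time, event)
      print(q.pop())    # (1.0, 'start')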

  14. Achilles tendon stress is more sensitive to subject-specific geometry than subject-specific material properties: A finite element analysis.

    Science.gov (United States)

    Hansen, Wencke; Shim, Vickie B; Obst, Steven; Lloyd, David G; Newsham-West, Richard; Barrett, Rod S

    2017-05-03

    This study used subject-specific measures of three-dimensional (3D) free Achilles tendon geometry in conjunction with a finite element method to investigate the effect of variation in subject-specific geometry and subject-specific material properties on tendon stress during submaximal isometric loading. Achilles tendons of eight participants (aged 25-35 years) were scanned with freehand 3D ultrasound at rest and during a 70% maximum voluntary isometric contraction. Ultrasound images were segmented, volume rendered and transformed into subject-specific 3D finite element meshes. The mean (±SD) lengths, volumes and cross-sectional areas of the tendons at rest were 62 ± 13 mm, 3617 ± 984 mm³ and 58 ± 11 mm² respectively. The measured tendon strain at 70% MVIC was 5.9 ± 1.3%. Subject-specific material properties were obtained using an optimisation approach that minimised the difference between measured and modelled longitudinal free tendon strain. Generic geometry was represented by the average mesh and generic material properties were taken from the literature. Local stresses were subsequently computed for combinations of subject-specific and generic geometry and material properties. For a given geometry, changing from generic to subject-specific material properties had little effect on the stress distribution in the tendon. In contrast, changing from generic to subject-specific geometry had a 26-fold greater effect on tendon stress distribution. Overall, these findings indicate that the stress distribution experienced by the living free Achilles tendon of a young and healthy population during voluntary loading is more sensitive to variation in tendon geometry than variation in tendon material properties. Copyright © 2017 Elsevier Ltd. All rights reserved.
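
    An order-of-magnitude companion to the numbers above (the tendon force at 70% MVIC is not reported in the abstract; 2.6 kN is an assumed illustrative value): mean stress follows from σ = F/A, and an effective modulus from the measured strain:

      F = 2600.0        # N, assumed tendon force at 70% MVIC
      A = 58e-6         # m^2, mean cross-sectional area (58 mm²)
      strain = 0.059    # measured longitudinal strain (5.9%)

      stress = F / A                 # mean tendon stress (Pa)
      E = stress / strain            # effective linear modulus (Pa)
      print("stress = %.0f MPa, E = %.2f GPa" % (stress / 1e6, E / 1e9))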

  15. QCE : A Simulator for Quantum Computer Hardware

    NARCIS (Netherlands)

    Michielsen, Kristel; Raedt, Hans De

    2003-01-01

    The Quantum Computer Emulator (QCE) described in this paper consists of a simulator of a generic, general purpose quantum computer and a graphical user interface. The latter is used to control the simulator, to define the hardware of the quantum computer and to debug and execute quantum algorithms.

  16. Computer Simulation in Tomorrow's Schools.

    Science.gov (United States)

    Foster, David

    1984-01-01

    Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…

  17. Discrete Event Simulation

    Indian Academy of Sciences (India)

    IAS Admin

    Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012. Email: mjt@csa.iisc.ernet.in. Computers can be used to simulate the operation of complex systems and thereby study their performance. This article introduces you to the technique of discrete event simulation through a simple ...

  18. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays several frameworks exist for utilizing the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level and need a lot of controlling code, or are bound only to specific graphics cards. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is adapted for use in multi-agent simulations, where it provides an option to accelerate computations when preparing a simulation and, mainly, to accelerate the computation of the simulation itself.

  19. Efficient SDH Computation In Molecular Simulations Data.

    Science.gov (United States)

    Tu, Yi-Cheng; Chen, Shaoping; Pandit, Sagar; Kumar, Anand; Grupcev, Vladimir

    2012-10-01

    Analysis of large particle or molecular simulation data is an integral part of basic-science research. It often involves computing functions such as point-to-point interactions of particles. The spatial distance histogram (SDH) is one such vital computation in scientific discovery. SDH is frequently used to compute the Radial Distribution Function (RDF), and it takes quadratic time to compute using a naive approach. Naive SDH computation is even more expensive as it is computed continuously over a certain period of time to analyze simulation systems. Tree-based SDH computation is a popular approach. In this paper we look at different tree-based SDH computation techniques and briefly discuss their performance. We present different strategies to improve the performance of these techniques. Specifically, we study the density map (DM) based SDH computation techniques. A DM is essentially a grid dividing the simulated space into cells (3D cubes) of equal size (volume), which can be easily implemented by augmenting a Quad-tree (or Oct-tree) index. DMs are used in various configurations to compute SDH continuously over snapshots of the simulation system. The performance improvements using some of these configurations are presented in this paper. We also present the effect of utilizing the computation power of Graphics Processing Units (GPUs) in computing SDH.
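
    For reference, the baseline that the tree- and DM-based methods improve on is the naive quadratic-time SDH: histogram all pairwise distances with fixed-width buckets. A sketch with random stand-in particle coordinates:

      import numpy as np
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(4)
      pts = rng.random((2000, 3)) * 100.0     # particles in a 100^3 box

      d = pdist(pts)                          # all N(N-1)/2 distances: O(N^2)
      hist, edges = np.histogram(d, bins=np.arange(0.0, 175.0, 5.0))
      print(hist[:5])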

  20. Computer Simulation of the Neuronal Action Potential.

    Science.gov (United States)

    Solomon, Paul R.; And Others

    1988-01-01

    A series of computer simulations of the neuronal resting and action potentials are described. Discusses the use of simulations to overcome the difficulties of traditional instruction, such as blackboard illustration, which can only illustrate these events at one point in time. Describes systems requirements necessary to run the simulations.…

  1. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can be used to provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  2. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
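
    The remodeling rule sketched below captures the iteration described above: each element's density is nudged toward equilibrium between its strain-energy-density stimulus and a reference value. The FE solve is replaced by a placeholder, and the rate constant, reference stimulus and density bounds are assumed:

      import numpy as np

      rng = np.random.default_rng(5)
      rho = np.full(1000, 0.8)       # homogeneous starting density (g/cm^3)
      B, k = 1.0, 0.004              # remodeling rate, reference stimulus

      def strain_energy_density(rho):
          # Placeholder for the FE solve under muscle and joint loads.
          return 0.004 * rho * (1.0 + 0.5 * rng.standard_normal(rho.size))

      for iteration in range(10):    # sequential load iterations, as above
          U = strain_energy_density(rho)
          rho += B * (U / rho - k)   # adapt toward the reference stimulus
          rho = np.clip(rho, 0.01, 1.8)
      print("density range: %.2f - %.2f" % (rho.min(), rho.max()))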

  3. Subject-specific planning of femoroplasty: a combined evolutionary optimization and particle diffusion model approach.

    Science.gov (United States)

    Basafa, Ehsan; Armand, Mehran

    2014-07-18

    A potentially effective treatment for prevention of osteoporotic hip fractures is augmentation of the mechanical properties of the femur by injecting it with agents such as polymethylmethacrylate (PMMA) bone cement - femoroplasty. The operation, however, is still at the research stage and can benefit substantially from computer planning and optimization. We report the results of computational planning and optimization of the procedure for biomechanical evaluation. An evolutionary optimization method was used to optimally place the cement in finite element (FE) models of seven osteoporotic bone specimens. The optimization, with some inter-specimen variations, suggested that areas close to the cortex in the superior and inferior of the neck and the supero-lateral aspect of the greater trochanter will benefit from augmentation. We then used a particle-based model for bone cement diffusion simulation to match the optimized pattern, taking into account the limitations of the actual surgery, including limited volume of injection to prevent thermal necrosis. Simulations showed that the yield load can be significantly increased by more than 30%, using only 9 ml of bone cement. This increase is comparable to previous literature reports where gross filling of the bone was employed instead, using more than 40 ml of cement. These findings, along with the differences in the optimized plans between specimens, emphasize the need for subject-specific models for effective planning of femoral augmentation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. FEL Simulation Using Distributed Computing

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, Joshua [Fermilab; Bernabeu Altayo, Gerard [Fermilab; Biedron, Sandra [Ljubljana U.; Freund, Henry [Colorado State U., Fort Collins; Milton, Stephen [Colorado State U., Fort Collins; van der Slot, Peter [Colorado State U., Fort Collins

    2016-06-01

    While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing opens up new opportunities. This poster highlights a method of accelerating and parallelizing code processing through the use of COTS software interfaces.

  5. Micro-computer simulation software: A review

    Directory of Open Access Journals (Sweden)

    P.S. Kruger

    2003-12-01

    Full Text Available Simulation modelling has proved to be one of the most powerful tools available to the Operations Research Analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper will attempt to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis will be on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages. Such comparisons may be found elsewhere.

  6. Computer simulation in physics and engineering

    CERN Document Server

    Steinhauser, Martin Oliver

    2013-01-01

    This work is a needed reference for widely used techniques and methods of computer simulation in physics and other disciplines, such as materials science. The work conveys both the theoretical foundations of computer simulation and its applications and "tricks of the trade", which are often scattered across various papers. Thus it will meet a need and fill a gap for every scientist who needs computer simulations for the task at hand. In addition to being a reference, case studies and exercises for use as course reading are included.

  7. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I.

    1997-01-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements.
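
    The particle-side step the abstract describes (superimposing a Langevin equation on a precomputed flow field) looks schematically like this; the velocity field is a placeholder and all particle constants are assumed values:

      import numpy as np

      kB, T = 1.38e-23, 293.0     # Boltzmann constant (J/K), temperature (K)
      m = 5.2e-16                 # kg, mass of a ~0.5 um particle (assumed)
      tau_p = 3.0e-6              # s, particle relaxation time (assumed)
      dt = 1.0e-7                 # s, time step
      rng = np.random.default_rng(6)

      def u_field(x):
          # Placeholder for the interpolated Navier-Stokes air velocity.
          return np.array([0.1, 0.0, 0.0])   # m/s

      x, v = np.zeros(3), np.zeros(3)
      sigma = np.sqrt(2 * kB * T / (m * tau_p))   # Brownian forcing amplitude
      for _ in range(1000):
          # Drag relaxation toward the local air velocity (inertia) plus a
          # Brownian kick; capture by interception would be checked against
          # the fiber geometry along the integrated trajectory.
          v += (u_field(x) - v) / tau_p * dt \
               + sigma * np.sqrt(dt) * rng.standard_normal(3)
          x += v * dt
      print("particle displacement (m):", x)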

  8. Filtration theory using computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Corey, I. [Lawrence Livermore National Lab., CA (United States)

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one, 2-D and two, 3-D fiber matrix elements. 5 refs., 11 figs.

  9. Atlas-Based Automatic Generation of Subject-Specific Finite Element Tongue Meshes.

    Science.gov (United States)

    Bijar, Ahmad; Rohan, Pierre-Yves; Perrier, Pascal; Payan, Yohan

    2016-01-01

    Generation of subject-specific 3D finite element (FE) models requires the processing of numerous medical images in order to precisely extract geometrical information about subject-specific anatomy. This processing remains extremely challenging. To overcome this difficulty, we present an automatic atlas-based method that generates subject-specific FE meshes via a 3D registration guided by Magnetic Resonance images. The method extracts a 3D transformation by registering the atlas' volume image to the subject's one, and establishes a one-to-one correspondence between the two volumes. The 3D transformation field deforms the atlas' mesh to generate the subject-specific FE mesh. To preserve the quality of the subject-specific mesh, a diffeomorphic non-rigid registration based on B-spline free-form deformations is used, which guarantees a non-folding and one-to-one transformation. Two evaluations of the method are provided. First, a publicly available CT database is used to assess the capability to accurately capture the complexity of each subject's lung geometry. Second, FE tongue meshes are generated for two healthy volunteers and two patients suffering from tongue cancer using MR images. It is shown that the method generates an appropriate representation of the subject-specific geometry while preserving the quality of the FE meshes for subsequent FE analysis. To demonstrate the importance of our method in a clinical context, a subject-specific mesh is used to simulate the tongue's biomechanical response to the activation of an important tongue muscle, before and after cancer surgery.

  10. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  11. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05, Sofia, Bulgaria, 20th-22nd October, 2005. On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference in Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and development of the tools for computer simulation directly from their inventors. Contribution describ...

  12. Changes in Predicted Muscle Coordination with Subject-Specific Muscle Parameters for Individuals after Stroke

    Directory of Open Access Journals (Sweden)

    Brian A. Knarr

    2014-01-01

    Full Text Available Muscle weakness is commonly seen in individuals after stroke, characterized by lower forces during a maximal volitional contraction. Accurate quantification of muscle weakness is paramount when evaluating individual performance and response to rehabilitation after stroke. The objective of this study was to examine the effect of subject-specific muscle force and activation deficits on predicted muscle coordination when using musculoskeletal models for individuals after stroke. Maximum force-generating ability and central activation ratio of the paretic plantar flexors, dorsiflexors, and quadriceps muscle groups were obtained using burst superimposition for four individuals after stroke with a range of walking speeds. Two models were created per subject: one with generic and one with subject-specific activation and maximum isometric force parameters. The inclusion of subject-specific muscle data resulted in changes in the model-predicted muscle forces and activations, which agree with previously reported compensation patterns and match more closely the timing of electromyography for the plantar flexor and hamstring muscles. This was the first study to create musculoskeletal simulations of individuals after stroke with subject-specific muscle force and activation data. The results of this study suggest that subject-specific muscle force and activation data enhance the ability of musculoskeletal simulations to accurately predict muscle coordination in individuals after stroke.
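
    A toy numerical illustration of why these parameters matter: in a Hill-type model the active force scales with both the maximum isometric force and the central activation ratio (CAR), so generic values can substantially overestimate paretic muscle output. All numbers below are hypothetical, not data from the study.

        def muscle_force(activation, car, f_max, fl_factor, fv_factor):
            # Active Hill-type force: F = a * CAR * Fmax * f_L * f_V
            # (passive force and tendon dynamics omitted for brevity).
            return activation * car * f_max * fl_factor * fv_factor

        # Hypothetical plantar-flexor values: generic model vs. measured deficits.
        generic = muscle_force(activation=0.9, car=1.00, f_max=3500.0, fl_factor=0.95, fv_factor=1.0)
        paretic = muscle_force(activation=0.9, car=0.75, f_max=2200.0, fl_factor=0.95, fv_factor=1.0)
        print(f"generic {generic:.0f} N vs. subject-specific {paretic:.0f} N")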

  13. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    This is supposed to recall gambling and hence the name Monte Carlo simulation. The procedure was developed by Stanislaw Ulam and John von Neumann. They used the simulation method to solve partial differential equations for diffusion of neutrons! (Box 2). We can illustrate the MC method by a simple example.
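
    The classic simple example alluded to above: estimate pi by sampling uniform points in the unit square and counting the fraction that lands inside the inscribed quarter circle.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        pts = rng.random((n, 2))                   # uniform points in the unit square
        inside = (pts ** 2).sum(axis=1) <= 1.0     # inside the quarter circle of radius 1
        print("pi is approximately", 4.0 * inside.mean())   # error shrinks like 1/sqrt(n)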

  14. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
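
    A toy sketch in the spirit of this abstract, though not the paper's actual construction: deterministic chaos standing in for a random number generator. Thresholding the fully chaotic logistic map gives an asymptotically fair coin, because its invariant density is symmetric about 1/2.

        def chaotic_bits(x=0.123456789, n=10000):
            # Iterate the logistic map x -> 4x(1 - x) and threshold at 0.5;
            # no man-made random number generator is involved.
            bits = []
            for _ in range(n):
                x = 4.0 * x * (1.0 - x)
                bits.append(1 if x > 0.5 else 0)
            return bits

        print(sum(chaotic_bits()) / 10000.0)   # close to 0.5 for almost every seed in (0, 1)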

  15. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  16. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  17. Computer Systems/Database Simulation.

    Science.gov (United States)

    1978-10-15

    defined distribution of inter-arrival times. Hence the process of model building and execution is considerably eased with the help of simulation languages ...the hands of only the data creator need not be forwarded to the data user. This removes both JCL and format difficulties from the user's domain. 3...emulators available on any machine for most source languages.) Lower level languages, such as Assembler or Macro-like code, will always be machine

  18. Computer simulations applied in materials

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This workshop takes stock of the simulation methods applied to nuclear materials and discusses the conditions in which these methods can predict physical results when no experimental data are available. The main topic concerns the radiation effects in oxides and includes also the behaviour of fission products in ceramics, the diffusion and segregation phenomena and the thermodynamical properties under irradiation. This document brings together a report of the previous 2002 workshop and the transparencies of 12 presentations among the 15 given at the workshop: accommodation of uranium and plutonium in pyrochlores; radiation effects in La{sub 2}Zr{sub 2}O{sub 7} pyrochlores; first principle calculations of defects formation energies in the Y{sub 2}(Ti,Sn,Zr){sub 2}O{sub 7} pyrochlore system; an approximate approach to predicting radiation tolerant materials; molecular dynamics study of the structural effects of displacement cascades in UO{sub 2}; composition defect maps for A{sup 3+}B{sup 3+}O{sub 3} perovskites; NMR characterization of radiation damaged materials: using simulation to interpret the data; local structure in damaged zircon: a first principle study; simulation studies on SiC; insertion and diffusion of He in 3C-SiC; a review of helium in silica; self-trapped holes in amorphous silicon dioxide: their short-range structure revealed from electron spin resonance and optical measurements and opportunities for inferring intermediate range structure by theoretical modelling. (J.S.)

  19. Bayesian longitudinal segmentation of hippocampal substructures in brain MRI using subject-specific atlases

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Van Leemput, Koen; Augustinack, Jean

    2016-01-01

    images and computational atlases, automatic segmentation of hippocampal subregions is becoming feasible in MRI scans. Here we introduce a generative model for dedicated longitudinal segmentation that relies on subject-specific atlases. The segmentations of the scans at the different time points...

  20. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  1. Computational Modeling of Simulation Tests.

    Science.gov (United States)

    1980-06-01

    cavity was simulated with a nonrigid, partially reflecting heavy gas (the rigid wall of 905.0021 was replaced with additional cells of ideal gas which...the shock tunnel at the 4.14-MPa range found in calculation 906.1081. The driver consisted of 25 cells of burned ammonium nitrate and fuel oil (ANFO)... [Figure: reflected-wave geometry for calculation 906.1091, with a burned-ANFO driver region, a real-air reaction region, rigid and partially reflecting boundaries, and a cell size of 250 mm.]

  2. Computer Code for Nanostructure Simulation

    Science.gov (United States)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  3. Flow simulation and high performance computing

    Science.gov (United States)

    Tezduyar, T.; Aliabadi, S.; Behr, M.; Johnson, A.; Kalro, V.; Litke, M.

    1996-10-01

    Flow simulation is a computational tool for exploring science and technology involving flow applications. It can provide cost-effective alternatives or complements to laboratory experiments, field tests and prototyping. Flow simulation relies heavily on high performance computing (HPC). We view HPC as having two major components. One is advanced algorithms capable of accurately simulating complex, real-world problems. The other is advanced computer hardware and networking with sufficient power, memory and bandwidth to execute those simulations. While HPC enables flow simulation, flow simulation motivates development of novel HPC techniques. This paper focuses on demonstrating that flow simulation has come a long way and is being applied to many complex, real-world problems in different fields of engineering and applied sciences, particularly in aerospace engineering and applied fluid mechanics. Flow simulation has come a long way because HPC has come a long way. This paper also provides a brief review of some of the recently developed HPC methods and tools that have played a major role in bringing flow simulation to where it is today. A number of 3D flow simulations are presented in this paper as examples of the level of computational capability reached with recent HPC methods and hardware. These examples are: flow around a fighter aircraft, flow around two trains passing in a tunnel, large ram-air parachutes, flow over hydraulic structures, contaminant dispersion in a model subway station, airflow past an automobile, multiple spheres falling in a liquid-filled tube, and dynamics of a paratrooper jumping from a cargo aircraft.

  4. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  5. Computer Simulations of Lipid Nanoparticles

    Directory of Open Access Journals (Sweden)

    Xavier F. Fernandez-Luengo

    2017-12-01

    Full Text Available Lipid nanoparticles (LNP) are promising soft matter nanomaterials for drug delivery applications. In spite of their interest, little is known about the supramolecular organization of the components of these self-assembled nanoparticles. Here, we present a molecular dynamics simulation study, employing the Martini coarse-grain forcefield, of self-assembled LNPs made of tripalmitin lipid in water. We also study the adsorption of Tween 20 surfactant as a protective layer on top of the LNP. We show that, at 310 K (the temperature of interest in biological applications), the structure of the lipid nanoparticles is similar to that of a liquid droplet, in which the lipids show no nanostructuration and have high mobility. We show that, for large enough nanoparticles, the hydrophilic headgroups develop an interior surface in the NP core that stores liquid water. The surfactant is shown to organize in an inhomogeneous way at the LNP surface, with patches with high surfactant concentrations and surface patches not covered by surfactant.
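
    A standard way to quantify the liquid-like core and the surfactant shell reported here is a radial density profile around the nanoparticle's center of mass. A minimal NumPy sketch; the coordinates would come from a trajectory frame, with units assumed to be nm:

        import numpy as np

        def radial_density(coords, center, r_max=6.0, n_bins=60):
            # Number density in spherical shells around the center: a flat interior
            # indicates a droplet-like core; a surfactant shell shows as an outer peak.
            r = np.linalg.norm(coords - center, axis=1)
            counts, edges = np.histogram(r, bins=n_bins, range=(0.0, r_max))
            shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
            return 0.5 * (edges[1:] + edges[:-1]), counts / shell_vol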

  6. Enabling Computational Technologies for Terascale Scientific Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
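
    The multigrid idea is compact enough to sketch: damp the high-frequency error with a cheap smoother, then correct the smooth remainder on a coarser grid. A minimal two-grid cycle for the 1D Poisson problem -u'' = f with zero boundary values; a production preconditioner recurses over many levels and runs in parallel.

        import numpy as np

        def jacobi(u, f, h, sweeps=3, omega=2.0 / 3.0):
            # Weighted Jacobi smoother for -u'' = f (zero Dirichlet boundaries).
            for _ in range(sweeps):
                u[1:-1] += omega * (0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]) - u[1:-1])
            return u

        def two_grid(u, f, h):
            u = jacobi(u, f, h)                                          # pre-smooth
            r = np.zeros_like(u)
            r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2    # residual
            ec = jacobi(np.zeros((len(u) + 1) // 2), r[::2].copy(), 2 * h, sweeps=50)
            u += np.interp(np.arange(len(u)), np.arange(len(u))[::2], ec)  # prolong, correct
            return jacobi(u, f, h)                                       # post-smooth

        n = 64; h = 1.0 / n
        x = np.linspace(0.0, 1.0, n + 1)
        u = np.zeros_like(x)
        for _ in range(20):
            u = two_grid(u, np.pi**2 * np.sin(np.pi * x), h)             # exact: sin(pi x)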

  7. Electric Propulsion Plume Simulations Using Parallel Computer

    Directory of Open Access Journals (Sweden)

    Joseph Wang

    2007-01-01

    Full Text Available A parallel, three-dimensional electrostatic PIC code is developed for large-scale electric propulsion simulations using parallel supercomputers. This code uses a newly developed immersed-finite-element particle-in-cell (IFE-PIC) algorithm designed to handle complex boundary conditions accurately while maintaining the computational speed of the standard PIC code. Domain decomposition is used in both field solve and particle push to divide the computation among processors. Two simulation studies are presented to demonstrate the capability of the code. The first is a full particle simulation of a near-thruster plume using the real ion-to-electron mass ratio. The second is a high-resolution simulation of multiple ion thruster plume interactions for a realistic spacecraft using a domain enclosing the entire solar array panel. Performance benchmarks show that the IFE-PIC achieves a high parallel efficiency of ≥ 90%.
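
    For orientation, the cycle that such codes parallelize is short. Below is a bare-bones 1D electrostatic PIC step in normalized units; the IFE-PIC scheme above replaces the spectral field solve with an immersed finite-element solve and domain-decomposes both the field solve and the particle push.

        import numpy as np

        def pic_step(x, v, qm, grid_n, L, dt):
            dx = L / grid_n
            # 1) deposit charge (nearest grid point) with a neutralizing background
            idx = np.floor(x / dx).astype(int) % grid_n
            rho = np.bincount(idx, minlength=grid_n) / dx
            rho -= rho.mean()
            # 2) solve Poisson spectrally: phi_k = rho_k / k^2, E_k = -i k phi_k
            k = 2 * np.pi * np.fft.fftfreq(grid_n, d=dx)
            k[0] = 1.0                          # avoid divide-by-zero; mode 0 zeroed below
            E_k = -1j * np.fft.fft(rho) / k
            E_k[0] = 0.0                        # mean field vanishes by charge neutrality
            E = np.real(np.fft.ifft(E_k))
            # 3) gather the field at particle cells and push (leapfrog)
            v += qm * E[idx] * dt
            x = (x + v * dt) % L
            return x, v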

  8. Time reversibility, computer simulation, and chaos

    CERN Document Server

    Hoover, William Graham

    1999-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful

  9. Perspective: Computer simulations of long time dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Elber, Ron [Department of Chemistry, The Institute for Computational Engineering and Sciences, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-02-14

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

  10. All Roads Lead to Computing: Making, Participatory Simulations, and Social Computing as Pathways to Computer Science

    Science.gov (United States)

    Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri

    2017-01-01

    Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…

  11. Quantitative computer simulations of extraterrestrial processing operations

    Science.gov (United States)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  12. Teaching university lecturers how to teach subject-specific writing

    OpenAIRE

    Manderstedt, Lena; Palo, Annbritt

    2015-01-01

    Teaching university lecturers how to teach subject-specific writing. Lena Manderstedt, Annbritt Palo. Standards of student literacy are falling, due to an increased number of students described as non-traditional entrants not knowing how to write (Lea & Street, 1998). Extensive research into academic literacy practices has been carried out, including genre pedagogy (Martin, 2009), the effectiveness of feedback (Hattie & Timperley, 2007) and the role of assessment as a key to develop and i...

  13. EEG topographies provide subject-specific correlates of motor control.

    Science.gov (United States)

    Pirondini, Elvira; Coscia, Martina; Minguillon, Jesus; Millán, José Del R; Van De Ville, Dimitri; Micera, Silvestro

    2017-10-16

    Electroencephalography (EEG) of brain activity can be represented in terms of dynamically changing topographies (microstates). Notably, spontaneous brain activity recorded at rest can be characterized by four distinctive topographies. Despite their well-established role during resting state, their implication in the generation of motor behavior is debated. Evidence of such a functional role of spontaneous brain activity would provide support for the design of novel and sensitive biomarkers in neurological disorders. Here we examined whether and to what extent intrinsic brain activity contributes to and plays a functional role in natural motor behaviors. For this, we first extracted subject-specific EEG microstates and muscle synergies during reaching-and-grasping movements in healthy volunteers. We show that, in every subject, well-known resting-state microstates persist during movement execution with similar topographies and temporal characteristics, but are supplemented by novel task-related microstates. We then show that the subject-specific microstates' dynamical organization correlates with the activation of muscle synergies and can be used to decode individual grasping movements with high accuracy. These findings provide the first evidence that spontaneous brain activity encodes detailed information about motor control, offering the prospect of a novel tool for the definition of subject-specific biomarkers of brain plasticity and recovery in neuro-motor disorders.
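
    As a rough sketch of the first analysis step, microstate templates can be extracted by clustering instantaneous, normalized scalp topographies. Note that dedicated microstate pipelines use a polarity-invariant modified k-means; the plain scikit-learn k-means below is only a stand-in.

        import numpy as np
        from sklearn.cluster import KMeans

        def eeg_microstates(eeg, n_states=4):
            # eeg: (n_samples, n_channels). Average-reference and unit-norm each
            # time point so clustering sees only the shape of the topography.
            maps = eeg - eeg.mean(axis=1, keepdims=True)
            maps = maps / np.linalg.norm(maps, axis=1, keepdims=True)
            km = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(maps)
            return km.cluster_centers_, km.labels_   # template maps, per-sample state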

  14. Computer simulation of proton channelling in silicon

    Indian Academy of Sciences (India)

    2000-06-12

    Computer simulation of proton channelling in silicon. N K DEEPAK, K RAJASEKHARAN* and K NEELAKANDAN. Department of Physics, University of Calicut, Malappuram 673 635, India. *Department of Physics, Malabar Christian College, Kozhikode 673 001, India. MS received 11 October 1999; revised ...

  15. Computer simulations of phospholipid - membrane thermodynamic fluctuations

    DEFF Research Database (Denmark)

    Pedersen, U.R.; Peters, Günther H.j.; Schröder, T.B.

    2008-01-01

    This paper reports all-atom computer simulations of five phospholipid membranes, DMPC, DPPC, DMPG, DMPS, and DMPSH, with a focus on the thermal equilibrium fluctuations of volume, energy, area, thickness, and order parameter. For the slow fluctuations at constant temperature and pressure (defined...

  16. Spiking network simulation code for petascale computers

    Science.gov (United States)

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
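
    The published implementation realizes this with C++ template metaprogramming; as a loose Python analogue of the data layout only, each compute node can group its local synapses by type into homogeneous, columnar arrays, paying per-type overhead once rather than once per synapse.

        import numpy as np

        class SynapseStore:
            # Struct-of-arrays storage, grouped by synapse type, on one compute node.
            def __init__(self):
                self.by_type = {}                       # type name -> column dict

            def add(self, syn_type, target, weight, delay):
                cols = self.by_type.setdefault(
                    syn_type, {"target": [], "weight": [], "delay": []})
                cols["target"].append(target)
                cols["weight"].append(weight)
                cols["delay"].append(delay)

            def finalize(self):
                # Freeze lists into compact arrays once network wiring is complete.
                for cols in self.by_type.values():
                    for name in cols:
                        cols[name] = np.asarray(cols[name])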

  18. A parallel computational model for GATE simulations.

    Science.gov (United States)

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently for Positron Emission Tomography (PET) experiments, because it requires a centralized coincidence processing and large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. High-performance computing MRI simulations.

    Science.gov (United States)

    Stöcker, Tony; Vahedipour, Kaveh; Pflugfelder, Daniel; Shah, N Jon

    2010-07-01

    A new open-source software project is presented, JEMRIS, the Jülich Extensible MRI Simulator, which provides an MRI sequence development and simulation environment for the MRI community. The development was driven by the desire to achieve generality of simulated three-dimensional MRI experiments reflecting modern MRI systems hardware. The accompanying computational burden is overcome by means of parallel computing. Many aspects are covered that have not hitherto been simultaneously investigated in general MRI simulations such as parallel transmit and receive, important off-resonance effects, nonlinear gradients, and arbitrary spatiotemporal parameter variations at different levels. The latter can be used to simulate various types of motion, for instance. The JEMRIS user interface is very simple to use, but nevertheless it presents few limitations. MRI sequences with arbitrary waveforms and complex interdependent modules are modeled in a graphical user interface-based environment requiring no further programming. This manuscript describes the concepts, methods, and performance of the software. Examples of novel simulation results in active fields of MRI research are given. (c) 2010 Wiley-Liss, Inc.
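
    At the core of any such simulator is per-isochromat integration of the Bloch equations. A minimal sketch of free precession with relaxation and off-resonance follows; JEMRIS layers arbitrary sequences, parallel transmit/receive, and nonlinear gradients on top of updates like this.

        import numpy as np

        def bloch_free_precession(m0, t, df, t1=1.0, t2=0.08):
            # Analytic Bloch solution over time t: precession at off-resonance df (Hz),
            # transverse decay with T2, longitudinal regrowth with T1 (M0 = 1).
            e1, e2 = np.exp(-t / t1), np.exp(-t / t2)
            phi = 2.0 * np.pi * df * t
            rz = np.array([[np.cos(phi), -np.sin(phi), 0.0],
                           [np.sin(phi),  np.cos(phi), 0.0],
                           [0.0, 0.0, 1.0]])
            m = (rz @ m0) * np.array([e2, e2, e1])
            m[2] += 1.0 - e1
            return m

        # After an ideal 90-degree pulse the magnetization lies along y:
        m = bloch_free_precession(np.array([0.0, 1.0, 0.0]), t=0.01, df=50.0)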

  20. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  1. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  2. Computational Challenges in Nuclear Weapons Simulation

    Energy Technology Data Exchange (ETDEWEB)

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  3. Computer Simulation for Emergency Incident Management

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  4. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  5. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. The simulation sheds light on issues that are not amenable to analytical solutions, such as the spectral content of the wave forms, cross talk in three-beam interaction, and the range of applications of the band-transport model. (C) 1998 Optical Society of America.

  6. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  7. Computer simulation of molecular sorption in zeolites

    CERN Document Server

    Calmiano, M D

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows, in which an introduction to sorption in zeolites is presented, with discussion of the structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low-energy sorption sites of krypton in silicalite, where we observe krypton to preferentially sorb into straight and sinusoidal channels over channel intersections. We simulate single-step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain diffusion coefficients and the activation energy. We compare our results to previous experimental and computat...
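
    The diffusion analysis mentioned here is conventionally done with the Einstein relation, fitting the long-time slope of the mean-squared displacement (MSD = 6Dt in 3D); repeating the fit at several temperatures then yields the activation energy from an Arrhenius plot of ln D against 1/T. A NumPy sketch, assuming unwrapped coordinates:

        import numpy as np

        def diffusion_coefficient(positions, dt):
            # positions: (n_frames, n_atoms, 3) unwrapped trajectory; dt: frame spacing.
            n = positions.shape[0]
            lags = np.arange(1, n // 2)
            msd = np.array([np.mean(np.sum((positions[lag:] - positions[:-lag]) ** 2,
                                           axis=-1)) for lag in lags])
            slope = np.polyfit(lags * dt, msd, 1)[0]   # fit MSD = 6 D t + intercept
            return slope / 6.0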

  8. Understanding membrane fouling mechanisms through computational simulations

    Science.gov (United States)

    Xiang, Yuan

    This dissertation focuses on a computational simulation study of the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which are widely used in industry for water purification. The research shows that, by establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of the membrane fouling mechanism. This knowledge is critical for providing a strategic plan for the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. This dissertation comprises three major research components: (1) development of realistic molecular models that represent the membrane surface properties well; (2) investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) studies of the interactions of foulants with surface-modified membranes (polyethylene glycol) to provide antifouling strategies.

  9. Computational plasticity algorithm for particle dynamics simulations

    Science.gov (United States)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.

  10. Computer simulations and theory of protein translocation.

    Science.gov (United States)

    Makarov, Dmitrii E

    2009-02-17

    The translocation of proteins through pores is central to many biological phenomena, such as mitochondrial protein import, protein degradation, and delivery of protein toxins to their cytosolic targets. Because proteins typically have to pass through constrictions that are too narrow to accommodate folded structures, translocation must be coupled to protein unfolding. The simplest model that accounts for such co-translocational unfolding assumes that both translocation and unfolding are accomplished by pulling on the end of the polypeptide chain mechanically. In this Account, we describe theoretical studies and computer simulations of this model and discuss how the time scales of translocation depend on the pulling force and on the protein structure. Computationally, this is a difficult problem because biologically or experimentally relevant time scales of translocation are typically orders of magnitude slower than those accessible by fully atomistic simulations. For this reason, we explore one-dimensional free energy landscapes along suitably defined translocation coordinates and discuss various approaches to their computation. We argue that the free energy landscape of translocation is often bumpy because confinement partitions the protein's configuration space into distinct basins of attraction separated by large entropic barriers. Favorable protein-pore interactions and nonnative interactions within the protein further contribute to the complexity. Computer simulations and simple scaling estimates show that forces of just 2-6 pN are often sufficient to ensure transport of unstructured polypeptides, whereas much higher forces are typically needed to translocate folded protein domains. The unfolding mechanisms found from simulations of translocation are different from those observed in the much better understood case of atomic force microscopy (AFM) pulling studies, in which proteins are unraveled by stretching them between their N- and C-termini. In contrast to
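
    The 1D landscape picture admits a worked example: for diffusion coefficient D along the translocation coordinate, the mean first-passage time from a reflecting boundary at a to an absorbing one at b over a free-energy profile F(x) is tau = (1/D) * int_a^b dx exp(F(x)/kT) * int_a^x dy exp(-F(y)/kT). A NumPy sketch with an illustrative Gaussian barrier:

        import numpy as np

        def mfpt(x, f_kT, d):
            # Cumulative trapezoid of exp(-F/kT) gives the inner integral at every x.
            inner = np.concatenate(([0.0], np.cumsum(
                0.5 * (np.exp(-f_kT[1:]) + np.exp(-f_kT[:-1])) * np.diff(x))))
            return np.trapz(np.exp(f_kT) * inner, x) / d

        # A 3 kT Gaussian barrier slows passage by roughly e^3 over free diffusion.
        x = np.linspace(0.0, 1.0, 2001)
        print(mfpt(x, 3.0 * np.exp(-((x - 0.5) / 0.1) ** 2), d=1.0))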

  11. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated extraction of relevant features. Finally, we intend to share preliminary computer simulation issues.

  12. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx. 10% network overhead.

  13. A Subject-Specificity Analysis of Radio Channels in Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Yang Hao

    2011-07-01

    Full Text Available This paper presents an analysis of subject-specific radio channels in wireless body area networks (WBANs) using a simulation tool based on the parallel finite-difference time-domain (FDTD) technique. This technique is well suited to model radio propagation around complex, inhomogeneous objects such as the human body. The impacts of different subjects varying in size on on-body, inter-body, and off-body radio channels are studied. The analysis demonstrates that the characteristics of on-body radio channels are subject-specific and are associated with human gender, height, and body mass index. On the other hand, when waves propagate away from the body, such as in the inter-body and off-body cases, the impacts of different subjects on the channel characteristics are found to be negligible.
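
    The FDTD core is a short leapfrog update; below is a bare-bones 1D sketch in normalized units at the magic time step. The solver above is the parallel 3D version of the same update, with inhomogeneous tissue parameters assigned to the voxelized body model.

        import numpy as np

        def fdtd_1d(n_cells=400, n_steps=1000, src=50):
            ez = np.zeros(n_cells)       # electric field
            hy = np.zeros(n_cells)       # magnetic field, staggered half a cell
            for t in range(n_steps):
                hy[:-1] += np.diff(ez)                          # H update (free space)
                ez[1:] += np.diff(hy)                           # E update
                ez[src] += np.exp(-((t - 30.0) / 10.0) ** 2)    # soft Gaussian source
            return ez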

  14. Computer simulation of electrokinetics in colloidal systems

    Science.gov (United States)

    Schmitz, R.; Starchenko, V.; Dünweg, B.

    2013-11-01

    The contribution gives a brief overview outlining how our theoretical understanding of the phenomenon of colloidal electrophoresis has improved over the decades. Particular emphasis is put on numerical calculations and computer simulation models, which have become more and more important as the level of description became more detailed and refined. Due to computational limitations, it has so far not been possible to study "perfect" models. Different complementary models have hence been developed, and their various strengths and deficiencies are briefly discussed. This is contrasted with the experimental situation, where there are still observations waiting for theoretical explanation. The contribution then outlines our recent development of a numerical method to solve the electrokinetic equations for a finite volume in three dimensions, and describes some new results that could be obtained by the approach.

  15. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  16. Computer Simulations of Intrinsically Disordered Proteins

    Science.gov (United States)

    Chong, Song-Ho; Chatterjee, Prathit; Ham, Sihyun

    2017-05-01

    The investigation of intrinsically disordered proteins (IDPs) is a new frontier in structural and molecular biology that requires a new paradigm to connect structural disorder to function. Molecular dynamics simulations and statistical thermodynamics potentially offer ideal tools for atomic-level characterizations and thermodynamic descriptions of this fascinating class of proteins that will complement experimental studies. However, IDPs display sensitivity to inaccuracies in the underlying molecular mechanics force fields. Thus, achieving an accurate structural characterization of IDPs via simulations is a challenge. It is also daunting to perform a configuration-space integration over heterogeneous structural ensembles sampled by IDPs to extract, in particular, protein configurational entropy. In this review, we summarize recent efforts devoted to the development of force fields and the critical evaluations of their performance when applied to IDPs. We also survey recent advances in computational methods for protein configurational entropy that aim to provide a thermodynamic link between structural disorder and protein activity.

  17. Multiscale Computer Simulation of Failure in Aerogels

    Science.gov (United States)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels has made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
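
    A toy 2D lattice version of the DLCA growth model named above (the study's gels are off-lattice and 3D): clusters random-walk and stick irreversibly on contact, and the mass versus radius-of-gyration relation of the resulting clusters gives the fractal dimension.

        import numpy as np

        def dlca_2d(n=200, size=64, max_steps=100000, seed=0):
            rng = np.random.default_rng(seed)
            pos = rng.integers(0, size, (n, 2))           # particles on a periodic lattice
            label = np.arange(n)                          # cluster id of each particle
            moves = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
            for _ in range(max_steps):
                ids = np.unique(label)
                if len(ids) == 1:
                    break                                 # one spanning gel cluster left
                sel = label == rng.choice(ids)            # move one whole cluster
                pos[sel] = (pos[sel] + moves[rng.integers(4)]) % size
                for d in np.vstack(([[0, 0]], moves)):    # stick to touching clusters
                    touch = ((pos[sel][:, None] + d) % size == pos[~sel][None]).all(-1).any(0)
                    for other in np.unique(label[~sel][touch]):
                        label[label == other] = label[sel][0]
            return pos, label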

  18. Computer simulation of arcuate keratotomy for astigmatism.

    Science.gov (United States)

    Hanna, K D; Jouve, F E; Waring, G O; Ciarlet, P G

    1992-01-01

    The development of refractive corneal surgery involves numerous attempts to isolate the effect of individual factors on surgical outcome. Computer simulation of refractive keratotomy allows the surgeon to alter variables of the technique and to isolate the effect of specific factors independent of other factors, something that cannot easily be done in any of the currently available experimental models. We used the finite element numerical method to construct a mathematical model of the eye. The model analyzed stress-strain relationships in the normal corneoscleral shell and after astigmatic surgery. The model made the following assumptions: an axisymmetric eye, an idealized aspheric anterior corneal surface, transversal isotropy of the cornea, nonlinear strain tensor for large displacements, and near incompressibility of the corneoscleral shell. The eye was assumed to be fixed at the level of the optic nerve. The model described the acute elastic response of the eye to corneal surgery. We analyzed the effect of paired transverse arcuate corneal incisions for the correction of astigmatism. We evaluated the following incision variables and their effect on change in curvature of the incised and unincised meridians: length (longer, more steepening of unincised meridian), distance from the center of the cornea (farther, less flattening of incised meridian), depth (deeper, more effect), and the initial amount of astigmatism (small effect). Our finite element computer model gives reasonably accurate information about the relative effects of different surgical variables, and demonstrates the feasibility of using nonlinear, anisotropic assumptions in the construction of such a computer model. Comparison of these computer-generated results to clinically achieved results may help refine the computer model.

  19. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
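
    A minimal forward-model sketch in the spirit of this framework, not its actual API: expected photon counts from point emitters blurred by a Gaussian PSF, with Poisson shot noise producing an image in photon-counting units. All parameter values are illustrative.

        import numpy as np

        def render_image(emitters_xy, n_px=64, px=0.1, psf_sigma=0.15,
                         photons=500.0, background=1.0, seed=0):
            # Expected count per pixel: Gaussian PSF around each emitter, normalized
            # so each emitter contributes roughly `photons` counts in total.
            rng = np.random.default_rng(seed)
            ax = (np.arange(n_px) + 0.5) * px
            xx, yy = np.meshgrid(ax, ax)
            expected = np.full((n_px, n_px), background)
            for ex, ey in emitters_xy:
                expected += photons * px * px / (2.0 * np.pi * psf_sigma**2) * np.exp(
                    -((xx - ex) ** 2 + (yy - ey) ** 2) / (2.0 * psf_sigma**2))
            return rng.poisson(expected)       # shot noise: image in photon counts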

  20. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  2. Are Autonomous and Controlled Motivations School-Subjects-Specific?

    Directory of Open Access Journals (Sweden)

    Julien Chanal

    Full Text Available This research sought to test whether autonomous and controlled motivations are specific to school subjects or more general to the school context. In two cross-sectional studies, 252 elementary school children (43.7% male; mean age = 10.7 years, SD = 1.3 years) and 334 junior high school children (49.7% male, mean age = 14.07 years, SD = 1.01 years) were administered a questionnaire assessing their motivation for various school subjects. Results based on structural equation modeling using the correlated trait-correlated method minus one model (CTCM-1) showed that autonomous and controlled motivations assessed at the school subject level are not equally school-subject-specific. We found larger specificity effects for autonomous (intrinsic and identified) than for controlled (introjected and external) motivation. In both studies, results of factor loadings and the correlations with self-concept and achievement demonstrated that more evidence of specificity was obtained for autonomous regulations than for controlled ones. These findings suggest a new understanding of the hierarchical and multidimensional academic structure of autonomous and controlled motivations and of the mechanisms involved in the development of types of regulations for school subjects.

  3. Are Autonomous and Controlled Motivations School-Subjects-Specific?

    Science.gov (United States)

    Chanal, Julien; Guay, Frédéric

    2015-01-01

    This research sought to test whether autonomous and controlled motivations are specific to school subjects or more general to the school context. In two cross-sectional studies, 252 elementary school children (43.7% male; mean age = 10.7 years, SD = 1.3 years) and 334 junior high school children (49.7% male, mean age = 14.07 years, SD = 1.01 years) were administered a questionnaire assessing their motivation for various school subjects. Results based on structural equation modeling using the correlated trait-correlated method minus one model (CTCM-1) showed that autonomous and controlled motivations assessed at the school subject level are not equally school-subject-specific. We found larger specificity effects for autonomous (intrinsic and identified) than for controlled (introjected and external) motivation. In both studies, results of factor loadings and the correlations with self-concept and achievement demonstrated that more evidence of specificity was obtained for autonomous regulations than for controlled ones. These findings suggest a new understanding of the hierarchical and multidimensional academic structure of autonomous and controlled motivations and of the mechanisms involved in the development of types of regulations for school subjects.

  4. Are Autonomous and Controlled Motivations School-Subjects-Specific?

    Science.gov (United States)

    Chanal, Julien; Guay, Frédéric

    2015-01-01

    This research sought to test whether autonomous and controlled motivations are specific to school subjects or more general to the school context. In two cross-sectional studies, 252 elementary school children (43.7% male; mean age = 10.7 years, SD = 1.3 years) and 334 junior high school children (49.7% male, mean age = 14.07 years, SD = 1.01 years) were administered a questionnaire assessing their motivation for various school subjects. Results based on structural equation modeling using the correlated trait-correlated method minus one model (CTCM-1) showed that autonomous and controlled motivations assessed at the school subject level are not equally school-subject-specific. We found larger specificity effects for autonomous (intrinsic and identified) than for controlled (introjected and external) motivation. In both studies, results of factor loadings and the correlations with self-concept and achievement demonstrated that more evidence of specificity was obtained for autonomous regulations than for controlled ones. These findings suggest a new understanding of the hierarchical and multidimensional academic structure of autonomous and controlled motivations and of the mechanisms involved in the development of types of regulations for school subjects. PMID:26247788

  5. Modelling of subject specific based segmental dynamics of knee joint

    Science.gov (United States)

    Nasir, N. H. M.; Ibrahim, B. S. K. K.; Huq, M. S.; Ahmad, M. K. I.

    2017-09-01

    This study determines segmental dynamics parameters using a subject-specific method. Five hemiplegic patients participated in the study: two men and three women. Their ages ranged from 50 to 60 years, weights from 60 to 70 kg, and heights from 145 to 170 cm. The sample group included patients with different sides affected by stroke. The segmental dynamics parameters describing knee joint function were measured following Winter's measurement approach, and the corresponding model was generated using Kane's equations of motion. Inertial parameters, in the form of anthropometry, can be identified and measured by applying standard human dimensions to subjects in a hemiplegic condition. The inertial parameters are the location of the centre of mass (COM) along the limb segment, the moment of inertia about the COM, and the masses of the shank and foot, all of which are needed to generate accurate equations of motion. The investigation also highlights several advantages of combining Winter's anthropometric tables with Kane's equations of motion in movement biomechanics. A general procedure is presented to yield accurate estimates of the inertial parameters of the knee joint for subjects with a history of stroke.
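
    The anthropometric step described above can be made concrete with a short sketch. The proportionality constants below are the values commonly quoted from Winter's tables for the shank and foot; they should be checked against the edition actually used, and the function names are illustrative.

        # Proportionality constants commonly quoted from Winter's tables
        # (verify against the edition actually used):
        #   mass_frac: segment mass / body mass
        #   com_frac:  COM distance from proximal end / segment length
        #   gyr_frac:  radius of gyration about COM / segment length
        WINTER = {
            "shank": dict(mass_frac=0.0465, com_frac=0.433, gyr_frac=0.302),
            "foot":  dict(mass_frac=0.0145, com_frac=0.500, gyr_frac=0.475),
        }

        def segment_inertia(body_mass_kg, segment_len_m, segment):
            p = WINTER[segment]
            m = p["mass_frac"] * body_mass_kg    # segment mass [kg]
            com = p["com_frac"] * segment_len_m  # COM from proximal end [m]
            k = p["gyr_frac"] * segment_len_m    # radius of gyration [m]
            return m, com, m * k**2              # mass, COM, inertia about COM

        # Example: a 65 kg subject with a 0.40 m shank.
        print(segment_inertia(65.0, 0.40, "shank"))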

  6. Computer Simulation of Developmental Processes and ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures within native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  7. Computational Modeling and Simulation of Developmental ...

    Science.gov (United States)

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produces quantitative predic

  8. Computer simulation of fatigue under diametrical compression.

    Science.gov (United States)

    Carmona, H A; Kun, F; Andrade, J S; Herrmann, H J

    2007-04-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macrolevel and microlevel, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
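
    A toy scalar damage law can illustrate the qualitative behavior reported above: lifetimes that fall with applied load, and a fatigue limit that emerges once healing is switched on. This is a deliberately minimal stand-in, not the authors' discrete element model.

        def lifetime(load, gamma=2.0, healing=0.0, dt=1.0, max_steps=10**6):
            """Toy scalar fatigue law: damage grows as load**gamma and is
            reduced by healing; failure when accumulated damage reaches 1.
            Returns steps to failure, or None if healing balances damage
            accumulation below the threshold (a fatigue limit)."""
            damage = 0.0
            for step in range(1, max_steps + 1):
                damage += dt * (load**gamma - healing * damage)
                if damage >= 1.0:
                    return step
            return None   # no macroscopic failure within max_steps

        for load in (0.2, 0.4, 0.6, 0.8):   # load / tensile strength
            print(load, lifetime(load, healing=0.05))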

  9. Investigating European genetic history through computer simulations.

    Science.gov (United States)

    Currat, Mathias; Silva, Nuno M

    2013-01-01

    The genetic diversity of Europeans has been shaped by various evolutionary forces, including their demographic history. Genetic data can thus be used to draw inferences about the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones positively re-evaluate the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This result of substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to gain a better understanding of European evolution.

  10. Computer Simulation of the UMER Gridded Gun

    CERN Document Server

    Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun

    2005-01-01

    The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...

  11. Symplectic molecular dynamics simulations on specially designed parallel computers.

    Science.gov (United States)

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that fewer time steps are needed and each step requires less time, enabling fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
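
    The SISM itself splits the Hamiltonian and treats the stiff vibrations analytically; as a minimal stand-in, the sketch below implements velocity Verlet, the simplest symplectic integrator, for a unit-mass harmonic oscillator. It illustrates the property that motivates symplectic MD integration: the energy error stays bounded over very long runs instead of drifting.

        def force(x, k=1.0):
            return -k * x          # unit-mass harmonic oscillator

        def velocity_verlet(x, v, dt, steps):
            f = force(x)
            for _ in range(steps):
                v += 0.5 * dt * f  # half kick
                x += dt * v        # drift
                f = force(x)
                v += 0.5 * dt * f  # half kick
            return x, v

        x, v = velocity_verlet(1.0, 0.0, dt=0.05, steps=100_000)
        print(0.5 * v**2 + 0.5 * x**2)   # stays near the initial energy 0.5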

  12. Are We Sims? How Computer Simulations Represent and What This Means for the Simulation Argument

    OpenAIRE

    Beisbart, Claus

    2017-01-01

    N. Bostrom's simulation argument and two additional assumptions imply that we likely live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and materia...

  13. Validation of subject-specific automated p-FE analysis of the proximal femur.

    Science.gov (United States)

    Trabelsi, Nir; Yosibash, Zohar; Milgrom, Charles

    2009-02-09

    The use of subject-specific finite element (FE) models in clinical practice requires a high level of automation and validation. In Yosibash et al. [2007a. Reliable simulations of the human proximal femur by high-order finite element analysis validated by experimental observations. J. Biomechanics 40, 3688-3699] a novel method for generating high-order finite element (p-FE) models from CT scans was presented and validated by experimental observations on two fresh frozen femurs (harvested from a 30-year-old male and a 21-year-old female). Herein, we substantiate the validation process by enlarging the experimental database (a 54-year-old female femur), improving the method, and examining its robustness under different CT scan conditions. A fresh frozen femur of a 54-year-old female was scanned under two different environments: in air and immersed in water (dry and wet CT). Thereafter, the proximal femur was quasi-statically loaded in vitro by a 1000 N load. The two QCT scans were manipulated to generate p-FE models that mimic the experimental conditions. We compared p-FE displacements and strains of the wet CT model to the dry CT model and to the experimental results. In addition, the material assignment strategy was reinvestigated. The inhomogeneous Young's modulus was represented in the FE model using two different methods: directly extracted from the CT data, and using continuous spatial functions as in Yosibash et al. [2007a]. Excellent agreement between dry and wet FE models was found for both displacements and strains, i.e. the method is insensitive to CT conditions and may be used in vivo. Good agreement was also found between FE results and experimental observations. The spatial functions representing Young's modulus are local and do not influence strain and displacement predictions. Finally, the p-FE results of
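
    The material assignment step described above is commonly implemented as a two-stage mapping from CT Hounsfield units to density and then to Young's modulus. The sketch below shows that generic pipeline; the calibration constants and power-law coefficients are placeholders, not the values used in the paper.

        import numpy as np

        def hu_to_density(hu, a=0.0008, b=0.0):
            """Equivalent density [g/cm^3] from a linear phantom calibration
            (a and b are placeholder calibration constants)."""
            return a * hu + b

        def density_to_modulus(rho, c=10500.0, p=2.29):
            """Young's modulus [MPa] from a power law E = c * rho**p
            (coefficients are illustrative literature-style values)."""
            return c * np.maximum(rho, 0.0) ** p

        hu = np.array([200.0, 600.0, 1200.0])   # sample voxel values [HU]
        print(density_to_modulus(hu_to_density(hu)))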

  14. Computer Simulation of Electron Positron Annihilation Processes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, y

    2003-10-02

    With the launch of the Next Linear Collider drawing closer, there is a pressing need for physicists to develop a fully integrated computer simulation of e⁺e⁻ annihilation processes at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle the interfaces between different sectors of physics well, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at much lower energy scales, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. Our strategy is to create
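
    The notion of weight-1 event selection can be illustrated with plain rejection sampling: accepting a candidate with probability f(x)/f_max yields an unweighted sample. The thesis's contribution is the adaptive multi-channel modeling of the peaks (EM plus VEGAS); the sketch below instead uses a fixed, known bound and a toy one-dimensional density.

        import random

        random.seed(0)

        def f(x):
            # Toy "matrix element": a sharp peak on [0, 1].
            return 1.0 / ((x - 0.5) ** 2 + 0.01)

        F_MAX = f(0.5)    # known bound on this toy density

        def weight_one_event():
            # Accept x with probability f(x)/F_MAX: accepted events all
            # carry weight 1, at the cost of rejected trials.
            while True:
                x = random.random()
                if random.random() < f(x) / F_MAX:
                    return x

        print([round(weight_one_event(), 3) for _ in range(5)])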

  15. Computer simulation of amorphous MIS solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Shousha, A.H.M.; El-Kosheiry, M.A. [Cairo University (Egypt). Electronics and Communications Engineering Dept.

    1997-10-01

    A computer model to simulate amorphous MIS solar cells is developed. The model is based on the self-consistent solution of the electron and hole continuity equations together with the Poisson equation under proper boundary conditions. The program developed is used to investigate the cell performance characteristics in terms of its physical and structural parameters. The current-voltage characteristics of the solar cell are obtained under AM1 solar illumination. The dependences of the short-circuit current, open-circuit voltage, fill factor and cell conversion efficiency on localized gap state density, carrier lifetime, cell thickness and surface recombination velocity are obtained and discussed. The results presented show how cell parameters can be varied to improve the cell performance characteristics. (Author)

  16. Can subject-specific single-fibre electrically evoked auditory brainstem response data be predicted from a model?

    Science.gov (United States)

    Malherbe, Tiaan K; Hanekom, Tania; Hanekom, Johan J

    2013-07-01

    This article investigates whether prediction of subject-specific physiological data is viable through an individualised computational model of a cochlear implant. Subject-specific predictions could be particularly useful to assess and quantify the peripheral factors that cause inter-subject variations in perception. The results of such model predictions could potentially be translated to clinical application through optimisation of mapping parameters for individual users, since parameters that affect perception would be reflected in the model structure and parameters. A method to create a subject-specific computational model of a guinea pig with a cochlear implant is presented. The objectives of the study are to develop a method to construct subject-specific models considering translation of the method to in vivo human models and to assess the effectiveness of subject-specific models to predict peripheral neural excitation on subject level. Neural excitation patterns predicted by the model are compared with single-fibre electrically evoked auditory brainstem responses obtained from the inferior colliculus in the same animal. Results indicate that the model can predict threshold frequency location, spatial spread of bipolar and tripolar stimulation and electrode thresholds relative to one another where electrodes are located in different cochlear structures. Absolute thresholds and spatial spread using monopolar stimulation are not predicted accurately. Improvements to the model should address this. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.

  17. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    Full Text Available This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.

    This article deals with computer simulation and with surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are simulated on a computer and compared with experimental data, yielding information on the composition and concentration profiles of the sample. Details of the spectrum prediction stages are given for thick, flat samples. Predictions are made for non-flat samples having asymmetric triangular surface contours. The method is successfully applied to the calculation of depth profiles of 12C and 18O nuclei in thick samples through (d,p) and (p,α) reactions induced by deuterons and protons, respectively.

  18. Development and validation of a subject-specific finite element model of the functional spinal unit to predict vertebral strength.

    Science.gov (United States)

    Lee, Chu-Hee; Landham, Priyan R; Eastell, Richard; Adams, Michael A; Dolan, Patricia; Yang, Lang

    2017-09-01

    Finite element models of an isolated vertebral body cannot accurately predict compressive strength of the spinal column because, in life, compressive load is variably distributed across the vertebral body and neural arch. The purpose of this study was to develop and validate a patient-specific finite element model of a functional spinal unit, and then use the model to predict vertebral strength from medical images. A total of 16 cadaveric functional spinal units were scanned and then tested mechanically in bending and compression to generate a vertebral wedge fracture. Before testing, an image processing and finite element analysis framework (SpineVox-Pro), developed previously in MATLAB using ANSYS APDL, was used to generate a subject-specific finite element model with eight-node hexahedral elements. Transversely isotropic linear-elastic material properties were assigned to vertebrae, and simple homogeneous linear-elastic properties were assigned to the intervertebral disc. Forward bending loading conditions were applied to simulate manual handling. Results showed that vertebral strengths measured by experiment were positively correlated with strengths predicted by the functional spinal unit finite element model with von Mises or Drucker-Prager failure criteria (R² = 0.80-0.87), with areal bone mineral density measured by dual-energy X-ray absorptiometry (R² = 0.54), and with volumetric bone mineral density from quantitative computed tomography (R² = 0.79). Large-displacement non-linear analyses on all specimens did not improve predictions. We conclude that subject-specific finite element models of a functional spinal unit have the potential to estimate vertebral strength better than bone mineral density alone.
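
    Of the two failure criteria named above, the von Mises criterion is the simpler to state: failure is predicted where an element's von Mises equivalent stress exceeds a yield value. A minimal sketch, with an arbitrary placeholder yield stress:

        import numpy as np

        def von_mises(sigma):
            """Von Mises equivalent stress from a 3x3 stress tensor."""
            s = sigma - np.trace(sigma) / 3.0 * np.eye(3)   # deviatoric part
            return np.sqrt(1.5 * np.tensordot(s, s))

        # Element-level failure check; the 100 MPa yield value is an
        # arbitrary placeholder, not a value from the study.
        sigma = np.array([[80.0, 10.0,  0.0],
                          [10.0, 40.0,  5.0],
                          [ 0.0,  5.0, 20.0]])
        print(von_mises(sigma), von_mises(sigma) > 100.0)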

  19. Towards A Novel Environment For Simulation Of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Joanna Patrzyk

    2015-01-01

    Full Text Available In this paper we analyze existing quantum computer simulation techniques and their realizations to minimize the impact of the exponential complexity of simulated quantum computations. As a result of this investigation, we propose a quantum computer simulator with an integrated development environment - QuIDE - supporting the development of algorithms for future quantum computers. The simulator simplifies building and testing quantum circuits and helps users understand quantum algorithms in an efficient way. The development environment provides flexibility of source code editing and ease of graphical building of circuit diagrams. We also describe and analyze the complexity of the algorithms used for simulation, and present performance results of the simulator as well as results of its deployment during university classes.
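
    The exponential complexity mentioned above comes directly from the state-vector representation such simulators use: an n-qubit register needs 2^n amplitudes, and gates are built up with Kronecker products. A minimal sketch of that core (not QuIDE's own implementation), preparing a Bell state:

        import numpy as np

        I = np.eye(2)
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        state = np.zeros(4)
        state[0] = 1.0                       # |00>
        state = np.kron(H, I) @ state        # H on the first qubit
        state = CNOT @ state                 # entangle: Bell state
        print(np.round(state, 3))            # [0.707 0. 0. 0.707]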

  20. Associative Memory Computing Power and Its Simulation

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns, and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. Algorithm performance is limited by the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...

  1. Associative Memory computing power and its simulation

    CERN Document Server

    Ancu, L S; The ATLAS collaboration; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2014-01-01

    The associative memory (AM) system is a computing device made of hundreds of AM ASIC chips designed to perform “pattern matching” at very high speed. Since each AM chip stores a database of 130,000 pre-calculated patterns, and large numbers of chips can easily be assembled together, it is possible to produce huge AM banks. Speed and size of the system are crucial for real-time High Energy Physics applications, such as the ATLAS Fast TracKer (FTK) Processor. Using 80 million channels of the ATLAS tracker, FTK finds tracks within 100 microseconds. The simulation of such a parallelized system is an extremely complex task if executed on commercial computers based on normal CPUs. Algorithm performance is limited by the lack of parallelism, and in addition the memory requirement is very large. In fact the AM chip uses a content-addressable memory (CAM) architecture. Any data inquiry is broadcast to all memory elements simultaneously, thus data retrieval time is independent of the database size. The gr...
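
    The defining behavior described above, retrieval time independent of bank size, can be imitated in software with a hash-based pattern bank, which is one reason AM simulation on ordinary CPUs is so different from the hardware's parallel broadcast. The sketch below illustrates the lookup semantics only; it is not a bit-accurate FTK simulation, and the pattern encoding is invented.

        import random

        random.seed(1)
        BANK_SIZE = 130_000     # patterns per AM chip, per the text
        # Invented encoding: a pattern is a tuple of 8 coarse hit addresses.
        bank = {tuple(random.randrange(256) for _ in range(8))
                for _ in range(BANK_SIZE)}

        def matches(hits):
            """Near-constant-time membership test, independent of bank size."""
            return tuple(hits) in bank

        probe = next(iter(bank))
        print(matches(probe), matches([0] * 8))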

  2. Computer simulations of the mouse spermatogenic cycle

    Directory of Open Access Journals (Sweden)

    Debjit Ray

    2014-12-01

    Full Text Available The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal–spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

  3. Computer-aided Instructional System for Transmission Line Simulation.

    Science.gov (United States)

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  4. [Thoughts on and probes into computer simulation of acupuncture manipulation].

    Science.gov (United States)

    Hu, Yin'e; Liu, Tangyi; Tang, Wenchao; Xu, Gang; Gao, Ming; Yang, Huayuan

    2011-08-01

    Studies of the simulation of acupuncture manipulation mainly focus on mechanical simulation and virtual simulation (SIM). In mechanical simulation, the aim of the research is to develop instruments that simulate acupuncture manipulation and to apply them as a simulation of, or replacement for, manual acupuncture manipulation; virtual simulation applies virtual reality technology to present the manipulation in 3D in real time on a computer screen. This paper summarizes recent research progress on computer simulation of acupuncture manipulation at home and abroad, and concludes with the significance of, and the open problems in, the computer simulation of acupuncture manipulation. We therefore propose that research on simulated manipulation should pay close attention to simulating experts' manipulation, as well as to verification studies of conformity and clinical effects.

  5. Using Computational Simulations to Confront Students' Mental Models

    Science.gov (United States)

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  6. Traffic Simulations on Parallel Computers Using Domain Decomposition Techniques

    Science.gov (United States)

    1995-01-01

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic...

  7. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter; de Jong, Anthonius J.M.

    1992-01-01

    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  8. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    Science.gov (United States)

    1994-08-10


  9. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m, which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc.
    New version program summary:
    Program title: QDENSITY 2.0
    Catalogue identifier: ADXH_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 26,055
    No. of bytes in distributed program, including test data, etc.: 227,540
    Distribution format: tar.gz
    Programming language: Mathematica 6.0
    Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4
    Catalogue identifier of previous version: ADXH_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914
    Classification: 4.15
    Does the new version supersede the previous version?: Offers an alternative, more up-to-date implementation
    Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters.
    Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples (Teleportation, Shor's Algorithm and Grover's search) are explained in detail. A tutorial, Tutorial.nb, is also enclosed.
    Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0
    Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0
    Running time: Most examples
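
    QDENSITY itself is a Mathematica package, but the density-matrix operations it emphasizes are easy to state in any language: a gate U transforms a state as ρ → UρU†, and measurement probabilities are traces against projectors. A minimal numerical sketch:

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
        rho = np.array([[1.0, 0.0],
                        [0.0, 0.0]])                   # pure state |0><0|

        rho = H @ rho @ H.conj().T                     # gate: rho -> U rho U†
        P0 = np.array([[1.0, 0.0],
                       [0.0, 0.0]])                    # projector onto |0>
        print(np.trace(P0 @ rho).real)                 # P(measure 0) = 0.5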

  10. Computer simulation boosts automation in the stockyard

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-04-01

    Today's desktop computers and advanced software keep pace with handling equipment, reaching new heights of sophistication with graphic simulation able to show precisely what is happening, and what could happen, in a coal terminal's stockyard. The article describes an innovative coal terminal nearing completion on the Pacific coast at Lazaro Cardenas in Mexico, the Petracalco terminal, where coal is unloaded, stored and fed to the nearby Pdte Plutarco Elias Calles power plant. The R&D department of the Italian company Techint, Italimpianti, has developed MHATIS, a sophisticated software system for marine terminal management, which allows analysis of performance with the use of graphical animation. Strategies can be tested before being put into practice, and likely power station demand can be predicted. The design and operation of the MHATIS system are explained. Other integrated coal handling plants described in the article are one developed by the then PWH (renamed Krupp Foerdertechnik) of Germany for the Israel Electric Corporation and the installation by the same company of a further bucketwheel for a redesigned coal stockyard at the Port of Hamburg operated by Hansaport. 1 fig., 4 photos.

  11. Computational simulation of liquid rocket injector anomalies

    Science.gov (United States)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze three-dimensional two-phase reactive flows in liquid-fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body-fitted coordinate system along with a conservative control volume formulation is employed. The physical submodels include a k-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  12. Factors promoting engaged exploration with computer simulations

    Directory of Open Access Journals (Sweden)

    Noah S. Podolefsky

    2010-10-01

    Full Text Available This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze interviews with college students using PhET sims in order to demonstrate engaged exploration and to identify factors that can promote this type of inquiry. With minimal explicit guidance, students explore the topic of wave interference in ways that bear similarity to how scientists explore phenomena. PhET sims are flexible tools that allow students to choose their own learning path, but also provide constraints such that students' choices are generally productive. This type of inquiry is supported by sim features such as concrete connections to the real world, representations that are not available in the real world, analogies to help students make meaning of and connect across multiple representations and phenomena, and a high level of interactivity with real-time, dynamic feedback from the sim. These features of PhET sims enable students to pose questions and answer them in ways that may not be supported by more traditional educational materials.

  13. Explore Effective Use of Computer Simulations for Physics Education

    Science.gov (United States)

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  14. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  15. How Effective Is Instructional Support for Learning with Computer Simulations?

    Science.gov (United States)

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  16. Computers for real time flight simulation: A market survey

    Science.gov (United States)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  17. Challenges & Roadmap for Beyond CMOS Computing Simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Arun F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frank, Michael P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).

  18. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
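
    The core of such a discrete-event model is small: random arrivals drawn from a demand distribution, a constrained pool of servers, and statistics collected on waiting. The sketch below shows that skeleton with a server-availability heap; the arrival rate, service rate, and server count are illustrative placeholders, not calibrated values.

        import heapq
        import random

        # Minimal discrete-event queueing model: exponential interarrival and
        # service times, a fixed pool of servers, FCFS dispatch to whichever
        # server frees up first.
        random.seed(0)
        ARRIVAL_RATE, SERVICE_RATE, SERVERS, N = 5.0, 1.0, 6, 10_000

        t = 0.0
        free_at = [0.0] * SERVERS      # time at which each server next frees up
        heapq.heapify(free_at)
        waits = []
        for _ in range(N):
            t += random.expovariate(ARRIVAL_RATE)        # next request arrives
            earliest = heapq.heappop(free_at)            # soonest-free server
            start = max(t, earliest)                     # queue if all busy
            waits.append(start - t)
            heapq.heappush(free_at, start + random.expovariate(SERVICE_RATE))

        print("mean wait:", sum(waits) / len(waits))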

  19. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe

  20. High performance computing system for flight simulation at NASA Langley

    Science.gov (United States)

    Cleveland, Jeff I., II; Sudik, Steven J.; Grove, Randall D.

    1991-01-01

    The computer architecture and components used in the NASA Langley Advanced Real-Time Simulation System (ARTSS) are briefly described and illustrated with diagrams and graphs. Particular attention is given to the advanced Convex C220 processing units, the UNIX-based operating system, the software interface to the fiber-optic-linked Computer Automated Measurement and Control system, configuration-management and real-time supervisor software, ARTSS hardware modifications, and the current implementation status. Simulation applications considered include the Transport Systems Research Vehicle, the Differential Maneuvering Simulator, the General Aviation Simulator, and the Visual Motion Simulator.

  1. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...

  2. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles and implementation of simulated annealing as a modern heuristic technique are presented. The simulated annealing algorithm is used in solving the real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...
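
    The generic simulated annealing loop the note applies to laboratory scheduling is sketched below. The schedule is represented as a permutation and the cost function is a deliberately trivial placeholder; a real laboratory model would penalize clashes, capacity violations, and idle resources.

        import math
        import random

        random.seed(0)

        def cost(schedule):
            # Toy objective (count of out-of-order pairs); a real laboratory
            # model would score clashes, capacity and idle time instead.
            return sum(1 for a, b in zip(schedule, schedule[1:]) if a > b)

        state = list(range(20))
        random.shuffle(state)
        T = 1.0
        while T > 1e-3:
            i, j = random.sample(range(len(state)), 2)
            candidate = state[:]
            candidate[i], candidate[j] = candidate[j], candidate[i]
            delta = cost(candidate) - cost(state)
            # Accept improvements always, worsenings with Boltzmann probability.
            if delta <= 0 or random.random() < math.exp(-delta / T):
                state = candidate
            T *= 0.999                  # geometric cooling schedule
        print(cost(state), state)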

  3. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  4. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    Science.gov (United States)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknown to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  5. Symbolic Computations in Simulations of Hydromagnetic Dynamo

    Directory of Open Access Journals (Sweden)

    Vodinchar Gleb

    2017-01-01

    Full Text Available The compilation of spectral models of geophysical fluid dynamics and of the hydromagnetic dynamo involves the calculation of a large number of volume integrals of complex combinations of basis fields. In this paper we describe the automation of this computation using symbolic computation systems.
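
    The kind of volume integral being automated can be illustrated with a computer algebra system: the overlap of two toy basis fields over the unit ball, written with the spherical volume element r² sin θ. The basis fields here are invented for illustration; real spectral models use far more complex combinations.

        import sympy as sp

        r, th, ph = sp.symbols('r theta phi', nonnegative=True)
        f = r * sp.cos(th)        # toy basis field 1 (invented)
        g = r**2 * sp.cos(th)     # toy basis field 2 (invented)

        # Overlap integral over the unit ball, volume element r^2 sin(theta).
        overlap = sp.integrate(f * g * r**2 * sp.sin(th),
                               (r, 0, 1), (th, 0, sp.pi), (ph, 0, 2 * sp.pi))
        print(sp.simplify(overlap))   # -> 2*pi/9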

  6. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2017-01-01

    This book provides an accessible introduction to the basic theory of fluid mechanics and computational fluid dynamics (CFD) from a modern perspective that unifies theory and numerical computation. Methods of scientific computing are introduced alongside with theoretical analysis and MATLAB® codes are presented and discussed for a broad range of topics: from interfacial shapes in hydrostatics, to vortex dynamics, to viscous flow, to turbulent flow, to panel methods for flow past airfoils. The third edition includes new topics, additional examples, solved and unsolved problems, and revised images. It adds more computational algorithms and MATLAB programs. It also incorporates discussion of the latest version of the fluid dynamics software library FDLIB, which is freely available online. FDLIB offers an extensive range of computer codes that demonstrate the implementation of elementary and advanced algorithms and provide an invaluable resource for research, teaching, classroom instruction, and self-study. This ...

  7. Application of computer simulated persons in indoor environmental modeling

    DEFF Research Database (Denmark)

    Topp, C.; Nielsen, P. V.; Sørensen, Dan Nørtoft

    2002-01-01

    Computer simulated persons are often applied when the indoor environment is modeled by computational fluid dynamics. The computer simulated persons differ in size, shape, and level of geometrical complexity, ranging from simple box- or cylinder-shaped heat sources to more humanlike models. Little effort, however, has been focused on the influence of the geometry. This work provides an investigation of geometrically different computer simulated persons with respect to both local and global airflow distribution. The results show that a simple geometry is sufficient when the global airflow of a ventilated enclosure is considered, as little or no influence of geometry was observed at some distance from the computer simulated person. For local flow conditions, though, a more detailed geometry should be applied in order to assess thermal and atmospheric comfort.

  8. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2017-08-10

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.

  9. High-resolution computer simulations of EKC.

    Science.gov (United States)

    Breadmore, Michael C; Quirino, Joselito P; Thormann, Wolfgang

    2009-02-01

    The electrophoresis simulation software, GENTRANS, has been modified to include the interaction of analytes with an electrolyte additive to allow the simulation of liquid-phase EKC separations. The modifications account for interaction of weak and strong acid and base analytes with a single weak or strong acid or base background electrolyte additive and can be used to simulate a range of EKC separations with both charged and neutral additives. Simulations of separations of alkylphenyl ketones under real experimental conditions were performed using mobility and interaction constant data obtained from the literature and agreed well with experimental separations. Migration times in fused-silica capillaries and linear polyacrylamide-coated capillaries were within 7% of the experimental values, while peak widths were always narrower than the experimental values, but were still within 50% of those obtained by experiment. Simulations of sweeping were also performed; although migration time agreement was not as good as for simple EKC separations, peak widths were in good agreement, being within 1-50% of the experimental values. All simulations for comparison with experimental data were performed under real experimental conditions using a 47 cm capillary and a voltage of 20 kV and represent the first quantitative attempt at simulating EKC separations with and without sweeping.
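
    Analyte-additive interaction in EKC is commonly modeled through a 1:1 complexation equilibrium, in which the observed mobility is the equilibrium-weighted average of the free and complexed forms. The sketch below implements that textbook relation; it is not necessarily the exact formulation coded in GENTRANS, and all numbers are illustrative.

        def effective_mobility(mu_free, mu_complex, K, c):
            """1:1 complexation: mu_eff = (mu_free + K*c*mu_complex)/(1 + K*c)."""
            return (mu_free + K * c * mu_complex) / (1 + K * c)

        # A neutral analyte (mu_free = 0) carried by a charged additive;
        # K and the mobilities below are illustrative numbers only.
        for c in (0.0, 0.005, 0.02, 0.08):            # additive conc. [M]
            print(c, effective_mobility(0.0, -4.0e-8, K=250.0, c=c))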

  10. Some theoretical issues on computer simulations

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained under all possible updates.
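
    A minimal sequentially updated cellular automaton over a graph makes the paper's object of study concrete: because vertices are updated one at a time, later updates see earlier ones, and permuting the update order can change the outcome. The graph, rule, and states below are invented for illustration.

        graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

        def local_rule(own, neighbors):
            # Toy rule: strict majority of self plus neighbors (ties -> 0).
            votes = [own] + neighbors
            return 1 if sum(votes) * 2 > len(votes) else 0

        def sca_step(state, order):
            state = dict(state)
            for v in order:                  # sequential, in-place updates
                state[v] = local_rule(state[v], [state[u] for u in graph[v]])
            return state

        s0 = {0: 1, 1: 0, 2: 1, 3: 0}
        print(sca_step(s0, order=[0, 1, 2, 3]))
        print(sca_step(s0, order=[3, 2, 1, 0]))  # differs from the line above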

  11. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  12. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  13. Numerical Implementation and Computer Simulation of Tracer ...

    African Journals Online (AJOL)

    , was most dependent on the source definition and the hydraulic conductivity K of the porous medium. The 12000mg/l chloride tracer source was almost completely dispersed within 34 hours. Keywords: Replication, Numerical simulation, ...

  14. Computational Simulation of Droplet Collision Dynamics

    National Research Council Canada - National Science Library

    Law, Chung

    2000-01-01

    ..., and the energy partition among the various modes was identified. By using the molecular dynamics method, bouncing and coalescence were successfully simulated for the first time without the artificial manipulation of the inter-droplet gaseous film...

  15. Computational snow avalanche simulation in forested terrain

    Science.gov (United States)

    Teich, M.; Fischer, J.-T.; Feistl, T.; Bebi, P.; Christen, M.; Grêt-Regamey, A.

    2014-08-01

    Two-dimensional avalanche simulation software operating in three-dimensional terrain is widely used for hazard zoning and engineering to predict runout distances and impact pressures of snow avalanche events. Mountain forests are an effective biological protection measure against avalanches; however, the protective capacity of forests to decelerate or even to stop avalanches that start within forested areas or directly above the treeline is seldom considered in this context. In particular, runout distances of small- to medium-scale avalanches are strongly influenced by the structural conditions of forests in the avalanche path. We present an evaluation and operationalization of a novel detrainment function implemented in the avalanche simulation software RAMMS for avalanche simulation in forested terrain. The new approach accounts for the effect of forests in the avalanche path by detraining mass, which leads to a deceleration and runout shortening of avalanches. The relationship is parameterized by the detrainment coefficient K [kg m-1 s-2] accounting for differing forest characteristics. We varied K when simulating 40 well-documented small- to medium-scale avalanches, which were released in and ran through forests of the Swiss Alps. Analyzing and comparing observed and simulated runout distances statistically revealed values for K suitable to simulate the combined influence of four forest characteristics on avalanche runout: forest type, crown closure, vertical structure and surface cover, for example, values for K were higher for dense spruce and mixed spruce-beech forests compared to open larch forests at the upper treeline. Considering forest structural conditions within avalanche simulations will improve current applications for avalanche simulation tools in mountain forest and natural hazard management.
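
    RAMMS solves depth-averaged flow equations over real terrain, but the runout-shortening effect of detraining mass can be conveyed with a far cruder point-mass reduction. In the sketch below, every parameter value and the block-model simplification are illustrative assumptions; only the role of the detrainment coefficient K [kg m-1 s-2] as a momentum sink echoes the paper.

      import math

      g, slope = 9.81, math.radians(15.0)   # runout-zone slope (assumed)
      mu, xi, h = 0.3, 1500.0, 1.0          # Voellmy friction parameters, flow depth
      M, A = 2.0e5, 500.0                   # avalanche mass (kg), footprint (m^2)

      def runout(K, u=10.0, dt=0.05):
          """Distance travelled until rest for detrainment coefficient K."""
          x = 0.0
          while u > 0.01:
              a = (g * math.sin(slope)            # gravity, along slope
                   - mu * g * math.cos(slope)     # Coulomb friction
                   - g * u * u / (xi * h)         # turbulent (velocity^2) drag
                   - K * A / M)                   # momentum lost to detrained snow
              u = max(0.0, u + a * dt)
              x += u * dt
          return x

      print(f"open terrain (K = 0):  {runout(0.0):6.1f} m")
      print(f"forested (K = 100):    {runout(100.0):6.1f} m")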

  16. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  17. Biocellion: accelerating computer simulation of multicellular biological system models

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  18. Computer simulation of on-orbit manned maneuvering unit operations

    Science.gov (United States)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since the access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method that is discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) that is capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues and how effectively these models simulate the MMU's actual on-orbiter operations.

  19. Modelling of dusty plasma properties by computer simulation methods

    Energy Technology Data Exchange (ETDEWEB)

    Baimbetov, F B [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Ramazanov, T S [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Dzhumagulova, K N [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Kadyrsizov, E R [Institute for High Energy Densities of RAS, Izhorskaya 13/19, Moscow 125412 (Russian Federation); Petrov, O F [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan); Gavrikov, A V [IETP, Al Farabi Kazakh National University, 96a, Tole bi St, Almaty 050012 (Kazakhstan)

    2006-04-28

    Computer simulation of dusty plasma properties is performed. The radial distribution functions and the diffusion coefficient are calculated on the basis of Langevin dynamics. A comparison with the experimental data is made.
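
    The Langevin approach mentioned here integrates each grain's equation of motion with a friction force and a random thermal force. The sketch below shows the standard Euler-Maruyama discretization for one free particle in one dimension and checks the resulting diffusion coefficient against the Einstein relation D = kT/(mγ); all parameter values are arbitrary illustrative numbers, not the paper's.

      import math, random

      random.seed(0)
      m, gamma, kT = 1.0e-12, 50.0, 4.1e-21   # mass (kg), friction rate (1/s), k_B*T (J)
      dt, nsteps, ntraj = 1.0e-4, 10000, 100

      def trajectory():
          x, v = 0.0, 0.0
          sigma = math.sqrt(2.0 * gamma * kT / m * dt)  # random-kick amplitude
          for _ in range(nsteps):
              v += (-gamma * v) * dt + sigma * random.gauss(0.0, 1.0)
              x += v * dt
          return x

      # Diffusion coefficient from the mean-square displacement, <x^2> = 2 D t:
      t = nsteps * dt
      msd = sum(trajectory() ** 2 for _ in range(ntraj)) / ntraj
      print(f"D (simulated): {msd / (2.0 * t):.3e} m^2/s")
      print(f"D (Einstein):  {kT / (m * gamma):.3e} m^2/s")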

  20. Computer Simulation of the Impact of Cigarette Smoking On Humans

    African Journals Online (AJOL)

    2012-12-01

    In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the ... Secondary School curriculum in Nigeria. ... Workshops and seminars should be ...

  1. On architectural acoustic design using computer simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    ... acoustic design process. The emphasis is put on the first three out of five phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results – as exemplified with reference to the design of Bagsværd Church by Jørn Utzon. The paper ... discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper is that the application of acoustic simulation programs is most beneficial in the last of three phases, but an application ... properties prior to the actual construction of a building. With the right tools applied, acoustic design can become an integral part of the architectural design process. The aim of this paper is to investigate the field of application that an acoustic simulation programme can have during an architectural ...

  2. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  3. Computer simulation of working stress of heat treated steel specimen

    OpenAIRE

    B. Smoljan; D. Iljkić; S. Smokvina Hanza

    2009-01-01

    Purpose: In this paper, the prediction of working stress of quenched and tempered steel has been done. The working stress was characterized by yield strength and fracture toughness. The method of computer simulation of working stress was applied to a workpiece of complex form. Design/methodology/approach: Hardness distribution of a quenched and tempered workpiece of complex form was predicted by computer simulation of steel quenching using a finite volume method. The algorithm of estimation of yie...

  4. GATE Monte Carlo simulation in a cloud computing environment

    Science.gov (United States)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size, and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
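
    The inverse power relationship between cluster size and runtime, T(n) = a·n^(-b), can be recovered from timing data with an ordinary least-squares fit in log space. In the sketch below, only the 1-node (53 min) and 20-node (3.1 min) points come from the abstract; the intermediate timings are fabricated for illustration.

      import math

      nodes = [1, 2, 4, 8, 16, 20]
      times = [53.0, 27.5, 14.2, 7.6, 4.0, 3.1]   # minutes; middle values made up

      # Linear regression of log(T) on log(n): log T = log a - b log n.
      xs = [math.log(n) for n in nodes]
      ys = [math.log(t) for t in times]
      n, sx, sy = len(xs), sum(xs), sum(ys)
      sxx = sum(x * x for x in xs)
      sxy = sum(x * y for x, y in zip(xs, ys))
      slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
      a = math.exp((sy - slope * sx) / n)

      print(f"T(n) ≈ {a:.1f} * n^({slope:.2f})")   # slope near -1 means near-linear speedup
      print(f"predicted T(20): {a * 20 ** slope:.1f} min")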

  5. A simulator for quantum computer hardware

    NARCIS (Netherlands)

    Michielsen, K.F.L.; de Raedt, H.A.; De Raedt, K.

    We present new examples of the use of the quantum computer (QC) emulator. For educational purposes we describe the implementation of the CNOT and Toffoli gate, two basic building blocks of a QC, on a three qubit NMR-like QC.

  6. Computer Simulations in the Science Classroom.

    Science.gov (United States)

    Richards, John; And Others

    1992-01-01

    Explorer is an interactive environment based on a constructivist epistemology of learning that integrates animated computer models with analytic capabilities for learning science. The system includes graphs, a spreadsheet, scripting, and interactive tools. Two examples involving the dynamics of colliding objects and electric circuits illustrate…

  7. Combat Simulation Using Breach Computer Language

    Science.gov (United States)

    1979-09-01

    Keywords: combat modeling, computer language, BREACH, urban warfare, MOBA, MOUT. Reference: "Symposium on ... MOBA Environment," Technical Memorandum 20-78, US Army Human Engineering Laboratory, Aberdeen Proving Ground, MD, July 1978.

  8. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies on for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  9. Studying Scientific Discovery by Computer Simulation.

    Science.gov (United States)

    1983-03-30

    ... scientific laws that were induced from data before any theory was available to discover the regularities. To the previous examples, we could add Gregor ... discoveries (excluding those of Mendel and Mendeleev, which we have not simulated) could have been made. ... The Role of Theory in Law Induction: BACON's ...

  10. Role of computational efficiency in process simulation

    Directory of Open Access Journals (Sweden)

    Kurt Strand

    1989-07-01

    Full Text Available It is demonstrated how efficient numerical algorithms may be combined to yield a powerful environment for analysing and simulating dynamic systems. The importance of using efficient numerical algorithms is emphasized and demonstrated through examples from the petrochemical industry.

  11. Computer Simulation Studies of Trishomocubane Heptapeptide of ...

    African Journals Online (AJOL)

    As part of an extension on the cage peptide chemistry, the present work involves an assessment of the conformational profile of trishomocubane heptapeptide of the type Ac-Ala3-Tris-Ala3-NHMe using molecular dynamics (MD) simulations. All MD protocols were explored within the framework of a molecular mechanics ...

  12. Bodies Falling with Air Resistance: Computer Simulation.

    Science.gov (United States)

    Vest, Floyd

    1982-01-01

    Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
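
    The second model, with air resistance proportional to v², reduces to the equation m dv/dt = mg - kv². The sketch below integrates it with a simple Euler step in Python (the original unit used BASIC); the mass and drag constant are arbitrary illustrative values.

      import math

      g, m, k = 9.81, 80.0, 0.25      # m/s^2, kg, kg/m (assumed values)
      dt, t, v, y = 0.01, 0.0, 0.0, 0.0

      while t < 30.0:
          v += (g - (k / m) * v * v) * dt   # quadratic-drag acceleration
          y += v * dt
          t += dt

      print(f"speed after 30 s:  {v:.1f} m/s")
      print(f"terminal velocity: {math.sqrt(m * g / k):.1f} m/s")  # sqrt(mg/k)
      print(f"distance fallen:   {y:.0f} m")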

  13. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently in the polynomial scale. Traditionally, those simulations are carried out numerically on classical computers, which are inevitably confronted with the exponential growth of required resources, with the increasing size of quantum systems. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.

  14. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  15. A computer simulator for development of engineering system design methodologies

    Science.gov (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  16. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    Full Text Available Fifth International Conference on Structural Engineering, Mechanics and Computation, Cape Town, South Africa, 2-4 September 2013. Computational fluid dynamics simulations and validation of results. M.A. Sitek, M. Cwik, M.A. Gizejowski, Warsaw...

  17. Probability: Actual Trials, Computer Simulations, and Mathematical Solutions.

    Science.gov (United States)

    Walton, Karen Doyle; Walton, J. Doyle

    The purpose of this teaching unit is to approach elementary probability problems in three ways. First, actual trials are performed and results are recorded. Second, a simple computer simulation of the problem, provided on diskette and written for Apple IIe and IIc computers, is run several times. Finally, the mathematical solution of the problem is…

  18. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    As a result of this, beginners are often at a loss when trying to interact with them. The simulator here proposed therefore is aimed at bridging the gap somewhat, making quantum computer simulation more accessible to novices in the field. Journal of the Nigerian Association of Mathematical Physics Vol. 10 2006: pp. 433- ...

  19. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    Science.gov (United States)

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…

  20. Computational fluid dynamics (CFD) simulation of hot air flow ...

    African Journals Online (AJOL)

    Computational Fluid Dynamics simulation of air flow distribution, air velocity and pressure field pattern as it will affect moisture transient in a cabinet tray dryer is performed using SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...

  1. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  2. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  3. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
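
    The gates named in these abstracts are easy to check against a conventional state-vector calculation. The sketch below is plain linear algebra, not the deterministic event-based method the papers propose: it applies a Hadamard and a controlled-NOT to two qubits, producing a Bell state.

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2.0)   # Hadamard gate
      I = np.eye(2)
      CNOT = np.array([[1, 0, 0, 0],
                       [0, 1, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 1, 0]])                   # qubit 0 controls qubit 1

      psi = np.array([1.0, 0.0, 0.0, 0.0])   # |00>
      psi = np.kron(H, I) @ psi               # Hadamard on qubit 0
      psi = CNOT @ psi                        # entangling gate
      print(psi)                              # (|00> + |11>)/sqrt(2), a Bell state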

  4. Development of a Computer Simulation for a Car Deceleration ...

    African Journals Online (AJOL)

    This is very practical, technical, and it happens every day. In this paper, we studied the factors responsible for this event. Using a computer simulation that is based on a mathematical model, we implemented the simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...

  5. Computer Based Modelling and Simulation-Modelling and ...

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling and Simulation with Probability and Throwing Dice. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 4, April 2001, pp. 69-77.

  6. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better ...

  7. COMPUTER SIMULATION OF A STIRLING REFRIGERATING MACHINE

    Directory of Open Access Journals (Sweden)

    V.V. Trandafilov

    2015-10-01

    Full Text Available In the present numerical research, a mathematical model for precise performance simulation and detailed behavior of a Stirling refrigerating machine is considered. The mathematical model for an alpha Stirling refrigerating machine with helium as the working fluid will be useful in optimizing the mechanical design of these machines. A complete non-linear mathematical model of the machine, including the thermodynamics of helium and heat transfer from the walls, as well as heat transfer and gas resistance in the regenerator, is developed. Non-dimensional groups are derived, and the mathematical model is numerically solved. Important design parameters are varied and their effect on Stirling refrigerating machine performance determined. The simulation results, which include heat transfer and coefficient of performance, are presented.

  8. On Architectural Acoustics Design using Computer Simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2004-01-01

    room acoustic simulation programs it is now possible to subjectively analyze and evaluate acoustic properties prior to the actual construction of a facility. With the right tools applied, the acoustic design can become an integrated part of the architectural design process. The aim of the present paper...... is to investigate the field of application an acoustic simulation program can have during an architectural acoustics design process. A case study is carried out in order to represent the iterative working process of an architect. The working process is divided into five phases and represented by typical results...... in each phase ? exemplified by Bagsværd Church by Jørn Utzon - and a description of which information would be beneficial to progress in the work. Among other things the applicability as a tool giving inspiration for finding forms of structures and rooms for an architect compared with an architect without...

  9. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    ... profile. The pressure profile changes when small molecules partition into the bilayer, and it has previously been suggested that such changes may be related to general anesthesia. MD simulations play an important role when studying the possible coupling between general anesthesia and changes ... in the pressure profile, since the pressure profile cannot be measured in traditional experiments. Even so, pressure profile calculations from MD simulations are not trivial due to both fundamental and technical issues. We addressed two such issues, namely the uniqueness of the pressure profile and the effect ... BtuCD belongs to the adenosine triphosphate (ATP) binding cassette (ABC) transporter family, which uses ATP to drive active transport of a wide variety of compounds across cell membranes. BtuCD accounts for vitamin B12 import into Escherichia coli and is one of the only ABC transporters for which a reliable ...

  10. Computer Simulation of Turbulent Reactive Gas Dynamics

    Directory of Open Access Journals (Sweden)

    Bjørn H. Hjertager

    1984-10-01

    Full Text Available A simulation procedure capable of handling transient compressible flows involving combustion is presented. The method uses the velocity components and pressure as primary flow variables. The differential equations governing the flow are discretized by integration over control volumes. The integration is performed by application of up-wind differencing in a staggered grid system. The solution procedure is an extension of the SIMPLE-algorithm accounting for compressibility effects.

  11. Computer simulation of functioning of elements of security systems

    Science.gov (United States)

    Godovykh, A. V.; Stepanov, B. P.; Sheveleva, A. A.

    2017-01-01

    The article is devoted to issues of development of the informational complex for simulation of functioning of the security system elements. The complex is described from the point of view of main objectives, a design concept and an interrelation of main elements. The proposed conception of the computer simulation provides an opportunity to simulate processes of security system work for training security staff during normal and emergency operation.

  12. Simulation of scanning transmission electron microscope images on desktop computers

    Energy Technology Data Exchange (ETDEWEB)

    Dwyer, C., E-mail: christian.dwyer@mcem.monash.edu.au [Monash Centre for Electron Microscopy, Department of Materials Engineering, Monash University, Victoria 3800 (Australia)

    2010-02-15

    Two independent strategies are presented for reducing the computation time of multislice simulations of scanning transmission electron microscope (STEM) images: (1) optimal probe sampling, and (2) the use of desktop graphics processing units. The first strategy is applicable to STEM images generated by elastic and/or inelastic scattering, and requires minimal effort for its implementation. Used together, these two strategies can reduce typical computation times from days to hours, allowing practical simulation of STEM images of general atomic structures on a desktop computer.

  13. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

    Science.gov (United States)

    Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)

    2003-01-01

    Advanced high-temperature Ceramic Matrix Composites (CMC) hold an enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost and time prohibitive. Computational simulation then becomes very useful to limit the experimental effort and reduce the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples, along with their use in the design/analysis of certain structural components, is the subject matter of this presentation.

  14. Computer Simulation of Breast Cancer Screening

    Science.gov (United States)

    2001-07-01

    The signals at A and B may be, respectively, written as

      ESF_A = P + S,        (1)
      ESF_B = P + S/2,      (2)

    where P is the primary ... The scatter-to-primary ratio at point A (SPR) may be computed from the digital signal values (among other ways) as

      S = 2 × (ESF_A − ESF_B),   (3)
      P = ESF_A − S,             (4)
      SPR = S / P.               (5)

    [FIG. 4. (a) Matched primary-only and primary-plus-scatter ESFs and (b) the resulting ...]
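
    Equations (1)-(5) reduce to simple arithmetic. The sketch below evaluates them for made-up ESF signal values; the abstract's actual measurements are not recoverable from this record.

      # Evaluate Eqs. (1)-(5) for illustrative (made-up) ESF signal values.
      esf_a = 120.0   # signal at A: primary plus full scatter, P + S
      esf_b = 95.0    # signal at B: primary plus half the scatter, P + S/2

      S = 2.0 * (esf_a - esf_b)   # Eq. (3): scatter
      P = esf_a - S               # Eq. (4): primary
      print(f"scatter S = {S}, primary P = {P}, SPR = {S / P:.2f}")  # Eq. (5)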

  15. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  16. Computer Models Simulate Fine Particle Dispersion

    Science.gov (United States)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  17. Associative Memory computing power and its simulation.

    CERN Document Server

    Ancu, L S; Britzger, D; Giannetti, P; Howarth, J W; Luongo, C; Pandini, C; Schmitt, S; Volpi, G

    2015-01-01

    An important step in the ATLAS upgrade program is the installation of a tracking processor, the Fast Tracker (FTK), with the goal of identifying the tracks of charged particles produced in the LHC's 14 TeV proton-proton collisions. The collisions will generate thousands of hits in each layer of the silicon tracker detector, and track identification is a very challenging computational problem. At the core of the FTK there is an associative memory (AM) system, made with hundreds of AM ASIC chips, specifically designed to allow pattern identification in high-density environments at very high speed. This component organizes the subsequent steps of track identification, providing huge computing power for a specific application. The AM system will in fact be able to reconstruct tracks in tens of microseconds. Within the FTK team there has also been a constant effort to maintain a detailed emulation of the system, to predict the impact of single component features in the final performance and in the ATLAS da...

  18. Computer simulation of vasectomy for wolf control

    Science.gov (United States)

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
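
    The flavor of such rule-based demographic simulations can be conveyed with a far cruder model than the authors'. In the sketch below, every rate is invented for illustration and the pack-level rules are much simpler than the study's demography and dispersal rules; only the qualitative levers (sterilized fraction, immigration) echo the abstract.

      import random

      random.seed(2)

      def simulate(sterile_frac, years=20, packs=12, immigration=0):
          for _ in range(years):
              breeding = packs - round(sterile_frac * packs)  # packs with intact pairs
              dispersers = breeding * 2 + immigration         # surviving pups that disperse
              recolonized = min(dispersers // 6, 3)           # vacant territories refilled
              dissolved = sum(random.random() < 0.15 for _ in range(packs))
              packs = max(packs + recolonized - dissolved, 1)
          return packs

      for frac in (0.0, 0.5):
          print(f"sterilized fraction {frac:.0%}: {simulate(frac)} packs after 20 y")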

  19. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
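
    The attenuation law at the core of such ray-tracing codes is the Beer-Lambert relation, I = I0 · exp(-Σ μᵢ tᵢ), accumulated over the material segments a ray crosses. The sketch below evaluates one ray through a plate containing a void; the attenuation coefficients are illustrative stand-ins for tabulated values, not data from the paper.

      import math

      # One ray through a stack of materials: (material, mu in 1/cm, thickness in cm).
      segments = [
          ("aluminium", 1.0, 2.0),
          ("air void",  0.0, 0.5),    # a defect: essentially no attenuation
          ("aluminium", 1.0, 1.5),
      ]

      optical_depth = sum(mu * t for _, mu, t in segments)
      print(f"transmitted fraction with void:    {math.exp(-optical_depth):.4f}")

      # Contrast against a defect-free path of the same total length (4 cm of metal):
      print(f"transmitted fraction without void: {math.exp(-1.0 * 4.0):.4f}")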

  20. Computation simulation of the nonlinear response of suspension bridges

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load-bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general-purpose finite element software often results in a computational model of such size that excessive computational effort is required for three-dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable-supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special-purpose software program for the nonlinear analysis of cable-supported bridges, and the methodologies and software are described and illustrated in this paper.

  1. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

    Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Contents: Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics. For all readers interested in developing programming habits in the context of doing phy...

  2. Traffic simulations on parallel computers using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.

  3. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    Full Text Available The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, and force during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which can be used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided into a finite number of steps, and the output values have been computed at each deformation step. The results of the simulation have been graphically represented, and suitable corrective measures are recommended if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve productivity and reduce the energy consumption of the overall process for components manufactured by closed die forging, and would contribute towards the efforts in reducing global warming.

  4. Computer simulations of adsorbed liquid crystal films

    Science.gov (United States)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  5. Osmosis : a molecular dynamics computer simulation study

    Science.gov (United States)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.
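
    For scale, the macroscopic osmotic pressure that such MD studies probe follows, in the dilute limit, the van 't Hoff relation Π = cRT. The quick evaluation below uses an assumed 0.1 M solute concentration at room temperature, purely for illustration.

      # Van 't Hoff estimate of osmotic pressure in the dilute limit, Pi = c R T.
      R = 8.314          # gas constant, J/(mol K)
      T = 298.0          # temperature, K
      c = 100.0          # solute concentration, mol/m^3 (0.1 mol/L, assumed)

      Pi = c * R * T     # Pa
      print(f"osmotic pressure: {Pi / 1e5:.2f} bar")   # about 2.5 bar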

  6. Teaching Physics (and Some Computation) Using Intentionally Incorrect Simulations

    Science.gov (United States)

    Cox, Anne J.; Junkin, William F.; Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco

    2011-05-01

    Computer simulations are widely used in physics instruction because they can aid student visualization of abstract concepts, they can provide multiple representations of concepts (graphical, trajectories, charts), they can approximate real-world examples, and they can engage students interactively, all of which can enhance student understanding of physics concepts. For these reasons, we create and use simulations to teach physics,1,2 but we also want students to recognize that the simulations are only as good as the physics behind them, so we have developed a series of simulations that are intentionally incorrect, where the task is for students to find and correct the errors.3

  7. Computer simulation tests of optimized neutron powder diffractometer configurations

    Energy Technology Data Exchange (ETDEWEB)

    Cussen, L.D., E-mail: Leo@CussenConsulting.com [Cussen Consulting, 23 Burgundy Drive, Doncaster 3108 (Australia); Lieutenant, K., E-mail: Klaus.Lieutenant@helmholtz-berlin.de [Helmholtz Zentrum Berlin, Hahn-Meitner Platz 1, 14109 Berlin (Germany)

    2016-06-21

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant-wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure, which differs from the optimization for triclinic-structure samples. A novel primary spectrometer design is discussed, and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  8. Computational algorithms to simulate the steel continuous casting

    Science.gov (United States)

    Ramírez-López, A.; Soto-Cortés, G.; Palomar-Pardavé, M.; Romero-Romo, M. A.; Aguilar-López, R.

    2010-10-01

    Computational simulation is a very powerful tool for analyzing industrial processes to reduce operating risks and improve profits from equipment. The present work describes the development of computational algorithms, based on numerical methods, to create a simulator for the continuous casting process, which is the most popular method of producing steel products for metallurgical industries. The kinematics of industrial processing was computationally reproduced using logically programmed subroutines. The steel cast by each strand was calculated using an iterative method nested in the main loop. The process was repeated at each time step (Δt) to calculate the casting time; simultaneously, the steel billets produced were counted and stored. The subroutines were used to create a computational representation of a continuous casting plant (CCP) and to display a simulation of the steel displacement through the CCP. These algorithms were developed to create a simulator using the programming language C++. Algorithms for computer animation of the continuous casting process were created using a graphical user interface (GUI). Finally, the simulator's functionality was shown and validated by comparison with industrial data on the steel production of three casters.
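
    The nested time-stepping structure described here can be sketched in a few lines: an outer loop advances casting time in steps of Δt while an inner tally converts strand travel into billets cut. All speeds and dimensions below are illustrative assumptions, not the plant data used in the paper.

      # Skeleton of the time-stepping and billet-counting logic described above.
      dt = 1.0                # time step, s
      casting_speed = 0.025   # strand withdrawal speed, m/s (assumed)
      billet_length = 6.0     # m (assumed)
      strands = 3             # casters running in parallel (assumed)
      shift = 8 * 3600.0      # one 8-hour shift, s

      t, cast_length, billets = 0.0, 0.0, 0
      while t < shift:
          t += dt
          cast_length += casting_speed * dt     # steel cast per strand this step
          while cast_length >= billet_length:   # cut and count finished billets
              cast_length -= billet_length
              billets += strands

      print(f"billets produced in one shift: {billets}")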

  9. Computer Simulation for Pain Management Education: A Pilot Study.

    Science.gov (United States)

    Allred, Kelly; Gerardi, Nicole

    2017-10-01

    Effective pain management is an elusive concept in acute care. Inadequate knowledge has been identified as a barrier to providing optimal pain management. This study aimed to determine student perceptions of an interactive computer simulation as a potential method for learning pain management, as a motivator to read and learn more about pain management, preference over traditional lecture, and its potential to change nursing practice. A post-computer simulation survey with a mixed-methods descriptive design was used in this study. A college of nursing in a large metropolitan university in the Southeast United States. A convenience sample of 30 nursing students in a Bachelor of Science nursing program. An interactive computer simulation was developed as a potential alternative method of teaching pain management to nursing students. Increases in educational gain as well as its potential to change practice were explored. Each participant was asked to complete a survey consisting of 10 standard 5-point Likert scale items and 5 open-ended questions. The survey was used to evaluate the students' perception of the simulation, specifically related to educational benefit, preference compared with traditional teaching methods, and perceived potential to change nursing practice. Data provided descriptive statistics for initial evaluation of the computer simulation. The responses on the survey suggest nursing students perceive the computer simulation to be entertaining, fun, educational, occasionally preferred over regular lecture, and with potential to change practice. Preliminary data support the use of computer simulation in educating nursing students about pain management. Copyright © 2017 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.

  10. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    Full Text Available The computer architecture and organization course is essential in all computer science and engineering programs, and it is among the most frequently chosen and best liked electives in related engineering disciplines. This attractiveness brings a new challenge: it requires considerable effort from the instructor to explain rather complicated concepts to beginners or to students of related disciplines. The use of visual simulators can improve both the teaching and the learning process. The overall goal is twofold: 1) to provide a visual environment for explaining the basic concepts, and 2) to increase the students' willingness and ability to learn the material. Many visual simulators have been used for the computer architecture and organization course; however, because visual simulators for cache memory concepts were lacking, we have developed a new one, the EDUCache simulator. In this paper we show that it can be used effectively and efficiently as a supporting tool in learning about modern multi-layer, multi-cache, and multi-core multi-processors. EDUCache's features provide an environment for performance evaluation and engineering of software systems, so that students will also understand the importance of the building blocks of computer architecture and, hopefully, grow more curious about hardware courses in general.
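
    As a hypothetical illustration of the kind of concept EDUCache visualizes, the following is a minimal direct-mapped cache model with hit/miss counting; it is far simpler than the multi-layer, multi-core caches the simulator actually covers.

```python
# A minimal direct-mapped cache model (illustrative only; EDUCache itself
# covers multi-layer, multi-core cache hierarchies).
class DirectMappedCache:
    def __init__(self, n_lines=8, line_size=16):
        self.n_lines, self.line_size = n_lines, line_size
        self.tags = [None] * n_lines          # one tag per cache line
        self.hits = self.misses = 0

    def access(self, address):
        block = address // self.line_size     # which memory block is touched
        index = block % self.n_lines          # which cache line it maps to
        tag = block // self.n_lines           # identifies the block in that line
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.misses += 1
            self.tags[index] = tag            # replace the line on a miss

cache = DirectMappedCache()
for addr in [0, 4, 16, 128, 0, 128]:          # 128 conflicts with 0 here
    cache.access(addr)
print(cache.hits, cache.misses)               # -> 1 5
```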

  11. Macroevolution simulated with autonomously replicating computer programs.

    Science.gov (United States)

    Yedid, Gabriel; Bell, Graham

    The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.

  12. Associative Memory computing power and its simulation.

    CERN Document Server

    Volpi, G; The ATLAS collaboration

    2014-01-01

    The associative memory (AM) chip is an ASIC device designed specifically to perform "pattern matching" at very high speed and with parallel access to memory locations. The most extensive use of such a device will be the ATLAS Fast Tracker (FTK) processor, where more than 8,000 chips will be installed in 128 VME boards specifically designed for high throughput in order to exploit the chip's features. Each AM chip will store a database of about 130,000 pre-calculated patterns, allowing FTK to use about 1 billion patterns for the whole system; any data inquiry is broadcast to all memory elements simultaneously within the same clock cycle (10 ns), so data retrieval time is independent of the database size. Speed and size of the system are crucial for real-time high-energy-physics applications such as the ATLAS FTK processor. Using the 80 million channels of the ATLAS tracker, FTK finds tracks within 100 μs. The simulation of such a parallelized system is an extremely complex task when executed in comm...
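
    The broadcast-compare principle can be sketched in a few lines. Here a vectorized NumPy comparison stands in for the chip's parallel hardware match; the pattern contents are made up, and real AM chips support additional features (such as don't-care bits) omitted here.

```python
import numpy as np

# The incoming word set is compared against every stored pattern
# "simultaneously"; a vectorized comparison mimics the parallel match.
rng = np.random.default_rng(0)
patterns = rng.integers(0, 2**16, size=(130_000, 8))   # ~130k 8-word patterns

def matching_patterns(query):
    # a pattern fires only when all 8 stored words equal the broadcast words
    return np.flatnonzero((patterns == query).all(axis=1))

print(matching_patterns(patterns[42]))   # -> includes pattern id 42
```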

  13. Coupling Computer-Aided Process Simulation and ...

    Science.gov (United States)

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies, and the modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable

  14. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Qian, Penny Xuran; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible through the development of reliable numerical algorithms and the high performance of modern computing technologies, which enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage: processes tend to be interrupted by unexpected events in the software or the hardware, and the scientist then handles the interrupt manually, which is time-consuming and error-prone. We present the Simulation Monitor (SiMon) to automate the farming of large numbers of extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, so that running a simulation becomes analogous to growing crops; with SiMon we relax the technical burden of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the processing and reduction of observational data.

  15. Hybrid annealing: Coupling a quantum simulator to a classical computer

    Science.gov (United States)

    Graß, Tobias; Lewenstein, Maciej

    2017-05-01

    Finding the global minimum in a rugged potential landscape is a computationally hard task, often equivalent to relevant optimization problems. Annealing strategies, either classical or quantum, explore the configuration space by evolving the system under the influence of thermal or quantum fluctuations. The thermal annealing dynamics can rapidly freeze the system into a low-energy configuration, and it can be simulated well on a classical computer, but it easily gets stuck in local minima. Quantum annealing, on the other hand, can be guaranteed to find the true ground state and can be implemented in modern quantum simulators; however, quantum adiabatic schemes become prohibitively slow in the presence of quasidegeneracies. Here, we propose a strategy which combines ideas from simulated annealing and quantum annealing. In such a hybrid algorithm, the outcome of a quantum simulator is processed on a classical device. While the quantum simulator explores the configuration space by repeatedly applying quantum fluctuations and performing projective measurements, the classical computer evaluates each configuration and enforces a lowering of the energy. We have simulated this algorithm for small instances of the random energy model, showing that it potentially outperforms both simulated thermal annealing and adiabatic quantum annealing. It becomes most efficient for problems involving many quasidegenerate ground states.
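
    A toy rendition of the hybrid loop on the random energy model, where a classical random bit-flip proposal stands in for the quantum simulator's fluctuation-plus-projective-measurement step (the paper's proposal distribution is genuinely quantum; this sketch only shows the classical post-selection structure).

```python
import numpy as np

# Toy hybrid loop on the random energy model (REM): the energies of the 2**N
# configurations are i.i.d. Gaussian. A random bit flip stands in for the
# quantum fluctuation + projective measurement; the classical computer keeps
# a proposed configuration only if it lowers the energy.
rng = np.random.default_rng(0)
N = 12
energy = rng.normal(size=2**N)               # random energy model instance
state = int(rng.integers(2**N))              # initial configuration
for _ in range(5000):
    proposal = state ^ (1 << int(rng.integers(N)))   # flip one random bit
    if energy[proposal] < energy[state]:             # enforce energy lowering
        state = proposal
print(energy[state], energy.min())           # found minimum vs. global minimum
```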

  16. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  17. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical power analysis as a way to determine the appropriate number of runs. Two examples are then produced using results from an agent-based model. The reader is then guided through the application of this statistical technique and exposed to its limits and potentials.
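
    As an illustration of the power-analysis step, the usual normal-approximation formula n = 2((z_{1-alpha/2} + z_{1-beta})/d)^2 gives the number of runs per condition needed to detect a standardized effect size d in a two-sample comparison (the alpha and power values below are example choices; the chapter's own worked examples use an agent-based model).

```python
from scipy.stats import norm

# Runs per condition for a two-sample comparison to detect a standardized
# effect size d at significance level alpha with the given power
# (normal approximation; alpha and power values are just examples).
def runs_needed(d, alpha=0.05, power=0.8):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return round(2 * ((z_a + z_b) / d) ** 2)

print(runs_needed(0.5))   # medium effect -> 63 runs per condition
print(runs_needed(0.2))   # small effect  -> 392 runs per condition
```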

  18. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications with remote ship scenarios and automation of ship operations.

  19. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S) users. Performing large-scale, massively...

  20. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  1. Towards accurate quantum simulations of large systems with small computers.

    Science.gov (United States)

    Yang, Yonggang

    2017-01-24

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equation. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of the numerical calculations beyond what is otherwise feasible with conventional methods. The method is easy to implement and general enough for many systems.

  2. Improved Pyrolysis Micro-reactor Design via Computational Fluid Dynamics Simulations

    Science.gov (United States)

    2017-05-23

    Briefing charts (25 April 2017 - 23 May 2017): Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations, by Ghanshyam L. Vaghjiani. The charts cover the history of the micro-reactor (Chen source), operated at T ≤ 1800 K; see S.D. Chambreau et al., International Journal of Mass Spectrometry 2000, 199, 17-27. DISTRIBUTION A: Approved for public release.

  3. Computer simulations for thorium doped tungsten crystals

    Energy Technology Data Exchange (ETDEWEB)

    Eberhard, Bernd

    2009-07-17

    set of Langevin equations, i.e., stochastic differential equations including properly chosen 'noise' terms. A new integration scheme, closely resembling the well-known Velocity Verlet algorithm, is derived for integrating the equations of motion. As a first application of the EAM potentials, we calculate the phonon dispersion for tungsten and thorium. Furthermore, the potentials are used to derive the excess volumes of point defects, i.e., of vacancies and Th impurities in tungsten, as well as grain-boundary structures and energies. Additionally, we take a closer look at various stacking-fault energies and link the results to the possible splitting of screw dislocations in tungsten into partials. We also compare the energetic stability of screw, edge, and mixed-type dislocations. Besides this, we are interested in free-enthalpy differences, for which we make use of the Overlapping Distribution Method (ODM), an efficient, albeit computationally demanding, method for calculating free-enthalpy differences, with which we address the questions of lattice formation, vacancy formation, and impurity formation at varying temperatures. (orig.)
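
    For orientation, a generic Langevin-dynamics step looks like the following Euler-Maruyama sketch; the thesis derives a more accurate Velocity-Verlet-like scheme, and the parameters here are purely illustrative.

```python
import numpy as np

# One Euler-Maruyama step of the underdamped Langevin equation
#   m dv = (F(x) - m*gamma*v) dt + sqrt(2*m*gamma*kT) dW
# (the thesis uses a more accurate Velocity-Verlet-like scheme).
def langevin_step(x, v, force, dt, gamma, kT, mass, rng):
    noise = np.sqrt(2 * gamma * kT * dt / mass) * rng.normal(size=np.shape(x))
    v = v + dt * (force(x) / mass - gamma * v) + noise
    x = x + dt * v
    return x, v

# usage: a particle relaxing in a harmonic well (all parameters assumed)
rng = np.random.default_rng(1)
x, v = 1.0, 0.0
for _ in range(10_000):
    x, v = langevin_step(x, v, lambda q: -q, 1e-3, 1.0, 0.1, 1.0, rng)
print(x, v)
```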

  4. Quantum computer simulation using the CUDA programming model

    Science.gov (United States)

    Gutiérrez, Eladio; Romero, Sergio; Trenas, María A.; Zapata, Emilio L.

    2010-02-01

    Quantum computing is a field that attracts great theoretical interest. Its simulation represents a problem with high memory and computational requirements, which makes the use of parallel platforms advisable. In this work we deal with the simulation of an ideal quantum computer on the Compute Unified Device Architecture (CUDA), as this problem can benefit from the high computational capacity of Graphics Processing Units (GPUs). Modern GPUs are becoming very powerful computational architectures, which is causing growing interest in their application to general-purpose computation. CUDA provides an execution model oriented towards such general exploitation of the GPU, allowing its use as a massively parallel SIMT (Single-Instruction Multiple-Thread) multiprocessor. A simulator that takes memory reference locality into account is proposed, showing that the challenge of achieving high performance depends strongly on the explicit exploitation of the memory hierarchy. Several strategies have been experimentally evaluated, obtaining good performance results in comparison with conventional platforms.
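
    The memory-access pattern at the heart of the locality problem can be seen in a plain NumPy version of applying a single-qubit gate to an n-qubit state vector: amplitudes are paired at a stride of 2^target, which is exactly what makes explicit exploitation of the memory hierarchy decisive on a GPU. This is an illustration, not the paper's CUDA kernel.

```python
import numpy as np

# Apply a single-qubit gate to an n-qubit state vector. Amplitudes are paired
# at stride 2**target, the access pattern that makes memory locality decisive.
def apply_gate(state, gate, target, n_qubits):
    psi = state.reshape([2] * n_qubits)             # one tensor axis per qubit
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    return np.moveaxis(psi, 0, target).reshape(-1)  # restore axis order

n = 10
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                      # |00...0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
state = apply_gate(state, H, target=0, n_qubits=n)
print(np.flatnonzero(state))                        # two amplitudes, 2**(n-1) apart
```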

  5. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  6. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    Science.gov (United States)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research based approaches to integrated computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research based computer simulations developed by the physics education research group at University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  7. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
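
    A minimal sketch of load-prediction scheduling in the spirit described: the next step's work is split between CPU and GPU in proportion to each device's measured throughput. The paper's actual algorithm is more elaborate, and the numbers below are invented.

```python
# Split the next step's work between CPU and GPU in proportion to throughput
# predicted from previous steps (illustrative; the paper's algorithm differs).
def split_work(n_items, history_cpu, history_gpu):
    t_cpu = sum(history_cpu) / len(history_cpu)   # predicted items/s, CPU
    t_gpu = sum(history_gpu) / len(history_gpu)   # predicted items/s, GPU
    n_gpu = round(n_items * t_gpu / (t_cpu + t_gpu))
    return n_items - n_gpu, n_gpu

print(split_work(10_000, history_cpu=[950, 1020], history_gpu=[4100, 3900]))
# -> (1976, 8024): the faster GPU receives roughly four times the CPU's share
```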

  8. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  9. Simulation of Turing Machine with uEAC-Computable Functions

    Directory of Open Access Journals (Sweden)

    Yilin Zhu

    2015-01-01

    Full Text Available The micro-Extended Analog Computer (uEAC) is an electronic implementation inspired by Rubel's EAC model. In this study, a fully connected array of uEACs is proposed to overcome the limitations of a single uEAC; within the array, each uEAC unit is connected to all the other units by weights. The array's computational capabilities are then investigated by proving that a Turing machine M can be simulated with uEAC-computable functions, even in the presence of bounded noise.

  10. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  11. Multi-threaded, discrete event simulation of distributed computing systems

    Science.gov (United States)

    Legrand, Iosif; MONARC Collaboration

    2001-10-01

    The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and model of data access patterns and of the many jobs running concurrently on large-scale distributed systems while exchanging very large amounts of data. A process-oriented approach to discrete event simulation is well suited to describing the various activities running concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects, or "active objects," provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java(TM) technology, which provides adequate tools for developing a flexible and distributed process-oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status, and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large-scale computing systems, over a wide range of possible architectures from centralized to highly distributed. A comparison between queuing theory and realistic client-server measurements is also presented.
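
    The process-oriented discrete event core can be sketched with a priority queue of timestamped events (Python here rather than MONARC's Java; the job arrival and completion events are made-up examples).

```python
import heapq

# Minimal process-style discrete event loop: events are (time, name, payload)
# tuples in a priority queue, and handlers may schedule further events.
def run(initial_events, horizon=100.0):
    queue = list(initial_events)
    heapq.heapify(queue)
    while queue:
        t, name, job = heapq.heappop(queue)   # always advance to earliest event
        if t > horizon:
            break
        print(f"t={t:6.2f}  {name}  job={job}")
        if name == "job_arrival":             # each job runs for 3 time units
            heapq.heappush(queue, (t + 3.0, "job_done", job))

run([(0.0, "job_arrival", 1), (1.5, "job_arrival", 2), (2.0, "job_arrival", 3)])
```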

  12. Improving a Computer Networks Course Using the Partov Simulation Engine

    Science.gov (United States)

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  13. Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2011-01-01

    Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

  14. Atomic Force Microscopy and Real Atomic Resolution. Simple Computer Simulations

    NARCIS (Netherlands)

    Koutsos, V.; Manias, E.; Brinke, G. ten; Hadziioannou, G.

    1994-01-01

    Using a simple computer simulation for AFM imaging in the contact mode, pictures with true and false atomic resolution are demonstrated. The surface probed consists of two f.c.c. (111) planes and an atomic vacancy is introduced in the upper layer. Changing the size of the effective tip and its

  15. Using computer simulations to improve concept formation in chemistry

    African Journals Online (AJOL)

    By incorporating more visual material into a chemistry lecture, the lecturer may succeed in limiting the overloading of the students' short-term memory, often the major factor leading to misconceptions. The goal of this research project was to investigate whether computer simulations used as a visually-supporting ...

  16. Computer Simulation of the Impact of Cigarette Smoking On Humans

    African Journals Online (AJOL)

    In this edition, emphasis has been laid on computer simulation of the impact of cigarette smoking on the population between now and the next 50 years, if no government intervention is exercised to control the behaviour of smokers. The statistical indices derived from the previous article (WAJIAR Volume 4) in the series ...

  17. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.

  18. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  19. Development of computer simulation models for pedestrian subsystem impact tests

    NARCIS (Netherlands)

    Kant, R.; Konosu, A.; Ishikawa, H.

    2000-01-01

    The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a

  20. Computer simulation of cytoskeleton-induced blebbing in lipid membranes

    DEFF Research Database (Denmark)

    Spangler, E. J.; Harvey, C. W.; Revalee, J. D.

    2011-01-01

    Blebs are balloon-shaped membrane protrusions that form during many physiological processes. Using computer simulation of a particle-based model for self-assembled lipid bilayers coupled to an elastic meshwork, we investigated the phase behavior and kinetics of blebbing. We found that blebs form...

  1. Learner Perceptions of Realism and Magic in Computer Simulations.

    Science.gov (United States)

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…

  2. Scaffolding learners in designing investigation assignments for a computer simulation

    NARCIS (Netherlands)

    Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.

    2006-01-01

    This study examined the effect of scaffolding students who learned by designing assignments for a computer simulation on the physics topic of alternating circuits. We compared the students' assignments and the knowledge acquired in a scaffolded group (N=23) and a non-scaffolded group (N=19). The

  3. Biology Students Building Computer Simulations Using StarLogo TNG

    Science.gov (United States)

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  4. Pedagogical Approaches to Teaching with Computer Simulations in Science Education

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Johan (CTIT); van Joolingen, Wouter; McBride, Ron; Searson, Michael

    2013-01-01

    For this study we interviewed 24 physics teachers about their opinions on teaching with computer simulations. The purpose of this study is to investigate whether it is possible to distinguish different types of teaching approaches. Our results indicate the existence of two types. The first type is

  5. The acoustical history of Hagia Sophia revived through computer simulations

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Weitze, C.A.; Christensen, Claus Lynge

    2002-01-01

    The present paper deals with acoustic computer simulations of Hagia Sophia, which is characterized not only by being one of the largest worship buildings in the world, but also by – in its 1500 year history – having served three purposes: as a church, as a mosque and today as a museum...

  6. Computer simulation study of water using a fluctuating charge model

    Indian Academy of Sciences (India)

    Unknown

    Abstract. Hydrogen bonding in small water clusters is studied through computer simulation methods using a sophisticated, empirical model of interaction developed by Rick et al (S W Rick, S J Stuart and B J Berne 1994 J. Chem. Phys. 101 6141) and others. The model allows for the charges on the interacting sites to ...

  7. COMPUTER SIMULATION OF DISPERSED MATERIALS MOTION IN ROTARY TILTING FURNACES

    Directory of Open Access Journals (Sweden)

    S. L. Rovin

    2016-01-01

    Full Text Available The article presents the results of computer simulation of the motion of dispersed materials in rotary furnaces with an inclined axis of rotation. New data have been obtained on the dynamics of the working layer, enhancing the understanding of the heat- and mass-transfer processes occurring within it.

  8. Monte Carlo simulation by computer for life-cycle costing

    Science.gov (United States)

    Gralow, F. H.; Larson, W. J.

    1969-01-01

    Prediction of behavior and support requirements during the entire life cycle of a system enables accurate cost estimates by using the Monte Carlo simulation by computer. The system reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
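
    A minimal Monte Carlo life-cycle-cost sketch in this spirit, with illustrative (not historical) distributions for procurement, operating, and maintenance costs:

```python
import numpy as np

# Life-cycle cost = procurement + uncertain yearly operation and maintenance,
# sampled over the system life (all distributions/figures are illustrative).
rng = np.random.default_rng(0)
n_trials, years = 100_000, 15
procurement = 1.2e6
operation = rng.normal(80_000, 10_000, size=(n_trials, years)).clip(min=0)
failures = rng.poisson(0.4, size=(n_trials, years))          # failures per year
repair_cost = rng.normal(25_000, 5_000, size=(n_trials, years)).clip(min=0)
total = procurement + (operation + failures * repair_cost).sum(axis=1)
print(f"mean {total.mean():,.0f}  5th-95th pct {np.percentile(total, [5, 95])}")
```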

  9. Computational Simulation of a Water-Cooled Heat Pump

    Science.gov (United States)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  10. Simulating the immune response on a distributed parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Castiglione, F. [Univ. of Catania (Italy); Bernaschi, M. [Via Shanghai, Rome (Italy); Succi, S. [IAC/CNR, Rome (Italy)

    1997-06-01

    The application of ideas and methods of statistical mechanics to problems of biological relevance is one of the most promising frontiers of theoretical and computational mathematical physics. Among others, the computer simulation of the immune system dynamics stands out as one of the prominent candidates for this type of investigations. In the recent years immunological research has been drawing increasing benefits from the resort to advanced mathematical modeling on modern computers. Among others, Cellular Automata (CA), i.e., fully discrete dynamical systems evolving according to boolean laws, appear to be extremely well suited to computer simulation of biological systems. A prominent example of immunological CA is represented by the Celada-Seiden automaton, that has proven capable of providing several new insights into the dynamics of the immune system response. To date, the Celada-Seiden automaton was not in a position to exploit the impressive advances of computer technology, and notably parallel processing, simply because no parallel version of this automaton had been developed yet. In this paper we fill this gap and describe a parallel version of the Celada-Seiden cellular automaton aimed at simulating the dynamic response of the immune system. Details on the parallel implementation as well as performance data on the IBM SP2 parallel platform are presented and commented on.

  11. Computer-simulated development process of Chinese characters font cognition

    Science.gov (United States)

    Chen, Jing; Mu, Zhichun; Sun, Dehui; Hu, Dunli

    2008-10-01

    The cognition of Chinese characters is an important research topic in cognitive science and computer science, especially artificial intelligence. In this paper, based on the traits of Chinese characters, a database of Chinese character font representations and a model for computer simulation of Chinese character font cognition are constructed from the perspective of cognitive science. Font cognition of Chinese characters is a gradual process involving the accumulation of knowledge. Using computer simulation, a development model of Chinese character cognition was constructed; this is an important part of research on Chinese character cognition. The model is based on a self-organizing neural network (SOFM) and an adaptive resonance theory (ART2) network. By combining the SOFM and ART2 networks, two sets of input were trained. Through training and testing, the development process of Chinese character font cognition was simulated, and the results from this model were compared with results obtained using SOFM alone. The analysis suggests that the model is able to account for some empirical results and can thus, to a degree, simulate the development process of Chinese character cognition.

  12. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, while application-driven demands on computing methods and power continue to grow; interdisciplinary approaches therefore become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing together with symbolic and modern functional programming. The target application is the human spine: simulations of the spine help us investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used. The first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (an iterative method) and a frontal solver (a direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid-body system simulations is also presented; it automatically generates C code from a problem specification expressed in the Lagrange formalism using Maple.
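
    For reference, the iterative solver mentioned is the textbook conjugate gradient method for symmetric positive-definite systems; below is a compact dense-matrix sketch, whereas the SPINET implementation targets large sparse finite element systems in parallel.

```python
import numpy as np

# Textbook conjugate gradient for a symmetric positive-definite system Ax = b
# (dense here for brevity; SPINET's solver targets large sparse FE systems).
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x                     # residual
    p = r.copy()                      # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)         # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p     # next A-conjugate direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))       # -> [0.0909... 0.6363...]
```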

  13. Computer simulation and image guidance for individualised dynamic spinal stabilization.

    Science.gov (United States)

    Kantelhardt, S R; Hausen, U; Kosterhon, M; Amr, A N; Gruber, K; Giese, A

    2015-08-01

    Dynamic implants for the human spine are used to re-establish regular segmental motion. However, the results have often been unsatisfactory and complications such as screw loosening are common. Individualisation of appliances and precision implantation are needed to improve the outcome of this procedure. Computer simulation, virtual implant optimisation and image guidance were used to improve the technique. A human lumbar spine computer model was developed using multi-body simulation software. The model simulates spinal motion under load and degenerative changes. After virtual degeneration of a L4/5 segment, virtual pedicle screw-based implants were introduced. The implants' positions and properties were iteratively optimised. The resulting implant positions were used as operative plan for image guidance and finally implemented in a physical spine model. In the simulation, the introduction and optimisation of virtually designed dynamic implants could partly compensate for the effects of virtual lumbar segment degeneration. The optimised operative plan was exported to two different image-guidance systems for transfer to a physical spine model. Three-dimensional computer graphic simulation is a feasible means to develop operative plans for dynamic spinal stabilization. These operative plans can be transferred to commercially available image-guidance systems for use in implantation of physical implants in a spine model. This concept has important potential in the design of operative plans and implants for individualised dynamic spine stabilization surgery.

  14. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    Science.gov (United States)

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  15. Computational cell biology: spatiotemporal simulation of cellular events.

    Science.gov (United States)

    Slepchenko, Boris M; Schaff, James C; Carson, John H; Loew, Leslie M

    2002-01-01

    The field of computational cell biology has emerged within the past 5 years because of the need to apply disciplined computational approaches to build and test complex hypotheses on the interacting structural, physical, and chemical features that underlie intracellular processes. To meet this need, newly developed software tools allow cell biologists and biophysicists to build models and generate simulations from them. The construction of general-purpose computational approaches is especially challenging if the spatial complexity of cellular systems is to be explicitly treated. This review surveys some of the existing efforts in this field with special emphasis on a system being developed in the authors' laboratory, Virtual Cell. The theories behind both stochastic and deterministic simulations are discussed. Examples of respective applications to cell biological problems in RNA trafficking and neuronal calcium dynamics are provided to illustrate these ideas.

  16. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  17. AFFECTIVE COMPUTING AND AUGMENTED REALITY FOR CAR DRIVING SIMULATORS

    Directory of Open Access Journals (Sweden)

    Dragoș Datcu

    2017-12-01

    Full Text Available Car simulators are essential for training and for analyzing the behavior, responses, and performance of the driver. Augmented Reality (AR) is the technology that enables virtual images to be overlaid on views of the real world. Affective Computing (AC) is the technology that helps computer systems read emotions by analyzing body gestures, facial expressions, speech, and physiological signals. The key aspect of this research is the investigation of novel interfaces that help build situational awareness and emotional awareness, to enable affect-driven remote collaboration in AR for car driving simulators. The problem addressed is how to build situational awareness (using AR technology) and emotional awareness (using AC technology), and how to integrate these two distinct technologies [4] into a unified affective framework for training in a car driving simulator.

  18. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive, and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often, unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: the power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. Memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  19. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style. It contains very-well-presented and simple mathematical descriptions of many of the most important algorithms used in computational physics. Many clear mathematical descriptions of important techniques in computational physics are given. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed including not only the standard Euler and Runge Kutta method but also multistep methods and the class of Verlet methods which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  20. Two-dimensional computer simulation of high intensity proton beams

    CERN Document Server

    Lapostolle, Pierre M

    1972-01-01

    A computer program has been developed which simulates the two- dimensional transverse behaviour of a proton beam in a focusing channel. The model is represented by an assembly of a few thousand 'superparticles' acted upon by their own self-consistent electric field and an external focusing force. The evolution of the system is computed stepwise in time by successively solving Poisson's equation and Newton's law of motion. Fast Fourier transform techniques are used for speed in the solution of Poisson's equation, while extensive area weighting is utilized for the accurate evaluation of electric field components. A computer experiment has been performed on the CERN CDC 6600 computer to study the nonlinear behaviour of an intense beam in phase space, showing under certain circumstances a filamentation due to space charge and an apparent emittance growth. (14 refs).
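
    The solve-Poisson-then-push cycle can be condensed into a 2D particle-in-cell sketch. This version assumes a periodic box, nearest-grid-point charge deposition, an FFT Poisson solve, and unit charge and mass; it illustrates the structure of the method rather than reproducing the CERN program.

```python
import numpy as np

# One 2D particle-in-cell cycle: deposit charge on a grid, solve Poisson's
# equation with FFTs (periodic box assumed), interpolate the field back, and
# push the particles with Newton's law (unit charge and mass assumed).
n_grid, n_part, dt = 64, 4096, 0.05
rng = np.random.default_rng(0)
pos = rng.uniform(0, n_grid, size=(n_part, 2))
vel = rng.normal(0, 0.5, size=(n_part, 2))
kx = 2 * np.pi * np.fft.fftfreq(n_grid)
k2 = kx[:, None]**2 + kx[None, :]**2
k2[0, 0] = 1.0                                    # avoid dividing mean mode by 0

for step in range(100):
    ij = pos.astype(int) % n_grid                 # nearest-grid-point deposition
    rho = np.zeros((n_grid, n_grid))
    np.add.at(rho, (ij[:, 0], ij[:, 1]), 1.0)
    phi_k = np.fft.fft2(rho - rho.mean()) / k2    # solve -k^2 phi_k = -rho_k
    Ex = np.real(np.fft.ifft2(-1j * kx[:, None] * phi_k))   # E = -grad(phi)
    Ey = np.real(np.fft.ifft2(-1j * kx[None, :] * phi_k))
    E = np.stack([Ex[ij[:, 0], ij[:, 1]], Ey[ij[:, 0], ij[:, 1]]], axis=1)
    vel += dt * E                                 # Newton's law of motion
    pos = (pos + dt * vel) % n_grid               # periodic boundaries
```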

  1. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics (CFD) program for room air distribution is introduced to improve the predictions of both energy consumption and the indoor environment. The building energy performance simulation program requires a detailed description of the energy flow in the air movement, which can be obtained from a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three … program and a building energy performance simulation program will improve both the energy consumption data and the prediction of thermal comfort and air quality in a selected area of the building.

  2. Subject-specific geometrical detail rather than cost function formulation affects hip loading calculation.

    Science.gov (United States)

    Wesseling, Mariska; De Groote, Friedl; Bosmans, Lode; Bartels, Ward; Meyer, Christophe; Desloovere, Kaat; Jonkers, Ilse

    2016-11-01

    This study assessed the relative importance of introducing an increasing level of medical image-based subject-specific detail in bone and muscle geometry in the musculoskeletal model, on calculated hip contact forces during gait. These forces were compared to introducing minimization of hip contact forces in the optimization criterion. With an increasing level of subject-specific detail, specifically MRI-based geometry and wrapping surfaces representing the hip capsule, hip contact forces decreased and were more comparable to contact forces measured using instrumented prostheses (average difference of 0.69 BW at the first peak compared to 1.04 BW for the generic model). Inclusion of subject-specific wrapping surfaces in the model had a greater effect than altering the cost function definition.

  3. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

  4. Subject-specific musculoskeletal modeling in the evaluation of shoulder muscle and joint function.

    Science.gov (United States)

    Wu, Wen; Lee, Peter V S; Bryant, Adam L; Galea, Mary; Ackland, David C

    2016-11-07

    Upper limb muscle force estimation using Hill-type muscle models depends on musculotendon parameter values, which cannot be readily measured non-invasively. Generic and scaled-generic parameters may be quickly and easily employed, but these approaches do not account for an individual subject's joint torque capacity. The objective of the present study was to develop a subject-specific experimental testing and modeling framework to evaluate shoulder muscle and joint function during activities of daily living, and to assess the capacity of generic and scaled-generic musculotendon parameters to predict muscle and joint function. Three-dimensional musculoskeletal models of the shoulders of 6 healthy subjects were developed to calculate muscle and glenohumeral joint loading during abduction, flexion, horizontal flexion, nose touching and reaching using subject-specific, scaled-generic and generic musculotendon parameters. Muscle and glenohumeral joint forces calculated using generic and scaled-generic models were significantly different from those of subject-specific models. Muscles in generic musculoskeletal models operated further from the plateau of their force-length curves than those of scaled-generic and subject-specific models, while muscles in subject-specific models operated over a wider region of their force-length curves than those of the generic or scaled-generic models, reflecting the diversity of subject shoulder strength. The findings of this study suggest that generic and scaled-generic musculotendon parameters may not provide sufficient accuracy in predicting shoulder muscle and joint loading when compared to models that employ subject-specific parameter-estimation approaches. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging...... constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations...

  6. Computational Physics Simulation of Classical and Quantum Systems

    CERN Document Server

    Scherer, Philipp O. J

    2010-01-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills.

  7. OSL sensitivity changes during single aliquot procedures: Computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Agersnap Larsen, N.; Bøtter-Jensen, L.

    1997-01-01

    We present computer simulations of sensitivity changes obtained during single aliquot, regeneration procedures. The simulations indicate that the sensitivity changes are the combined result of shallow trap and deep trap effects. Four separate processes have been identified. Although procedures can...... be suggested to eliminate the shallow trap effects, it appears that the deep trap effects cannot be removed. The character of the sensitivity changes which result from these effects is seen to be dependent upon several external parameters, including the extent of bleaching of the OSL signal, the laboratory...

  8. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  9. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper is the result of such a virtual simulation of a fire in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside a trade stand and follows the fire's expansion over three groups of compartments, highlighting heat transfer both in small spaces and over large distances. In order to confirm the accuracy of the simulation, the obtained values are compared to those from the specialized literature.

  10. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  11. Computational electronics semiclassical and quantum device modeling and simulation

    CERN Document Server

    Vasileska, Dragica; Klimeck, Gerhard

    2010-01-01

    Starting with the simplest semiclassical approaches and ending with the description of complex fully quantum-mechanical methods for quantum transport analysis of state-of-the-art devices, Computational Electronics: Semiclassical and Quantum Device Modeling and Simulation provides a comprehensive overview of the essential techniques and methods for effectively analyzing transport in semiconductor devices. With the transistor reaching its limits and new device designs and paradigms of operation being explored, this timely resource delivers the simulation methods needed to properly model state-of

  12. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  13. How to simulate a universal quantum computer using negative probabilities

    Science.gov (United States)

    Hofmann, Holger F.

    2009-07-01

    The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
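
    The signed-weight bookkeeping described above can be sketched in a few lines of Python; the quasi-probabilities {1, 1, -1} are taken from the abstract, while the toy sampling harness (the operations themselves are left abstract) is an assumption for illustration.

        import random

        QUASI_PROBS = [1.0, 1.0, -1.0]            # per-gate weights, from the text
        TOTAL = sum(abs(p) for p in QUASI_PROBS)  # = 3, the per-gate overhead

        def sample_gate():
            # Choose one of the three classical operations with probability
            # |p_i| / TOTAL and return the sign of its quasi-probability.
            i = random.randrange(len(QUASI_PROBS))
            return 1.0 if QUASI_PROBS[i] > 0 else -1.0

        def sequence_weight(n_gates):
            # Weight of one sampled gate sequence: magnitude TOTAL**n_gates,
            # negative iff an odd number of negative-probability ops occurred.
            w = 1.0
            for _ in range(n_gates):
                w *= sample_gate() * TOTAL
            return w

        # The sample mean stays near 1 (the weights are normalized), but the
        # variance grows exponentially with circuit depth: the classical cost
        # that mirrors the quantum speed-up discussed above.
        samples = [sequence_weight(5) for _ in range(100000)]
        print(sum(samples) / len(samples))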

  14. Active adaptive sound control in a duct - A computer simulation

    Science.gov (United States)

    Burgess, J. C.

    1981-09-01

    A digital computer simulation of adaptive closed-loop control for a specific application (sound cancellation in a duct) is discussed. The principal element is an extension of Sondhi's adaptive echo canceler and Widrow's adaptive noise canceler from signal processing to control. Thus, the adaptive algorithm is based on the LMS gradient search method. The simulation demonstrates that one or more pure tones can be canceled down to the computer bit noise level (-120 dB). When additive white noise is present, pure tones can be canceled to at least 10 dB below the noise spectrum level for SNRs down to at least 0 dB. The underlying theory suggests that the algorithm allows tracking tones with amplitudes and frequencies that change more slowly with time than the adaptive filter adaptation rate. It also implies that the method can cancel narrow-band sound in the presence of spectrally overlapping broadband sound.
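
    A hedged sketch of the LMS update at the heart of such adaptive cancelers follows; the filter length, step size, and single-tone test signal are illustrative assumptions, not parameters from the simulation described above.

        import numpy as np

        def lms_cancel(reference, primary, n_taps=32, mu=0.005):
            # Classic LMS: adapt FIR weights so the filtered reference matches
            # the tonal component of the primary signal; the residual e is the
            # output after cancellation.
            w = np.zeros(n_taps)
            e = np.zeros(len(primary))
            for n in range(n_taps, len(primary)):
                x = reference[n - n_taps:n][::-1]
                e[n] = primary[n] - w @ x
                w += 2.0 * mu * e[n] * x
            return e

        t = np.arange(8000) / 8000.0
        tone = np.sin(2 * np.pi * 120.0 * t)              # tone to be canceled
        primary = tone + 0.1 * np.random.randn(len(t))    # tone plus white noise
        residual = lms_cancel(tone, primary)              # tone removed, noise left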

  15. TLEM 2.0 - a comprehensive musculoskeletal geometry dataset for subject-specific modeling of lower extremity.

    Science.gov (United States)

    Carbone, V; Fluit, R; Pellikaan, P; van der Krogt, M M; Janssen, D; Damsgaard, M; Vigneron, L; Feilkas, T; Koopman, H F J M; Verdonschot, N

    2015-03-18

    When analyzing complex biomechanical problems such as predicting the effects of orthopedic surgery, subject-specific musculoskeletal models are essential to achieve reliable predictions. The aim of this paper is to present the Twente Lower Extremity Model 2.0, a new comprehensive dataset of the musculoskeletal geometry of the lower extremity, which is based on medical imaging data and dissection performed on the right lower extremity of a fresh male cadaver. Bone, muscle and subcutaneous fat (including skin) volumes were segmented from computed tomography and magnetic resonance imaging scans. Inertial parameters were estimated from the image-based segmented volumes. A complete cadaver dissection was performed, in which bony landmarks, attachment sites and lines-of-action of 55 muscle actuators and 12 ligaments, bony wrapping surfaces, and joint geometry were measured. The obtained musculoskeletal geometry dataset was finally implemented in the AnyBody Modeling System (AnyBody Technology A/S, Aalborg, Denmark), resulting in a model consisting of 12 segments, 11 joints and 21 degrees of freedom, and including 166 muscle-tendon elements for each leg. The new TLEM 2.0 dataset was purposely built to be easily combined with novel image-based scaling techniques, such as bone surface morphing, muscle volume registration and muscle-tendon path identification, in order to obtain subject-specific musculoskeletal models in a quick and accurate way. The complete dataset, including CT and MRI scans and segmented volumes and surfaces, is made available at http://www.utwente.nl/ctw/bw/research/projects/TLEMsafe for the biomechanical community, in order to accelerate the development and adoption of subject-specific models on a large scale. TLEM 2.0 is freely shared for non-commercial use only, under acceptance of the TLEMsafe Research License Agreement. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Molecular Dynamics Computer Simulations of Multidrug RND Efflux Pumps

    OpenAIRE

    Ruggerone, Paolo; Vargiu, Attilio V.; Collu, Francesca; Fischer, Nadine; Kandt, Christian

    2013-01-01

    Over-expression of multidrug efflux pumps of the Resistance Nodulation Division (RND) protein super family counts among the main causes for microbial resistance against pharmaceuticals. Understanding the molecular basis of this process is one of the major challenges of modern biomedical research, involving a broad range of experimental and computational techniques. Here we review the current state of RND transporter investigation employing molecular dynamics simulations providing conformation...

  17. Carburizer particle dissolution in liquid cast iron – computer simulation

    Directory of Open Access Journals (Sweden)

    D. Bartocha

    2010-01-01

    The dissolution of carburizing material (anthracite, petroleum coke and graphite) particles in liquid metal and its computer simulation are presented in this paper. The relative movement rate of the particle and the liquid metal and the thermophysical properties of the carburizing materials (thermal conductivity coefficient, specific heat, thermal diffusivity, density) are taken into consideration in the calculations. The calculations have been carried out in the context of metal bath carburization in metallurgical furnaces.

  18. Computer simulation of carburizers particles heating in liquid metal

    Directory of Open Access Journals (Sweden)

    K. Janerka

    2010-01-01

    This article introduces the computer simulation of the heating of carburizer particles (anthracite, graphite and petroleum coke) present in liquid metal. The diameter of the particles, their quantity, the relative velocity of the particles and the liquid metal, and the thermophysical properties of the materials (thermal conductivity, specific heat and thermal diffusivity) have been taken into account in the calculations. The analysis has been carried out in the context of liquid metal carburization in metallurgical furnaces.

  19. A computer-simulated Stern-Gerlach laboratory

    CERN Document Server

    Schroeder, Daniel V

    2015-01-01

    We describe an interactive computer program that simulates Stern-Gerlach measurements on spin-1/2 and spin-1 particles. The user can design and run experiments involving successive spin measurements, illustrating incompatible observables, interference, and time evolution. The program can be used by students at a variety of levels, from non-science majors in a general interest course to physics majors in an upper-level quantum mechanics course. We give suggested homework exercises using the program at various levels.

  20. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  1. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from presurgical simulations of 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation for loading into the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of the original size) after image processing by ParaView, and increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Magnification, translation, rotation, and selection of translucency (10 levels) for each element were smoothly and easily controlled using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average by 12 medical students and 6 neurosurgical residents. Using an iPad to handle the results of presurgical simulation was extremely useful because it could easily be handled anywhere.

  2. Computer Simulation of Intergranular Stress Corrosion Cracking via Hydrogen Embrittlement

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.W.

    2000-04-01

    Computer simulation has been applied to the investigation of intergranular stress corrosion cracking in Ni-based alloys based on a hydrogen embrittlement mechanism. The simulation employs computational modules that address (a) transport and reactions of aqueous species giving rise to hydrogen generation at the liquid-metal interface, (b) solid state transport of hydrogen via intergranular and transgranular diffusion pathways, and (c) fracture due to the embrittlement of metallic bonds by hydrogen. A key focus of the computational model development has been the role of materials microstructure (precipitate particles and grain boundaries) in hydrogen transport and embrittlement. Simulation results reveal that intergranular fracture is enhanced as grain boundaries are weakened and that microstructures with grains elongated perpendicular to the stress axis are more susceptible to cracking. The presence of intergranular precipitates may be expected to either enhance or impede cracking depending on the relative distribution of hydrogen between the grain boundaries and the precipitate-matrix interfaces. Calculations of hydrogen outgassing and ingassing demonstrate a strong effect of the charging method on the fracture behavior.

  3. Computer modeling of road bridge for simulation moving load

    Directory of Open Access Journals (Sweden)

    Miličić Ilija M.

    2016-01-01

    This paper presents the computational modelling of a single-span road truss bridge with the roadway on the upper chord. The calculation models were treated as planar and spatial girders made up of 1D finite elements, using the CAA applications Tower and Bridge Designer 2016 (2nd Edition). The conducted computer simulations yield, for each model, a comparison of the effects of a moving load according to the recommendations of the two standards SRPS and AASHTO. The bridge structure modelled in Bridge Designer 2016 (2nd Edition) is identical to the one modelled in the Tower environment. As important information for the selection of a computer application, we point out that Bridge Designer 2016 (2nd Edition) is unable to treat the moving-load effects of the national standard V600.

  4. COMPUTER SIMULATION THE MECHANICAL MOVEMENT BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    This paper considers a technique for using the computer mathematics system MathCAD to implement a mathematical model of the mechanical motion of a body thrown at an angle to the horizon, and its use in educational computer simulation experiments when teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models in the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows comprehensive analysis of the mechanical motion of the body as the input parameters of the model are varied: the acceleration of gravity, the initial and final position of the body, the initial velocity and angle, and the geometric dimensions of the body and target. The technique is aimed at effective assimilation of basic knowledge and skills in the fundamentals of mathematical modeling; it helps students master the basic theoretical principles of mathematical modeling and related disciplines, promotes the development of logical thinking, motivates students to learn the discipline, improves cognitive interest, and develops research skills, thereby creating conditions for the effective formation of the professional competence of future specialists.
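
    Rendered in Python rather than MathCAD, a minimal version of the projectile model described above might look as follows; the numeric values of g, the initial speed, and the launch angle stand in for the model's adjustable input parameters.

        import numpy as np

        # Body thrown at an angle to the horizon (no drag), the textbook model
        # behind the simulation described above; parameter values are placeholders.
        g, v0, angle = 9.81, 20.0, np.radians(45.0)

        t_flight = 2.0 * v0 * np.sin(angle) / g
        t = np.linspace(0.0, t_flight, 100)
        x = v0 * np.cos(angle) * t
        y = v0 * np.sin(angle) * t - 0.5 * g * t**2

        print(f"range = {x[-1]:.1f} m, flight time = {t_flight:.2f} s")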

  5. An FPGA computing demo core for space charge simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. The temperature and power consumption of the FPGA were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
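
    The look-up-table trick for the inverse square-root cube can be emulated in software; the sketch below approximates f(s) = s**-1.5 (with s = r*r, giving the 1/r**3 Coulomb factor) from a table addressed by the leading bits of s. The table size, addressing scheme, and range are illustrative assumptions, not the paper's exact fixed-point design.

        import numpy as np

        LUT_BITS = 10                      # ~10 address bits, as in the abstract
        S_MAX = 4.0                        # assumed range of s = r*r

        # Tabulate s**-1.5 at bin centers.
        _centers = (np.arange(1 << LUT_BITS) + 0.5) * S_MAX / (1 << LUT_BITS)
        LUT = _centers ** -1.5

        def inv_sqrt_cube(s):
            # Address the table with the most significant bits of s.
            idx = np.minimum((s / S_MAX * (1 << LUT_BITS)).astype(int),
                             (1 << LUT_BITS) - 1)
            return LUT[idx]

        s = np.array([0.5, 1.0, 2.0])
        print(inv_sqrt_cube(s))   # table approximation
        print(s ** -1.5)          # exact values for comparison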

  6. Sensitivity of subject-specific models to errors in musculo-skeletal geometry

    NARCIS (Netherlands)

    Carbone, Vincenzo; van der Krogt, Marjolein; Koopman, Hubertus F.J.M.; Verdonschot, Nicolaas Jacobus Joseph

    2012-01-01

    Subject-specific musculo-skeletal models of the lower extremity are an important tool for investigating various biomechanical problems, for instance the results of surgery such as joint replacements and tendon transfers. The aim of this study was to assess the potential effects of errors in

  7. A subject-specific musculoskeletal modeling framework to predict in vivo mechanics of total knee arthroplasty

    NARCIS (Netherlands)

    Marra, M.A.; Vanheule, V.; Fluit, R.; Koopman, B.H.; Rasmussen, J.; Verdonschot, N.J.; Andersen, M.S.

    2015-01-01

    Musculoskeletal (MS) models should be able to integrate patient-specific MS architecture and undergo thorough validation prior to their introduction into clinical practice. We present a methodology to develop subject-specific models able to simultaneously predict muscle, ligament, and knee joint

  8. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for university courses of different levels as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  9. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  10. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm; Tryggvason, T.

    1998-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...... simulation program requires a detailed description of the energy flow in the air movement which can be obtained by a CFD program. The paper describes an energy consumption calculation in a large building, where the building energy simulation program is modified by CFD predictions of the flow between three...... zones connected by open areas with pressure and buoyancy driven air flow. The two programs are interconnected in an iterative procedure. The paper shows also an evaluation of the air quality in the main area of the buildings based on CFD predictions. It is shown that an interconnection between a CFD...

  11. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2017-01-01

    This textbook presents basic numerical methods and applies them to a large variety of physical models in multiple computer experiments. Classical algorithms and more recent methods are explained. Partial differential equations are treated generally comparing important methods, and equations of motion are solved by a large number of simple as well as more sophisticated methods. Several modern algorithms for quantum wavepacket motion are compared. The first part of the book discusses the basic numerical methods, while the second part simulates classical and quantum systems. Simple but non-trivial examples from a broad range of physical topics offer readers insights into the numerical treatment but also the simulated problems. Rotational motion is studied in detail, as are simple quantum systems. A two-level system in an external field demonstrates elementary principles from quantum optics and simulation of a quantum bit. Principles of molecular dynamics are shown. Modern boundary element methods are presented ...

  12. Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers

    Science.gov (United States)

    Morgan, Philip E.

    2004-01-01

    This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications." The discussion of Scalable High Performance Computing reports on three objectives: validate, assess the scalability of, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhancement of an electromagnetics code (CHARGE) to effectively model antenna problems; utilization of lessons learned in the high-order/spectral solution of swirling 3D jets in solving the electromagnetics project; transition of a high-order fluids code, FDL3DI, to solve Maxwell's equations using compact differencing; development and demonstration of improved radiation-absorbing boundary conditions for high-order CEM; and extension of the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.

  13. Subject-specific knee joint geometry improves predictions of medial tibiofemoral contact forces

    Science.gov (United States)

    Gerus, Pauline; Sartori, Massimo; Besier, Thor F.; Fregly, Benjamin J.; Delp, Scott L.; Banks, Scott A.; Pandy, Marcus G.; D’Lima, Darryl D.; Lloyd, David G.

    2013-01-01

    Estimating tibiofemoral joint contact forces is important for understanding the initiation and progression of knee osteoarthritis. However, tibiofemoral contact force predictions are influenced by many factors including muscle forces and anatomical representations of the knee joint. This study aimed to investigate the influence of subject-specific geometry and knee joint kinematics on the prediction of tibiofemoral contact forces using a calibrated EMG-driven neuromusculoskeletal model of the knee. One participant fitted with an instrumented total knee replacement walked at a self-selected speed while medial and lateral tibiofemoral contact forces, ground reaction forces, whole-body kinematics, and lower-limb muscle activity were simultaneously measured. The combination of generic and subject-specific knee joint geometry and kinematics resulted in four different OpenSim models used to estimate muscle-tendon lengths and moment arms. The subject-specific geometric model was created from CT scans, and the subject-specific knee joint kinematics representing the translation of the tibia relative to the femur were obtained from fluoroscopy. The EMG-driven model was calibrated using one walking trial, but with three different cost functions that tracked the knee flexion/extension moments with and without constraint over the estimated joint contact forces. The calibrated models then predicted the medial and lateral tibiofemoral contact forces for five other walking trials. The use of subject-specific models with minimization of the peak tibiofemoral contact forces improved the accuracy of medial contact forces by 47% and of lateral contact forces by 7%, compared with the generic musculoskeletal model.

  14. Automatized spleen segmentation in non-contrast-enhanced MR volume data using subject-specific shape priors

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Bülow, Robin; Völzke, Henry

    2017-07-01

    To develop the first fully automated 3D spleen segmentation framework derived from T1-weighted magnetic resonance (MR) imaging data and to verify its performance for spleen delineation and volumetry. This approach addresses the issue of low contrast between the spleen and adjacent tissue in non-contrast-enhanced MR images. Native T1-weighted MR volume data were acquired on a 1.5 T MR system in an epidemiological study. We analyzed random subsamples of MR examinations without pathologies to develop and verify the spleen segmentation framework. The framework is modularized to include different kinds of prior knowledge in the segmentation pipeline. Classification by support vector machines differentiates between five different shape types in computed foreground probability maps and recognizes characteristic spleen regions in axial slices of MR volume data. A spleen-shape space generated by training produces subject-specific prior shape knowledge that is then incorporated into a final 3D level set segmentation method. Individually adapted shape-driven forces, as well as image-driven forces resulting from refined foreground probability maps, successfully steer the level set to segment the spleen. The framework achieves promising segmentation results with mean Dice coefficients of nearly 0.91 and low volumetric mean errors of 6.3%. The presented spleen segmentation approach can delineate spleen tissue in native MR volume data. Several kinds of prior shape knowledge, including subject-specific 3D prior shape knowledge, can be used to guide segmentation processes, achieving promising results.

  15. Incorporation of Inter-Subject Information to Improve the Accuracy of Subject-Specific P300 Classifiers.

    Science.gov (United States)

    Xu, Minpeng; Liu, Jing; Chen, Long; Qi, Hongzhi; He, Feng; Zhou, Peng; Wan, Baikun; Ming, Dong

    2016-05-01

    Although inter-subject information has been demonstrated to be effective for rapid calibration of the P300-based brain-computer interface (BCI), it has never been comprehensively tested to determine whether the incorporation of heterogeneous data could enhance accuracy. This study aims to improve the subject-specific P300 classifier by adding data from other subjects. A classifier calibration strategy, weighted ensemble learning generic information (WELGI), was developed, in which elementary classifiers were constructed using both intra- and inter-subject information and then integrated into a strong classifier with a weight assessment. 55 subjects were recruited to spell 20 characters offline using the conventional P300-based BCI, i.e. the P300-speller. Four different metrics, the P300 accuracy and precision, the round accuracy, and the character accuracy, were evaluated for a comprehensive investigation. The results revealed that the classifier constructed on the training dataset in combination with data from other subjects was significantly superior to that without the inter-subject information. Therefore, WELGI is an effective classifier calibration strategy that uses inter-subject information to improve the accuracy of subject-specific P300 classifiers, and could also be applied to other BCI paradigms.
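
    A hedged sketch of such a weighted ensemble follows; the LDA base classifier, the accuracy-based weights, and the synthetic data are stand-ins chosen for illustration, since the abstract does not specify the elementary classifiers or the exact weight assessment.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def ensemble_with_intersubject(own, others, val_X, val_y):
            # Train one elementary classifier per data source (the subject's own
            # data plus each other subject's data), then weight each by accuracy
            # on the subject's validation data, in the spirit of WELGI above.
            members, weights = [], []
            for X, y in [own] + others:
                clf = LinearDiscriminantAnalysis().fit(X, y)
                members.append(clf)
                weights.append(clf.score(val_X, val_y))
            weights = np.asarray(weights) / np.sum(weights)
            def predict(X):
                score = sum(w * clf.predict_proba(X)[:, 1]
                            for w, clf in zip(weights, members))
                return (score > 0.5).astype(int)
            return predict

        rng = np.random.default_rng(0)
        make = lambda: (rng.standard_normal((60, 8)), rng.integers(0, 2, 60))
        own, others = make(), [make() for _ in range(3)]
        predict = ensemble_with_intersubject(own, others, own[0][:20], own[1][:20])
        print(predict(own[0][:5]))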

  16. The Accuracy of 3D Optical Reconstruction and Additive Manufacturing Processes in Reproducing Detailed Subject-Specific Anatomy

    Directory of Open Access Journals (Sweden)

    Paolo Ferraiuoli

    2017-10-01

    3D reconstruction and 3D printing of subject-specific anatomy are promising technologies for supporting clinicians in the visualisation of disease progression and planning for surgical intervention. In this context, the 3D model is typically obtained from segmentation of magnetic resonance imaging (MRI), computed tomography (CT) or echocardiography images. Although these modalities allow imaging of the tissues in vivo, assessment of the quality of the reconstruction is limited by the lack of a reference geometry, as the subject-specific anatomy is unknown prior to image acquisition. In this work, an optical method based on 3D digital image correlation (3D-DIC) techniques is used to reconstruct the shape of the surface of an ex vivo porcine heart. This technique requires two digital charge-coupled device (CCD) cameras to provide full-field shape measurements and to generate a standard tessellation language (STL) file of the sample surface. The aim of this work was to quantify the error of 3D-DIC shape measurements through the additive manufacturing process. The limitations of 3D-printed object resolution and the discrepancy between the reconstructed surface of the cardiac soft tissue and a 3D-printed model of the same surface were evaluated. The results obtained demonstrated the ability of the 3D-DIC technique to reconstruct localised and detailed features on the cardiac surface with sub-millimeter accuracy.

  17. Explicit contact modeling for surgical computer guidance and simulation

    Science.gov (United States)

    Johnsen, S. F.; Taylor, Z. A.; Clarkson, M.; Thompson, S.; Hu, M.; Gurusamy, K.; Davidson, B.; Hawkes, D. J.; Ourselin, S.

    2012-02-01

    Realistic modelling of mechanical interactions between tissues is an important part of surgical simulation, and may become a valuable asset in surgical computer guidance. Unfortunately, it is also computationally very demanding. Explicit matrix-free FEM solvers have been shown to be a good choice for fast tissue simulation; however, little work has been done on contact algorithms for such FEM solvers. This work introduces such an algorithm that is capable of handling both deformable-deformable (soft tissue interacting with soft tissue) and deformable-rigid (e.g. soft tissue interacting with surgical instruments) contacts. The proposed algorithm employs responses computed with a fully matrix-free, virtual node-based version of the model first used by Taylor and Flanagan in PRONTO3D. For contact detection, a bounding-volume hierarchy (BVH) capable of identifying self-collisions is introduced. The proposed BVH generation and update strategies comprise novel heuristics to minimise the number of bounding volumes visited in hierarchy update and collision detection. Aside from speed, stability was a major objective in the development of the algorithm; hence a novel method for computation of response forces from C0-continuous normals, and a gradual application of response forces from rate constraints, has been devised and incorporated in the scheme. The continuity of the surface normals has advantages particularly in applications such as sliding over irregular surfaces, which occurs, e.g., in simulated breathing. The effectiveness of the scheme is demonstrated on a number of meshes derived from medical image data and artificial test cases.
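
    The broad-phase test that a BVH accelerates reduces to axis-aligned bounding-box overlap checks; the Python sketch below shows that primitive together with a deliberately quadratic broad phase, as a hedged illustration rather than the paper's actual hierarchy update heuristics.

        import numpy as np

        class AABB:
            # Axis-aligned bounding box; in a BVH the leaves wrap individual
            # surface triangles and parent nodes wrap their children.
            def __init__(self, lo, hi):
                self.lo, self.hi = np.asarray(lo, float), np.asarray(hi, float)
            def overlaps(self, other):
                return bool(np.all(self.lo <= other.hi) and
                            np.all(other.lo <= self.hi))

        def broad_phase(boxes_a, boxes_b):
            # Quadratic for clarity; a real BVH prunes pairs by descending the
            # hierarchy only where parent boxes overlap.
            return [(i, j) for i, a in enumerate(boxes_a)
                           for j, b in enumerate(boxes_b) if a.overlaps(b)]

        a = [AABB([0, 0, 0], [1, 1, 1])]
        b = [AABB([0.5, 0.5, 0.5], [2, 2, 2]), AABB([3, 3, 3], [4, 4, 4])]
        print(broad_phase(a, b))   # -> [(0, 0)]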

  18. Modelling and simulation of information systems on computer: methodological advantages.

    Science.gov (United States)

    Huet, B; Martin, J

    1980-01-01

    Modelling and simulation of information systems by means of miniatures on computer aim at two general objectives: (a) as an aid to the design and realization of information systems; and (b) as a tool to improve the dialogue between the designer and the users. An operational information system has two components bound by a dynamic relationship: an information system and a behavioural system. Thanks to the behavioural system, modelling and simulation allow the designer to integrate into the project a large proportion of the system's implicit specification. The advantages of modelling for the information system relate to: (a) The conceptual phase: initial objectives are compared with the results of simulation and sometimes modified. (b) The external specifications: simulation is particularly useful for personalising man-machine relationships in each application. (c) The internal specifications: if the miniatures are built on the concept of process, the global design and the software are tested, and the simulation also refines the configuration and directs the choice of hardware. (d) The implementation: simulation reduces costs and time, and allows testing. Progress in modelling techniques will undoubtedly lead to better information systems.

  19. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    Science.gov (United States)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture-mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing their functions, algorithms, HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.
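
    First-order dead reckoning of the kind mentioned above extrapolates the last received position with the last received velocity between attribute updates; the sketch below is an assumed minimal form, not the federate's actual implementation.

        import numpy as np

        class DeadReckoner:
            # Hold the last reported kinematic state and extrapolate linearly.
            def __init__(self):
                self.t0, self.p0, self.v0 = 0.0, np.zeros(3), np.zeros(3)
            def update(self, t, position, velocity):
                self.t0 = t
                self.p0 = np.asarray(position, float)
                self.v0 = np.asarray(velocity, float)
            def estimate(self, t):
                return self.p0 + self.v0 * (t - self.t0)

        dr = DeadReckoner()
        dr.update(t=10.0, position=[1.0, 2.0, 0.0], velocity=[0.5, 0.0, 0.0])
        print(dr.estimate(t=12.0))   # -> [2. 2. 0.]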

  20. Numerical simulation of NQR/NMR: Applications in quantum computing.

    Science.gov (United States)

    Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C

    2011-04-01

    A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, useful in a wide range of experimental situations, from NQR (at zero or under small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of using elliptically polarized radiofrequency and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with proposals for experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php. Copyright © 2011 Elsevier Inc. All rights reserved.
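
    The core of any such simulator is unitary propagation of a spin density matrix under the relevant Hamiltonian; the Python sketch below evolves a single spin-1/2 under a toy Hamiltonian using a matrix exponential. The spin system, frequencies, and time step are assumptions for illustration, not the program's NQR Hamiltonians.

        import numpy as np
        from scipy.linalg import expm

        sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2.0
        sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2.0

        # Toy Hamiltonian: Zeeman term plus a transverse (RF-like) term.
        omega0, omega1 = 2 * np.pi * 1.0e6, 2 * np.pi * 2.5e4   # rad/s
        H = omega0 * sz + omega1 * sx

        rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |up><up|
        dt, steps = 1e-8, 1000
        U = expm(-1j * H * dt)                            # one-step propagator
        for _ in range(steps):
            rho = U @ rho @ U.conj().T
        print(np.trace(rho @ sz).real)                    # <Sz> after evolution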

  1. Computer simulation of methanol exchange dynamics around cations and anions

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Santanu; Dang, Liem X.

    2016-03-03

    In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties than water, so its ion solvation process is different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing the transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and the Grote-Hynes theory. We observed that methanol exchange occurs at a nanosecond time scale for Na+ and at a picosecond time scale for other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
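
    A residence-time analysis in the spirit of the Impey-Madden-McDonald method named above can be sketched as follows; the trajectory distances, cutoff radius, and time step are assumed inputs, and real analyses additionally tolerate brief excursions outside the shell.

        import numpy as np

        def mean_residence_time(dist, r_cut, dt):
            # dist: (n_frames, n_solvent) ion-solvent distances along a
            # trajectory. Count the lengths of contiguous runs inside the
            # first-shell cutoff and average them.
            inside = dist < r_cut
            times = []
            for j in range(inside.shape[1]):
                run = 0
                for flag in inside[:, j]:
                    if flag:
                        run += 1
                    elif run:
                        times.append(run * dt)
                        run = 0
                if run:
                    times.append(run * dt)
            return np.mean(times) if times else 0.0

        dist = np.random.rand(1000, 20) * 6.0       # toy distances
        print(mean_residence_time(dist, r_cut=3.5, dt=0.001))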

  2. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

    2004-04-28

    This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) creation of detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

  3. Computer simulation of rapid crystal growth under microgravity

    Science.gov (United States)

    Hisada, Yasuhiro; Saito, Osami; Mitachi, Koshi; Nishinaga, Tatau

    We are planning to grow a Ge single crystal under microgravity by the TR-IA rocket in 1992. The furnace temperature should be controlled so as to finish the crystal growth in a quite short time interval (about 6 min). This study deals with the computer simulation of rapid crystal growth in space to find the proper conditions for the experiment. The crystal growth process is influenced by various physical phenomena such as heat conduction, natural and Marangoni convection, phase change, and radiation from the furnace. In this study, a 2D simulation with axial symmetry is carried out, taking into account the radiation field with a specific temperature distribution of the furnace wall. The simulation program consists of four modules. The first module calculates the parabolic partial differential equations using the control volume method. The second evaluates the phase change implicitly by the enthalpy method. The third computes the heat flux from the surface by radiation. The last calculates, with the Monte Carlo method, the view factors that are necessary to obtain the heat flux.
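
    The enthalpy method named for the second module can be illustrated with a one-dimensional sketch; all material values below are toy numbers (not germanium properties), and the explicit update and fixed hot-wall boundary are assumptions for clarity.

        import numpy as np

        nx, dx, dt = 50, 1e-3, 1e-3
        k, rho_c, L, T_m = 60.0, 2.0e6, 4.6e8, 1210.0   # toy W/mK, J/m3K, J/m3, K

        H = np.full(nx, rho_c * 300.0)   # volumetric enthalpy, start at 300 K

        def temperature(H):
            # Invert the enthalpy-temperature relation: solid below T_m, liquid
            # above H_m + L, temperature pinned at T_m in between (mushy zone).
            H_m = rho_c * T_m
            T = H / rho_c
            T = np.where((H >= H_m) & (H <= H_m + L), T_m, T)
            return np.where(H > H_m + L, (H - L) / rho_c, T)

        for _ in range(2000):
            T = temperature(H)
            T[0] = 1400.0                # hot wall drives melting inward
            H[1:-1] += dt * k * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2

        print(temperature(H)[:10])       # temperature profile near the hot wall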

  4. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-04-25

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

  5. Incorporation of CT-based measurements of trunk anatomy into subject-specific musculoskeletal models of the spine influences vertebral loading predictions.

    Science.gov (United States)

    Bruno, Alexander G; Mokhtarzadeh, Hossein; Allaire, Brett T; Velie, Kelsey R; De Paolis Kaluza, M Clara; Anderson, Dennis E; Bouxsein, Mary L

    2017-10-01

    We created subject-specific musculoskeletal models of the thoracolumbar spine by incorporating spine curvature and muscle morphology measurements from computed tomography (CT) scans to determine the degree to which vertebral compressive and shear loading estimates are sensitive to variations in trunk anatomy. We measured spine curvature and trunk muscle morphology using spine CT scans of 125 men, and then created four different thoracolumbar spine models for each person: (i) height and weight adjusted (Ht/Wt models); (ii) height, weight, and spine curvature adjusted (+C models); (iii) height, weight, and muscle morphology adjusted (+M models); and (iv) height, weight, spine curvature, and muscle morphology adjusted (+CM models). We determined vertebral compressive and shear loading at three regions of the spine (T8, T12, and L3) for four different activities. Vertebral compressive loads predicted by the subject-specific CT-based musculoskeletal models ranged from 54% lower to 45% higher than those estimated using musculoskeletal models adjusted only for subject height and weight. The impact of subject-specific information on vertebral loading estimates varied with the activity and spinal region. Vertebral loading estimates were more sensitive to incorporation of subject-specific spinal curvature than subject-specific muscle morphology. Our results indicate that individual variations in spine curvature and trunk muscle morphology can have a major impact on estimated vertebral compressive and shear loads, and thus should be accounted for when estimating subject-specific vertebral loading. J Orthop Res 35:2164-2173, 2017. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  6. Computer simulation of confined and flexoelectric liquid crystalline systems

    CERN Document Server

    Barmes, F

    2003-01-01

    In this Thesis, systems of confined and flexoelectric liquid crystals have been studied using molecular computer simulations. The aim of this work was to provide a molecular model of a bistable display cell in which switching is induced through the application of directional electric field pulses. In the first part of this Thesis, the study of confined systems of liquid crystalline particles is addressed. Computation of the anchoring phase diagrams for three different surface interaction models showed that the hard needle wall and rod-surface potentials induce both planar and homeotropic alignment separated by a bistability region, this being stronger and wider for the rod-surface variant. The results obtained using the rod-sphere surface model, in contrast, showed that tilted surface arrangements can be induced by surface adsorption mechanisms. Equivalent studies of hybrid anchored systems showed that a bend director structure can be obtained in a slab with monostable homeotropic anchoring at the...

  7. Computational simulation of structural fracture in fiber composites

    Science.gov (United States)

    Chamis, C. C.; Murthy, P. L. N.

    1990-01-01

    A methodology was developed for the computational simulation of structural fracture in fiber composites. This methodology consists of step-by-step procedures for mixed mode fracture in generic components and of an integrated computer code, Composite Durability Structural Analysis (CODSTRAN). The generic types of composite structural fracture include single and combined mode fracture in beams, laminate free-edge delamination fracture, and laminate center flaw progressive fracture. Structural fracture is assessed in one or all of the following: (1) the displacements increase very rapidly; (2) the frequencies decrease very rapidly; (3) the buckling loads decrease very rapidly; or (4) the strain energy release rate increases very rapidly. These rapid changes are herein assumed to denote imminent structural fracture. Based on these rapid changes, parameters/guidelines are identified which can be used as criteria for structural fracture, inspection intervals, and retirement for cause.

  8. Subglacial sediment mechanics investigated by computer simulation of granular material

    DEFF Research Database (Denmark)

    Damsgaard, Anders; Egholm, David Lundbek; Tulaczyk, Slawek

    to the mechanical nonlinearity of the sediment, internal porosity changes during deformation, and associated structural and kinematic phase transitions. In this presentation, we introduce the Discrete Element Method (DEM) for particle-scale granular simulation. The DEM is fully coupled with fluid dynamics....... The numerical method is applied to better understand the mechanical properties of the subglacial sediment and its interaction with meltwater. The computational approach allows full experimental control and offers insights into the internal kinematics, stress distribution, and mechanical stability. During...... by linear-viscous sediment movement. We demonstrate how channel flanks are stabilized by the sediment frictional strength. Additionally, sediment liquefaction proves to be a possible mechanism for causing large and episodic sediment transport by water flow. Though computationally intense, our coupled...

  9. Computational strategies in the dynamic simulation of constrained flexible MBS

    Science.gov (United States)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    This research focuses on the computational dynamics of flexible constrained multibody systems. First, a recursive mapping formulation of the kinematical expressions in a minimum dimension, as well as the matrix representation of the equations of motion, is presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time-variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems making use of time-variant boundary conditions. The above methodologies and computational procedures developed are being implemented in a program called DYAMUS.

  10. Approximate Bayesian computation methods for daily spatiotemporal precipitation occurrence simulation

    Science.gov (United States)

    Olson, Branden; Kleiber, William

    2017-04-01

    Stochastic precipitation generators (SPGs) produce synthetic precipitation data and are frequently used to generate inputs for physical models throughout many scientific disciplines. Especially for large data sets, statistical parameter estimation is difficult due to the high dimensionality of the likelihood function. We propose techniques to estimate SPG parameters for spatiotemporal precipitation occurrence based on an emerging set of methods called Approximate Bayesian computation (ABC), which bypass the evaluation of a likelihood function. Our statistical model employs a thresholded Gaussian process that reduces to a probit regression at single sites. We identify appropriate ABC penalization metrics for our model parameters to produce simulations whose statistical characteristics closely resemble those of the observations. Spell length metrics are appropriate for single sites, while a variogram-based metric is proposed for spatial simulations. We present numerical case studies at sites in Colorado and Iowa where the estimated statistical model adequately reproduces local and domain statistics.
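
    As an illustration of the ABC rejection idea described in this record, the following minimal single-site sketch simulates a thresholded-Gaussian occurrence model under prior draws and accepts parameters whose wet-fraction and spell-length summaries match synthetic "observations". The parameter values, tolerance, and helper names are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_occurrence(threshold, n_days=3650):
    """A day is 'wet' when a latent standard normal exceeds the threshold."""
    return (rng.standard_normal(n_days) > threshold).astype(int)

def spell_lengths(occ, wet=True):
    """Lengths of consecutive wet (or dry) runs in the daily record."""
    target = 1 if wet else 0
    lengths, run = [], 0
    for day in occ:
        if day == target:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return np.array(lengths if lengths else [0])

def summary(occ):
    return np.array([occ.mean(),
                     spell_lengths(occ, wet=True).mean(),
                     spell_lengths(occ, wet=False).mean()])

s_obs = summary(simulate_occurrence(0.8))   # pretend these are observations

accepted = []
for _ in range(5000):
    theta = rng.uniform(0.0, 2.0)           # draw from a flat prior
    s_sim = summary(simulate_occurrence(theta))
    if np.linalg.norm((s_sim - s_obs) / s_obs) < 0.05:   # ABC tolerance
        accepted.append(theta)

print(f"{len(accepted)} draws accepted; posterior mean threshold "
      f"~ {np.mean(accepted):.3f} (true value 0.8)")
```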

  11. Computer simulation of aqueous Na-Cl electrolytes

    Energy Technology Data Exchange (ETDEWEB)

    Hummer, G. [Los Alamos National Lab., NM (United States); Soumpasis, D.M. [Max-Planck-Institut fuer Biophysikalische Chemie (Karl-Friedrich-Bonhoeffer-Institut), Goettingen (Germany); Neumann, M. [Vienna Univ. (Austria). Inst. fuer Experimentalphysik

    1993-11-01

    Equilibrium structure of aqueous Na-Cl electrolytes between 1 and 5 mol/l is studied by means of molecular dynamics computer simulation using interaction site descriptions of water and ionic components. Electrostatic interactions are treated both with the newly developed charged-clouds scheme and with Ewald summation. In the case of a 5 mol/l electrolyte, the results for pair correlations obtained by the two methods are in excellent agreement. However, the charged-clouds technique is much faster than Ewald summation and makes simulations at lower salt concentrations feasible. It is found that both ion-water and ion-ion correlation functions depend only weakly on the ionic concentration. Sodium and chloride ions exhibit only a negligible tendency to form contact pairs. In particular, no chloride ion pairs in contact are observed.

  12. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  13. Simulation of computed radiography with imaging plate detectors

    Science.gov (United States)

    Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.

    2014-02-01

    Computed radiography (CR) using phosphor imaging plate detectors is taking an increasing place in radiographic testing. CR uses similar equipment to conventional radiography, except that the classical X-ray film is replaced by a numerical detector, called an image plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, decrease of source energies and thus reduction of the radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with image plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross-comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular with wire Image Quality Indicators (IQI) and duplex IQI.

  14. Insights from molecular dynamics simulations for computational protein design.

    Science.gov (United States)

    Childers, Matthew Carter; Daggett, Valerie

    2017-02-01

    A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard, simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations were used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures.

  15. Insights from molecular dynamics simulations for computational protein design

    Science.gov (United States)

    Childers, Matthew Carter; Daggett, Valerie

    2017-01-01

    A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard, simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations were used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures. PMID:28239489

  16. Nonlinear simulations with and computational issues for NIMROD

    Energy Technology Data Exchange (ETDEWEB)

    Sovinec, C.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  17. Time-partitioning simulation models for calculation on parallel computers

    Science.gov (United States)

    Milner, Edward J.; Blech, Richard A.; Chima, Rodrick V.

    1987-01-01

    A technique allowing time-staggered solution of partial differential equations is presented in this report. Using this technique, called time-partitioning, simulation execution speedup is proportional to the number of processors used because all processors operate simultaneously, with each updating of the solution grid at a different time point. The technique is limited by neither the number of processors available nor by the dimension of the solution grid. Time-partitioning was used to obtain the flow pattern through a cascade of airfoils, modeled by the Euler partial differential equations. An execution speedup factor of 1.77 was achieved using a two processor Cray X-MP/24 computer.
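
    A toy sketch of the time-partitioning idea (a sequential emulation, not parallel code): P workers advance an explicit 1D heat equation, with worker w owning time levels w+1, w+1+P, w+1+2P, and so on. On a real machine the staggered sweeps overlap, which is where the speedup proportional to the processor count comes from; the grid size and parameters here are illustrative.

```python
import numpy as np

nx, nt, P = 50, 9, 3      # grid points, time steps, "processors"
alpha = 0.4               # diffusion number, stable for the explicit scheme
u = np.zeros((nt + 1, nx))
u[0, nx // 2] = 1.0       # initial condition: a heat spike

def step(v):
    """One explicit finite-difference update of the 1D heat equation."""
    w = v.copy()
    w[1:-1] = v[1:-1] + alpha * (v[2:] - 2.0 * v[1:-1] + v[:-2])
    return w

for n in range(nt):
    worker = n % P        # owner of time level n + 1 in the pipeline
    u[n + 1] = step(u[n]) # numerically identical to a serial solve
    print(f"worker {worker} produced time level {n + 1}")

print("final peak temperature:", round(float(u[-1].max()), 4))
```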

  19. Dilbert-Peter model of organization effectiveness: computer simulations

    CERN Document Server

    Sobkowicz, Pawel

    2010-01-01

    We provide a technical report on a computer simulation of the overall effectiveness of a hierarchical organization, depending on two main aspects: the effects of promotion to managerial levels, and individual employees' efforts at self-promotion, which reduce their actual productivity. The combination of judgment by appearance in promotion to higher levels of the hierarchy and the Peter Principle (which states that people are promoted to their level of incompetence) results in fast declines in the effectiveness of the organization. The model uses a few synthetic parameters aimed at reproducing realistic conditions in typical multilayer organizations.
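
    A minimal agent-based sketch of the promotion dynamics described above, assuming a simple two-trait agent (competence, self-promotion) and promotion by appearance; the level weighting and productivity penalty are invented for illustration and are not the paper's parameters.

```python
import random

random.seed(1)
LEVELS, WIDTH = 4, 16                    # hierarchy depth, employees per level

def new_agent():
    return {"comp": random.random(),     # actual competence
            "show": random.random()}     # effort spent on self-promotion

org = [[new_agent() for _ in range(WIDTH)] for _ in range(LEVELS)]

def productivity(agent):
    # self-promotion effort eats into real output
    return agent["comp"] * (1.0 - 0.5 * agent["show"])

def effectiveness(org):
    # higher levels carry more weight in overall output
    return sum((lvl + 1) * productivity(a)
               for lvl, layer in enumerate(org) for a in layer)

for step in range(1, 101):
    lvl = random.randrange(LEVELS - 1)
    # judgment by appearance: the most visible employee is promoted ...
    idx = max(range(WIDTH), key=lambda i: org[lvl][i]["show"])
    promoted = org[lvl][idx]
    # ... and, per the Peter Principle, competence does not carry over
    promoted["comp"] = random.random()
    org[lvl + 1][random.randrange(WIDTH)] = promoted
    org[lvl][idx] = new_agent()          # the vacancy is refilled from outside
    if step % 20 == 0:
        print(f"step {step:3d}: effectiveness = {effectiveness(org):7.2f}")
```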

  20. Computer simulations for biological aging and sexual reproduction

    Directory of Open Access Journals (Sweden)

    STAUFFER DIETRICH

    2001-01-01

    The sexual version of the Penna model of biological aging, simulated since 1996, is compared here with alternative forms of reproduction as well as with models not involving aging. In particular we want to check how sexual forms of life could have evolved and won over earlier asexual forms hundreds of millions of years ago. This computer model is based on the mutation-accumulation theory of aging, using bit-strings to represent the genome. Its population dynamics is studied by Monte Carlo methods.
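
    To make the bit-string mechanics concrete, here is a minimal asexual variant of the Penna model (the record studies the sexual version, which adds paired genomes and recombination); the death threshold, reproduction age, mutation rate, and Verhulst limit below are illustrative choices, not the paper's.

```python
import random

random.seed(2)
BITS, T, R, M, NMAX = 32, 3, 8, 1, 2000   # genome length, death threshold,
                                          # reproduction age, mutations/birth,
                                          # Verhulst carrying capacity
population = [{"age": 0, "genome": 0} for _ in range(500)]

def active_mutations(ind):
    mask = (1 << (ind["age"] + 1)) - 1    # bits expressed up to the current age
    return bin(ind["genome"] & mask).count("1")

for year in range(101):
    size = len(population)
    survivors, births = [], []
    for ind in population:
        ind["age"] += 1
        if random.random() < size / NMAX:
            continue                      # Verhulst (crowding) death
        if ind["age"] >= BITS or active_mutations(ind) >= T:
            continue                      # death by old age or mutation load
        survivors.append(ind)
        if ind["age"] >= R:               # asexual reproduction with mutation
            genome = ind["genome"]
            for _ in range(M):
                genome |= 1 << random.randrange(BITS)
            births.append({"age": 0, "genome": genome})
    population = survivors + births
    if year % 20 == 0:
        print(f"year {year:3d}: population {len(population)}")
```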

  1. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Marko Hadjina

    2015-06-01

    This research proposes a shipbuilding production process design methodology based on computer simulation, intended to provide a better and more efficient tool for the design of complex shipbuilding production processes. The first part of the research reviews existing practice for production process design in shipbuilding and highlights its shortcomings and problems. Discrete event simulation modelling, as the basis of the suggested methodology, is then investigated and described with regard to its special characteristics, advantages, and reasons for application, particularly in shipbuilding production processes. Simulation modelling basics are described, together with the suggested production process design procedure. A case study demonstrates the application of the methodology to the design of a robotized profile fabrication production line. The selected design solution, obtained with the suggested methodology, was evaluated through comparison with a robotized profile cutting production line installed in a specific shipyard, and the simulation model was further enhanced with data obtained from real production. Finally, based on this research, its results, and the conclusions drawn, directions for further research are suggested.

  2. GENOA-PFA: Progressive Fracture in Composites Simulated Computationally

    Science.gov (United States)

    Murthy, Pappu L. N.

    2000-01-01

    GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.

  3. Image driven subject-specific finite element models of spinal biomechanics.

    Science.gov (United States)

    Zanjani-Pour, Sahand; Winlove, C Peter; Smith, Christopher W; Meakin, Judith R

    2016-04-11

    Finite element (FE) modelling is an established technique for investigating spinal biomechanics. Using image data to produce FE models with subject-specific geometry and displacement boundary conditions may help extend their use to the assessment of spinal loading in individuals. Lumbar spine magnetic resonance images from nine participants in the supine, standing and sitting postures were obtained and 2D poroelastic FE models of the lumbar spine were created from the supine data. The rigid body translation and rotation of the vertebral bodies as the participant moved to standing or sitting were applied to the model. The resulting pore pressure in the centre of the L4/L5 disc was determined and the sensitivity to the material properties and vertebral body displacements was assessed. Although the limitations of using a 2D model mean the predicted pore pressures are unlikely to be accurate, the results showed that subject-specific variation in geometry and motion during postural change leads to variation in pore pressure. The model was sensitive to the Young's modulus of the annulus matrix, the permeability of the nucleus, and the vertical translation of the vertebrae. This study demonstrates the feasibility of using image data to drive subject-specific lumbar spine FE models and indicates where further development is required to provide a method for assessing spinal biomechanics in a wide range of individuals. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Automatic tissue segmentation of neonate brain MR Images with subject-specific atlases

    Science.gov (United States)

    Cherel, Marie; Budin, Francois; Prastawa, Marcel; Gerig, Guido; Lee, Kevin; Buss, Claudia; Lyall, Amanda; Zaldarriaga Consing, Kirsten; Styner, Martin

    2015-03-01

    Automatic tissue segmentation of the neonate brain using Magnetic Resonance Images (MRI) is extremely important to study brain development and perform early diagnostics, but is challenging because of the high variability and inhomogeneity of image contrast caused by incomplete myelination of the white matter tracts. For these reasons, current methods often totally fail or give unsatisfying results. Furthermore, most of the subcortical midbrain structures are misclassified due to a lack of contrast in these regions. We have developed a novel method that creates a probabilistic subject-specific atlas based on a population atlas currently containing a number of manually segmented cases. The generated subject-specific atlas is sharp and adapted to the subject that is being processed. We then segment brain tissue classes using the newly created atlas with a single-atlas expectation maximization based method. Our proposed method leads to a much lower failure rate in our experiments. The overall segmentation results are considerably improved when compared to using a non-subject-specific, population average atlas. Additionally, we have incorporated diffusion information obtained from Diffusion Tensor Images (DTI) to improve the detection of white matter that is not visible at this early age in structural MRI (sMRI) due to a lack of myelination. Although this necessitates the acquisition of an additional sequence, the diffusion information improves the white matter segmentation throughout the brain, especially for the mid-brain structures such as the corpus callosum and the internal capsule.
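
    The single-atlas expectation-maximization step can be sketched on synthetic 1D intensities as follows; the class means, atlas sharpness, and noise level are invented for illustration and are not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(3)
N, K = 10000, 3                      # voxels, tissue classes (e.g. WM/GM/CSF)

true_means = np.array([0.2, 0.5, 0.8])
labels = rng.integers(0, K, N)
intensity = rng.normal(true_means[labels], 0.07)

# atlas prior: the correct class is favoured, softened to mimic
# registration error in a subject-specific atlas
atlas = np.full((N, K), 0.15)
atlas[np.arange(N), labels] = 0.7

mu, sigma = np.array([0.3, 0.5, 0.7]), np.full(K, 0.2)   # crude init
for it in range(20):
    # E-step: responsibility = atlas prior x Gaussian likelihood, normalised
    lik = np.exp(-0.5 * ((intensity[:, None] - mu) / sigma) ** 2) / sigma
    post = atlas * lik
    post /= post.sum(axis=1, keepdims=True)
    # M-step: re-estimate class means and standard deviations
    w = post.sum(axis=0)
    mu = (post * intensity[:, None]).sum(axis=0) / w
    sigma = np.sqrt((post * (intensity[:, None] - mu) ** 2).sum(axis=0) / w)

print("estimated class means:", np.round(mu, 3))   # ~ [0.2, 0.5, 0.8]
```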

  5. Visualization of computer architecture simulation data for system-level design space exploration

    NARCIS (Netherlands)

    Taghavi, T.; Thompson, M.; Pimentel, A.D.

    2009-01-01

    System-level computer architecture simulations create large volumes of simulation data to explore alternative architectural solutions. Interpreting and drawing conclusions from this amount of simulation results can be extremely cumbersome. In other domains that also struggle with interpreting large…

  6. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    …immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  7. Matched experimental and computational simulations of paintball eye impacts.

    Science.gov (United States)

    Kennedy, Eric A; Stitzel, Joel D; Duma, Stefan M

    2008-01-01

    Over 1200 paintball-related eye injuries are treated every year in US emergency departments. These injuries range from irritation caused by paint splatter in the eye to catastrophic rupture of the globe. Using the Virginia Tech - Wake Forest University Eye Model, experimental paintball impacts were replicated and the experimental and computational results compared. A total of 10 paintball impacts were conducted at velocities ranging from 71.1 m/s to 112.5 m/s. All experimental tests resulted in rupture of the globe. The matched computational simulations also predicted near-failure or failure in each case, with a maximum principal stress greater than 22.8 MPa in all scenarios and over 23 MPa for velocities above 73 m/s; the failure stress for the VT-WFU Eye Model is defined as 23 MPa. The current regulation velocity for paintballs of 91 m/s exceeds the tolerance of the eye to globe rupture and underscores the importance of eyewear in this sport.

  8. Optimization of suspension smelting technology by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lilius, K.; Jokilaakso, A.; Ahokainen, T.; Teppo, O.; Yang Yongxiang [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Materials Processing and Powder Metallurgy

    1996-12-31

    An industrial-scale flash smelting furnace and waste-heat boilers have been modelled by using commercial Computational-Fluid-Dynamics software. The work has proceeded from cold gas flow to heat transfer, combustion, and two-phase flow simulations. In the present study, the modelling task has been divided into three sub-models: (1) the concentrate burner, (2) the flash smelting furnace (reaction shaft and uptake shaft), and (3) the waste-heat boiler. For the concentrate burner, the flow of the process gas and distribution air together with the concentrate or a feed mixture was simulated. An Eulerian-Eulerian approach was used for the carrier gas-phase and the dispersed particle-phase. A large parametric study was carried out by simulating a laboratory scale burner with varying turbulence intensities and then extending the simulations to the industrial scale model. For the flash smelting furnace, the simulation work concentrated on gas and gas-particle two-phase flows, as well as the development of a combustion model for sulphide concentrate particles. Both Eulerian and Lagrangian approaches have been utilised in describing the particle phase and the spreading of the concentrate in the reaction shaft, and the particle tracks have been obtained. Combustion of sulphides was first approximated with gaseous combustion by using a built-in combustion model of the software. The real oxidation reactions of the concentrate particles were then coded as a user-defined sub-routine and tested with industrial flash smelting cases. For the waste-heat boiler, both flow and heat transfer calculations have been carried out for an old boiler and a modified boiler. (SULA 2 Research Programme; 23 refs.)

  9. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    Science.gov (United States)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited by the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared on performance using benchmark software, and the metric was Floating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the…

  10. Computer Simulation of Hydraulic Systems with Typical Nonlinear Characteristics

    Directory of Open Access Journals (Sweden)

    D. N. Popov

    2017-01-01

    The task was to synthesise an adjustable hydraulic system structure whose mathematical model takes into account its inherent nonlinearity. The proposed solution uses successive computer simulations, starting with a linearized stable hydraulic system structure, which is then complicated by including the essentially non-linear elements. The hydraulic system thus obtained may fail the Lyapunov stability criterion and be unstable; this can be remedied through correcting elements. The results of correction are checked against the form of the transient processes produced by stepwise variation of the control signal. Computer simulation of a throttle-controlled electrohydraulic servo drive with a rotary output element illustrates the application of the proposed method. A constant-pressure power source feeds fluid to the drive. For the drive simulation the following models were used: the linear model, a model taking into account the non-linearity of the flow-dynamic characteristics of a spool-type valve, and non-linear models that account for the dry friction in the spool-type valve and the backlash in the steering angle sensor of the motor shaft. The paper shows the possibility of damping oscillations caused by variable hydrodynamic forces by introducing a correction device. The attached list of references contains 16 sources, which were used to justify and explain certain aspects of automatic control theory and the fluid mechanics of unsteady flows. The article presents 6 block diagrams of the electrohydraulic servo drive and the corresponding transient processes that were studied.

  11. Computer Simulation of Embryonic Systems: What can a ...

    Science.gov (United States)

    (1) Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher order-predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems Toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative pr…

  12. Workbench for the computer simulation of underwater gated viewing systems

    Science.gov (United States)

    Braesicke, K.; Wegner, D.; Repasi, E.

    2017-05-01

    In this paper we introduce a software tool for image-based computer simulation of an underwater gated viewing system. This development is helpful as a tool for assessing a possible deployment of a gated viewing camera for underwater imagery. We show the modular structure of the implemented input parameter sets for camera, laser, and environment description, together with application examples of the software tool. The whole simulation includes the scene illumination by a laser pulse, with its energy, pulse form, and length, as well as the propagation of the light through open water, taking into account complex optical properties of the environment. The scene is modeled as a geometric shape with diverse reflective areas and optical surface properties, submerged in open water. The software is based on a camera model including image degradation due to diffraction, lens transmission, and detector efficiency, and image enhancement by digital signal processing. We show simulation results for some example configurations. Finally, we discuss the limits of our method and give an outlook on future development.

  13. Value stream mapping in a computational simulation model

    Directory of Open Access Journals (Sweden)

    Ricardo Becker Mendes de Oliveira

    2014-08-01

    The decision-making process has been extensively studied by researchers and executives. This paper aims to use the methodology of Value Stream Mapping (VSM) in an integrated manner with a computer simulation model, in order to broaden managers' decision-making vision. The object of study is a production system involving an automatic product-packaging process, in which changes had to be implemented to accommodate new products, so that detecting bottlenecks and visualizing the impacts of future modifications became necessary. The simulation aims to support managers' decisions, considering that the system involves several variables whose behaviors define the complexity of the process. The main results were a significant reduction in project costs, achieved by anticipating system behavior, together with the Value Stream Mapping results identifying which activities add value to the process and which do not. The simulation model is validated against the current map of the system and by including Kaizen events, so that waste in future maps is found in a practical and reliable way that can support decision-making.

  14. Protein adsorption on nanoparticles: model development using computer simulation.

    Science.gov (United States)

    Shao, Qing; Hall, Carol K

    2016-10-19

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter  =  5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
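
    As an illustration of the model-fitting step described above, the following sketch fits Langmuir and Freundlich isotherms to hypothetical coverage data with SciPy; the data points and initial guesses are made up, not simulation results from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    """Langmuir isotherm: saturating monolayer adsorption."""
    return q_max * K * c / (1.0 + K * c)

def freundlich(c, Kf, n):
    """Freundlich isotherm: empirical power law."""
    return Kf * c ** (1.0 / n)

conc = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])            # mM
coverage = np.array([0.21, 0.35, 0.52, 0.62, 0.68, 0.72])  # hypothetical

for model, p0, names in [(langmuir, (1.0, 1.0), ("q_max", "K")),
                         (freundlich, (0.3, 2.0), ("Kf", "n"))]:
    popt, _ = curve_fit(model, conc, coverage, p0=p0)
    resid = coverage - model(conc, *popt)
    print(model.__name__, dict(zip(names, np.round(popt, 3))),
          "RMS residual:", round(float(np.sqrt((resid ** 2).mean())), 4))
```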

  15. Protein adsorption on nanoparticles: model development using computer simulation

    Science.gov (United States)

    Shao, Qing; Hall, Carol K.

    2016-10-01

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter  =  5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.

  16. Computational simulation of bone fracture healing under inverse dynamisation.

    Science.gov (United States)

    Wilson, Cameron J; Schütz, Michael A; Epari, Devakara R

    2017-02-01

    Adaptive finite element models have allowed researchers to test hypothetical relationships between the local mechanical environment and the healing of bone fractures. However, their predictive power has not yet been demonstrated by testing hypotheses ahead of experimental testing. In this study, an established mechano-biological scheme was used in an iterative finite element simulation of sheep tibial osteotomy healing under a hypothetical fixation regime, "inverse dynamisation". Tissue distributions, interfragmentary movement and stiffness across the fracture site were compared between stiff and flexible fixation conditions and scenarios in which fixation stiffness was increased at a discrete time-point. The modelling work was conducted blind to the experimental study to be published subsequently. The simulations predicted the fastest and most direct healing under constant stiff fixation, and the slowest healing under flexible fixation. Although low fixation stiffness promoted more callus formation prior to bridging, this conferred little additional stiffness to the fracture in the first 5 weeks. Thus, while switching to stiffer fixation facilitated rapid subsequent bridging of the fracture, no advantage of inverse dynamisation could be demonstrated. In vivo data remains necessary to conclusively test this treatment protocol and this will, in turn, provide an evaluation of the model's performance. The publication of both hypotheses and their computational simulation, prior to experimental testing, offers an appealing means to test the predictive power of mechano-biological models.

  17. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-04-30

    This is the sixth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of our IGCC workbench. Preliminary CFD simulations for single stage and two stage "generic" gasifiers using firing conditions based on the Vision 21 reference configuration have been performed. Work is continuing on implementing an advanced slagging model into the CFD based gasifier model. An investigation into published gasification kinetics has highlighted a wide variance in predicted performance due to the choice of kinetic parameters. A plan has been outlined for developing the reactor models required to simulate the heat transfer and gas clean up equipment downstream of the gasifier. Three models that utilize the CCA software protocol have been integrated into a version of the IGCC workbench. Tests of a CCA implementation of our CFD code into the workbench demonstrated that the CCA CFD module can execute on a geographically remote PC (linked via the Internet) in a manner that is transparent to the user. Software tools to create "walk-through" visualizations of the flow field within a gasifier have been demonstrated.

  18. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison

    2002-01-31

    This is the fifth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, our efforts have become focused on developing an improved workbench for simulating a gasifier based Vision 21 energyplex. To provide for interoperability of models developed under Vision 21 and other DOE programs, discussions have been held with DOE and other organizations developing plant simulator tools to review the possibility of establishing a common software interface or protocol to use when developing component models. A component model that employs the CCA protocol has successfully been interfaced to our CCA enabled workbench. To investigate the software protocol issue, DOE has selected a gasifier based Vision 21 energyplex configuration for use in testing and evaluating the impacts of different software interface methods. A Memorandum of Understanding with the Cooperative Research Centre for Coal in Sustainable Development (CCSD) in Australia has been completed that will enable collaborative research efforts on gasification issues. Preliminary results have been obtained for a CFD model of a pilot scale, entrained flow gasifier. A paper was presented at the Vision 21 Program Review Meeting at NETL (Morgantown) that summarized our accomplishments for Year One and plans for Year Two and Year Three.

  19. Computer simulations of equilibrium magnetization and microstructure in magnetic fluids

    Science.gov (United States)

    Rosa, A. P.; Abade, G. C.; Cunha, F. R.

    2017-09-01

    In this work, Monte Carlo and Brownian Dynamics simulations are developed to compute the equilibrium magnetization of a magnetic fluid under the action of a homogeneous applied magnetic field. The particles are free of inertia and modeled as hard spheres with the same diameters. Two different periodic boundary conditions are implemented: the minimum image method and the Ewald summation technique, replicating a finite number of particles throughout the suspension volume. A comparison of the equilibrium magnetization resulting from the minimum image approach and Ewald sums is performed by using Monte Carlo simulations. The Monte Carlo simulations with minimum image and lattice sums are used to investigate suspension microstructure by computing the important radial pair-distribution function g0(r), which measures the probability density of finding a second particle at a distance r from a reference particle. This function provides relevant information on structure formation and its anisotropy through the suspension. The numerical results of g0(r) are compared with theoretical predictions based on quite a different approach in the absence of the field and dipole-dipole interactions. A very good quantitative agreement is found for a particle volume fraction of 0.15, providing a validation of the present simulations. In general, the investigated suspensions are dominated by structures like dimer and trimer chains, with trimers an order of magnitude less likely to form than dimers. Using Monte Carlo with lattice sums, the density distribution function g2(r) is also examined. Whenever this function is different from zero, it indicates structure-anisotropy in the suspension. The dependence of the equilibrium magnetization on the applied field, the magnetic particle volume fraction, and the magnitude of the dipole-dipole magnetic interactions is explored for both boundary conditions. Results show that at dilute regimes and with moderate dipole…
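
    A bare-bones version of the pair-distribution computation with the minimum image convention (random rather than equilibrated configurations, and no Ewald or charged-cloud sums) might look like this; the particle count and box size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
N, L = 200, 10.0                       # particles, cubic box side
pos = rng.uniform(0.0, L, size=(N, 3))

nbins, r_max = 50, L / 2
hist = np.zeros(nbins)
for i in range(N - 1):
    d = pos[i + 1:] - pos[i]
    d -= L * np.rint(d / L)            # minimum image convention
    r = np.linalg.norm(d, axis=1)
    hist += np.histogram(r[r < r_max], bins=nbins, range=(0, r_max))[0]

edges = np.linspace(0, r_max, nbins + 1)
shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
ideal = shell * N / L ** 3             # expected neighbours for an ideal gas
g = 2.0 * hist / (N * ideal)           # normalise pair counts per particle

print("g(r) in the outermost bins ~ 1:", np.round(g[-5:], 2))
```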

  20. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Wangda [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McNeil, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wetter, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lee, Eleanor S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configurations, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
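
    The matrix chain in question is the three-phase relation i = V T D s (view, transmission, and daylight matrices applied to a sky vector). The NumPy sketch below, with illustrative basis sizes rather than the paper's OpenCL kernels, shows why folding the chain once and batching the sky vectors pays off:

```python
import numpy as np

rng = np.random.default_rng(5)
n_sensors, n_window, n_sky = 100, 145, 2305   # illustrative basis sizes
V = rng.random((n_sensors, n_window))   # window -> sensor view matrix
T = rng.random((n_window, n_window))    # fenestration transmission (BSDF)
D = rng.random((n_window, n_sky))       # sky -> window daylight matrix
S = rng.random((n_sky, 8760))           # one sky vector per hour of a year

# Associativity is the speedup lever: fold V T D once, then apply the
# result to every timestep instead of re-multiplying the chain per hour.
VTD = V @ T @ D
illuminance = VTD @ S
print(illuminance.shape)                # (100, 8760)
```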

  1. Simulation of branching blood flows on parallel computers.

    Science.gov (United States)

    Yue, Xue; Hwang, Feng-Nan; Shandas, Robin; Cai, Xiao-Chuan

    2004-01-01

    We present a fully parallel nonlinearly implicit algorithm for the numerical simulation of some branching blood flow problems, which require efficient and robust solver technologies in order to handle the high nonlinearity and the complex geometry. Parallel processing is necessary because of the large number of mesh points needed to accurately discretize the system of differential equations. In this paper we introduce a parallel Newton-Krylov-Schwarz based implicit method, and software for distributed memory parallel computers, for solving the nonlinear algebraic systems arising from a Q2-Q1 finite element discretization of the incompressible Navier-Stokes equations that we use to model the blood flow in the left anterior descending coronary artery.
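
    For a flavor of the inexact-Newton approach (without the Schwarz preconditioner or the Navier-Stokes discretization of the paper), SciPy's newton_krylov can solve a small nonlinear boundary-value problem standing in for the flow equations:

```python
import numpy as np
from scipy.optimize import newton_krylov

n = 100
h = 1.0 / (n + 1)

def residual(u):
    """-u'' + u^3 = 1 with homogeneous Dirichlet boundary conditions."""
    upad = np.concatenate(([0.0], u, [0.0]))
    lap = (upad[2:] - 2.0 * upad[1:-1] + upad[:-2]) / h ** 2
    return -lap + u ** 3 - 1.0

u0 = np.zeros(n)
# Newton outer iteration with a Krylov (LGMRES) inner linear solver
sol = newton_krylov(residual, u0, method="lgmres", verbose=False)
print("max |residual| =", np.abs(residual(sol)).max())
```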

  2. Application of Computer Simulation Modeling to Medication Administration Process Redesign

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2012-01-01

    The medication administration process (MAP) is one of the most high-risk processes in health care. MAP workflow redesign can precipitate both unanticipated and unintended consequences that can lead to new medication safety risks and workflow inefficiencies. Thus, it is necessary to have a tool to evaluate the impact of redesign approaches in advance of their clinical implementation. This paper discusses the development of an agent-based MAP computer simulation model that can be used to assess the impact of MAP workflow redesign on MAP performance. The agent-based approach is adopted in order to capture Registered Nurse medication administration performance. The process of designing, developing, validating, and testing such a model is explained. Work is underway to collect MAP data in a hospital setting to provide more complex MAP observations to extend development of the model to better represent the complexity of MAP.

  3. Computer simulation of an industrial wastewater treatment process

    Energy Technology Data Exchange (ETDEWEB)

    Jenke, D.R.; Diebold, F.E.

    1985-01-01

    The computer program REDEQL.EPAK has been modified to allow for the prediction and simulation of the chemical effects of mixing 2 or more aqueous solutions and one or more solid phases. In this form the program is capable of modelling the lime neutralisation treatment process for acid mine waters. The program calculates the speciation of all influent solutions, evaluates the equilibrium composition of any mixed solution and provides the stoichiometry of the liquid and solid phases produced as a result of the mixing. The program is used to predict the optimum treatment effluent composition, to determine the amount of neutralising agent (lime) required to produce this optimum composition and to provide information which defines the mechanism controlling the treatment process.

  4. Agent-based computer simulations of language choice dynamics.

    Science.gov (United States)

    Hadzibeganovic, Tarik; Stauffer, Dietrich; Schulze, Christian

    2009-06-01

    We use agent-based Monte Carlo simulations to address the problem of language choice dynamics in a tripartite community that is linguistically homogeneous but politically divided. We observe the process of nonlocal pattern formation that causes populations to self-organize into stable antagonistic groups as a result of the local dynamics of attraction and influence between individual computational agents. Our findings uncover some of the unique properties of opinion formation in social groups when the process is affected by asymmetric noise distribution, unstable intergroup boundaries, and different migratory behaviors. Although we focus on one particular study, the proposed stochastic dynamic models can be easily generalized and applied to investigate the evolution of other complex and nonlinear features of human collective behavior.
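
    A minimal lattice sketch of the attraction-plus-noise dynamics is given below; the lattice size, noise level, and majority-copy update rule are illustrative stand-ins for the models of the paper.

```python
import random

random.seed(6)
N, STEPS, NOISE = 40, 200_000, 0.01
# three language/opinion groups on an N x N periodic lattice
grid = [[random.choice((0, 1, 2)) for _ in range(N)] for _ in range(N)]

def neighbours(i, j):
    return [grid[(i - 1) % N][j], grid[(i + 1) % N][j],
            grid[i][(j - 1) % N], grid[i][(j + 1) % N]]

for _ in range(STEPS):
    i, j = random.randrange(N), random.randrange(N)
    if random.random() < NOISE:            # spontaneous switching (noise)
        grid[i][j] = random.choice((0, 1, 2))
    else:                                  # attraction: copy the local majority
        nb = neighbours(i, j)
        grid[i][j] = max(set(nb), key=nb.count)

counts = [sum(row.count(k) for row in grid) for k in (0, 1, 2)]
print("final group sizes:", counts)
```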

  5. Computer simulation of randomly cross-linked polymer networks

    CERN Document Server

    Williams, T P

    2002-01-01

    In this work, Monte Carlo and Stochastic Dynamics computer simulations of mesoscale model randomly cross-linked networks were undertaken. Task parallel implementations of the lattice Monte Carlo Bond Fluctuation model and Kremer-Grest Stochastic Dynamics bead-spring continuum model were designed and used for this purpose. Lattice and continuum precursor melt systems were prepared and then cross-linked to varying degrees. The resultant networks were used to study structural changes during deformation and relaxation dynamics. The effects of a random network topology, featuring a polydisperse distribution of strand lengths and an abundance of pendant chain ends, were qualitatively compared to recently published work. A preliminary investigation into the effects of temperature on the structural and dynamical properties was also undertaken. Structural changes during isotropic swelling and uniaxial deformation revealed a pronounced non-affine deformation dependent on the degree of cross-linking. Fractal heterogeneiti...

  6. Simulating Smoke Filling in Big Halls by Computational Fluid Dynamics

    Directory of Open Access Journals (Sweden)

    W. K. Chow

    2011-01-01

    Many tall halls of large volume have been built, and more are to be built, in construction projects in the Far East, particularly Mainland China, Hong Kong, and Taiwan. Smoke is identified as the key hazard to handle, and consequently smoke exhaust systems are specified in the fire codes of those areas. This paper presents an update on applying Computational Fluid Dynamics (CFD) in smoke exhaust design. Key points to note in CFD simulations of smoke filling due to a fire in a big hall are discussed. Mathematical aspects concerning the discretization of the partial differential equations and the algorithms for solving the velocity-pressure linked equations are briefly outlined. Results predicted by CFD with different free boundary conditions are compared with those from room fire tests. Standards on grid size, relaxation factors, convergence criteria, and false diffusion should be set up for numerical experiments with CFD.

  7. COMPUTER EMULATORS AND SIMULATORS OF MEASURING INSTRUMENTS IN PHYSICS LESSONS

    Directory of Open Access Journals (Sweden)

    Yaroslav Yu. Dyma

    2010-10-01

    A prominent feature of educational physics experiments at the present stage is the use of computer equipment and special software: virtual measuring instruments. The purpose of this article is to explain when virtual instruments can be used to carry out a real experiment (in which case they are emulators) and when a virtual one (in which case they are simulators). For better understanding, the implementation of one laboratory experiment using both types of software is given. Since, in learning physics, preference should be given to carrying out natural experiments that study real phenomena and measure real physical quantities, the examination of emulator programs for measuring instruments appears the most promising direction for their further introduction into the educational process.

  8. Computer simulation of cluster impact induced electronic excitation of solids

    Energy Technology Data Exchange (ETDEWEB)

    Weidtmann, B.; Hanke, S.; Duvenbeck, A. [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany); Wucher, A., E-mail: andreas.wucher@uni-deu.de [Fakultät für Physik, Universität Duisburg-Essen, 47048 Duisburg (Germany)

    2013-05-15

    We present a computational study of electronic excitation upon bombardment of a metal surface with cluster projectiles. Our model employs a molecular dynamics (MD) simulation to calculate the particle dynamics following the projectile impact. Kinetic excitation is implemented via two mechanisms describing the electronic energy loss of moving particles: autoionization in close binary collisions and a velocity proportional friction force resulting from direct atom–electron collisions. Two different friction models are compared with respect to the predicted sputter yields after single atom and cluster bombardment. We find that a density dependent friction coefficient leads to a significant reduction of the total energy transferred to the electronic sub-system as compared to the Lindhard friction model, thereby strongly enhancing the predicted sputter yield under cluster bombardment conditions. In contrast, the yield predicted for monoatomic projectile bombardment remains practically unchanged.
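
    The velocity-proportional friction mechanism can be sketched in a toy velocity-Verlet loop as follows; interatomic forces are omitted, and the constant (Lindhard-like) coefficient gamma is illustrative, whereas the paper's density-dependent variant would make it a function of the local electron density.

```python
import numpy as np

rng = np.random.default_rng(7)
n, dt, gamma, mass = 64, 1e-3, 0.05, 1.0
pos = rng.random((n, 3)) * 10.0
vel = rng.normal(0.0, 1.0, (n, 3))

def forces(p):
    return np.zeros_like(p)    # interatomic forces omitted in this toy

f = forces(pos)
electronic_loss = 0.0
for step in range(1000):
    # velocity-Verlet half-kicks with an added electronic friction term
    vel += 0.5 * dt * (f - gamma * vel) / mass
    pos += dt * vel
    f = forces(pos)
    vel += 0.5 * dt * (f - gamma * vel) / mass
    # approximate bookkeeping of kinetic energy fed into the electrons
    electronic_loss += gamma * (vel ** 2).sum() * dt

print(f"kinetic energy left: {0.5 * mass * (vel ** 2).sum():.3f}, "
      f"transferred to electrons: {electronic_loss:.3f}")
```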

  9. Mixed-Language High-Performance Computing for Plasma Simulations

    Directory of Open Access Journals (Sweden)

    Quanming Lu

    2003-01-01

    Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java Native Interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.

  10. Computer simulation of chemical reactions in porous materials

    Science.gov (United States)

    Turner, Christoffer Heath

    Understanding reactions in nanoporous materials from a purely experimental perspective is a difficult task. Measuring the chemical composition of a reacting system within a catalytic material is usually only accomplished through indirect methods, and it is usually impossible to distinguish between true chemical equilibrium and metastable states. In addition, measuring molecular orientation or distribution profiles within porous systems is not easily accomplished. However, molecular simulation techniques are well-suited to these challenges. With appropriate simulation techniques and realistic molecular models, it is possible to validate the dominant physical and chemical forces controlling nanoscale reactivity. Novel nanostructured catalysts and supports can be designed, optimized, and tested using high-performance computing and advanced modeling techniques in order to guide the search for next-generation catalysts---setting new targets for the materials synthesis community. We have simulated the conversion of several different equilibrium-limited reactions within microporous carbons and we find that the pore size, pore geometry, and surface chemistry are important factors for determining the reaction yield. The equilibrium-limited reactions that we have modeled include nitric oxide dimerization, ammonia synthesis, and the esterification of acetic acid, all of which show yield enhancements within microporous carbons. In conjunction with a yield enhancement of the esterification reaction, selective adsorption of ethyl acetate within carbon micropores demonstrates an efficient method for product recovery. Additionally, a new method has been developed for simulating reaction kinetics within porous materials and other heterogeneous environments. The validity of this technique is first demonstrated by reproducing the kinetics of hydrogen iodide decomposition in the gas phase, and then predictions are made within slit-shaped carbon pores and carbon nanotubes. The rate…

  11. Petascale computation of multi-physics seismic simulations

    Science.gov (United States)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.

    2017-04-01

    Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we show simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high frequency ground motion. The simulations combine a multitude of representations of model complexity; such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes and direct interfaces to community standard data formats. All these factors combine to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential

  12. Computer simulations for minds-on learning with ``Project Spectra!''

    Science.gov (United States)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions to bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  13. Filter wheel equalization for chest radiography: a computer simulation.

    Science.gov (United States)

    Boone, J M; Duryea, J; Steiner, R M

    1995-07-01

    A chest radiographic equalization system using lung-shaped templates mounted on filter wheels is under development. Using this technique, 25 lung templates for each lung are available on two computer controlled wheels which are located in close proximity to the x-ray tube. The large magnification factor (> 10X) of the templates assures low-frequency equalization due to the blurring of the focal spot. A low-dose image is acquired without templates using a (generic) digital receptor, the image is analyzed, and the left and right lung fields are automatically identified using software developed for this purpose. The most appropriate left and right lung templates are independently selected and are positioned into the field of view at the proper location under computer control. Once the templates are positioned, acquisition of the equalized radiographic image onto film commences at clinical exposure levels. The templates reduce the exposure to the lung fields by attenuating a fraction of the incident x-ray fluence so that the exposure to the mediastinum and diaphragm areas can be increased without overexposing the lungs. A data base of 824 digitized chest radiographs was used to determine the shape of the specific lung templates, for both left and right lung fields. A second independent data base of 208 images was used to test the performance of the templates using computer simulations. The template shape characteristics derived from the clinical image data base are demonstrated. The detected exposure in the lung fields on conventional chest radiographs was found to be, on average, three times the detected exposure behind the diaphragm and mediastinum.(ABSTRACT TRUNCATED AT 250 WORDS)

  14. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim

    2004-01-28

    This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.

  15. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    Science.gov (United States)

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  16. Ultrasound fusion image error correction using subject-specific liver motion model and automatic image registration.

    Science.gov (United States)

    Yang, Minglei; Ding, Hui; Zhu, Lei; Wang, Guangzhi

    2016-12-01

    Ultrasound fusion imaging is an emerging tool and benefits a variety of clinical applications, such as image-guided diagnosis and treatment of hepatocellular carcinoma and unresectable liver metastases. However, respiratory liver motion-induced misalignment of multimodal images (i.e., fusion error) compromises the effectiveness and practicability of this method. The purpose of this paper is to develop a subject-specific liver motion model and automatic registration-based method to correct the fusion error. An online-built subject-specific motion model and an automatic image registration method for 2D ultrasound-3D magnetic resonance (MR) images were combined to compensate for the respiratory liver motion. The key steps included: 1) Build a subject-specific liver motion model for the current subject online and perform the initial registration of pre-acquired 3D MR and intra-operative ultrasound images; 2) During fusion imaging, compensate for liver motion first using the motion model, and then use an automatic registration method to further correct the respiratory fusion error. Evaluation experiments were conducted on a liver phantom and five subjects. In the phantom study, the fusion error (superior-inferior axis) was reduced from 13.90±2.38mm to 4.26±0.78mm by using the motion model only. The fusion error further decreased to 0.63±0.53mm by using the registration method. The registration method also decreased the rotation error from 7.06±0.21° to 1.18±0.66°. In the clinical study, the fusion error was reduced from 12.90±9.58mm to 6.12±2.90mm by using the motion model alone. Moreover, the fusion error decreased to 1.96±0.33mm by using the registration method. The proposed method can effectively correct the respiration-induced fusion error to improve the fusion image quality. This method can also reduce the error correction dependency on the initial registration of ultrasound and MR images. Overall, the proposed method can improve the clinical practicability of

  17. Computer simulation of viral-assembly and translocation

    Science.gov (United States)

    Mahalik, Jyoti Prakash

    We investigated four different problems using coarse-grained computational models: self-assembly of a single-stranded (ss) DNA virus, ejection dynamics of double-stranded (ds) DNA from phages, translocation of ssDNA through the MspA protein pore, and segmental dynamics of a polymer translocating through a synthetic nanopore. In the first part of the project, we investigated the self-assembly of a virus with and without its genome. A coarse-grained model was proposed for the viral subunit proteins and the genome (ssDNA). Langevin dynamics simulation and the replica exchange method were used to determine the kinetics and energetics of the self-assembly process, respectively. The self-assembly follows a nucleation-growth kind of mechanism. The ssDNA plays a crucial role in the self-assembly by acting as a template and enhancing the local concentration of the subunits. The presence of the genome does not change the mechanism of the self-assembly, but it reduces the nucleation time and enhances the growth rate by almost an order of magnitude. The second part of the project involves the investigation of the dynamics of the ejection of dsDNA from phages. A coarse-grained model was used for the phage and dsDNA. Langevin dynamics simulation was used to investigate the kinetics of the ejection. The ejection is a stochastic process, and slow intermediate-rate kinetics was observed for most ejection trajectories. We discovered that jamming of the DNA at the pore mouth at high packing fraction and for a disordered system is the reason for the intermediate slow kinetics. The third part of the project involves translocation of ssDNA through the MspA protein pore. The MspA protein pore has potential for genome sequencing because of its ability to clearly distinguish the four different nucleotides based on their blockade currents, but it is a challenge to use this pore for any practical application because of the very fast translocation time. We resolved the state of DNA translocation
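
    The Langevin dynamics used throughout these studies amounts to a per-bead stochastic update combining conservative forces, drag, and thermal noise. A minimal sketch (not the author's code; parameters are arbitrary reduced units):

    ```python
    import numpy as np

    def langevin_step(x, v, m, dt, gamma, kT, force, rng):
        """One explicit Langevin step for an array of coarse-grained beads.

        The random-force variance follows the fluctuation-dissipation
        relation, <R^2> = 2*gamma*kT/dt per component, in this simple
        Euler-type discretisation.
        """
        R = rng.normal(0.0, np.sqrt(2.0 * gamma * kT / dt), size=x.shape)
        a = (force(x) - gamma * v + R) / m
        v = v + dt * a
        x = x + dt * v
        return x, v

    # Toy usage: 100 beads relaxing in a harmonic confinement.
    rng = np.random.default_rng(1)
    x = rng.normal(size=(100, 3))
    v = np.zeros_like(x)
    harmonic = lambda x: -x  # stand-in for subunit/DNA interaction forces
    for _ in range(1000):
        x, v = langevin_step(x, v, m=1.0, dt=0.005, gamma=1.0, kT=1.0,
                             force=harmonic, rng=rng)
    ```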

  18. Subject-specific modelling of lower limb muscles in children with cerebral palsy.

    Science.gov (United States)

    Oberhofer, K; Stott, N S; Mithraratne, K; Anderson, I A

    2010-01-01

    Recent studies suggest that the architecture of spastic muscles in children with cerebral palsy is considerably altered; however, little is known about the structural changes that occur other than in the gastrocnemius muscle. In the present study, Magnetic Resonance Imaging (MRI) and subject-specific modelling techniques were used to compare the lengths and volumes of six lower limb muscles between children with cerebral palsy and typically developing children. MRI scans of the lower limbs of two children with spastic hemiplegia cerebral palsy, four children with spastic diplegia cerebral palsy (mean age 9.6 years) and a group of typically developing children (mean age 10.2 years) were acquired. Subject-specific models of six lower limb muscles were developed from the MRI data using a technique called Face Fitting. Muscle volumes and muscle lengths were derived from the models and normalised to body mass and segmental lengths, respectively. Normalised muscle volumes in the children with cerebral palsy were smaller than in the control group, with the difference being 22% in the calf muscles, 26% in the hamstrings and 22% in the quadriceps. Only the differences in the hamstrings and the quadriceps were statistically significant (P=0.036, P=0.038). Normalised muscle lengths in the children with cerebral palsy were significantly shorter than in the typically developing children. The present results show that lower limb muscles in ambulatory children with cerebral palsy are significantly altered, suggesting an overall mechanical deficit due to predominant muscle atrophy. Further investigations of the underlying causes of the muscle atrophy are required to better define management and treatment strategies for children with cerebral palsy.

  19. Validation of subject-specific cardiovascular system models from porcine measurements.

    Science.gov (United States)

    Revie, James A; Stevenson, David J; Chase, J Geoffrey; Hann, Christopher E; Lambermont, Bernard C; Ghuysen, Alexandre; Kolh, Philippe; Shaw, Geoffrey M; Heldmann, Stefan; Desaive, Thomas

    2013-02-01

    A previously validated mathematical model of the cardiovascular system (CVS) is made subject-specific using an iterative, proportional gain-based identification method. Prior works utilised a complete set of experimentally measured data that is not clinically typical or applicable. In this paper, parameters are identified using proportional gain-based control and a minimal, clinically available set of measurements. The new method makes use of several intermediary steps through identification of smaller compartmental models of the CVS to reduce the number of parameters identified simultaneously and increase the convergence stability of the method. This new, clinically relevant, minimal measurement approach is validated using a porcine model of acute pulmonary embolism (APE). Trials were performed on five pigs, each inserted with three autologous blood clots of decreasing size over a period of four to five hours. All experiments were reviewed and approved by the Ethics Committee of the Medical Faculty at the University of Liege, Belgium. Continuous aortic and pulmonary artery pressures (P(ao), P(pa)) were measured along with left and right ventricle pressure and volume waveforms. Subject-specific CVS models were identified from global end diastolic volume (GEDV), stroke volume (SV), P(ao), and P(pa) measurements, with the mean volumes and maximum pressures of the left and right ventricles used to verify the accuracy of the fitted models. The inputs (GEDV, SV, P(ao), P(pa)) used in the identification process were closely matched by the CVS model, and pressures not used to fit the model agreed with experimental measurements to median absolute errors of 4.3% and 4.4%, which are equivalent to the measurement errors of currently used monitoring devices in the ICU (∼5-10%). These results validate the potential for implementing this approach in the intensive care unit. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
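
    The identification method itself is easy to sketch: each parameter is nudged in proportion to the relative error between one measured quantity and its model-predicted counterpart until the outputs converge. A schematic sketch, assuming one gain-to-output pairing per parameter (the pairing and the multiplicative update are illustrative, not the authors' exact scheme):

    ```python
    def identify(params, targets, simulate, gains, tol=1e-3, max_iter=200):
        """Iterative proportional gain-based parameter identification (schematic).

        params  : dict of model parameters to identify (e.g. resistances, elastances)
        targets : dict mapping output name -> measured value (e.g. 'SV', 'Pao')
        simulate: function params -> dict of model outputs with the same keys
        gains   : dict mapping parameter name -> (output name, proportional gain)
        """
        for _ in range(max_iter):
            out = simulate(params)
            errs = {k: (targets[k] - out[k]) / targets[k] for k in targets}
            if max(abs(e) for e in errs.values()) < tol:
                break
            for p, (k, g) in gains.items():
                # Multiplicative update pushes the paired output toward its target.
                params[p] *= 1.0 + g * errs[k]
        return params
    ```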

  20. Supercoiled DNA energetics and dynamics by computer simulation.

    Science.gov (United States)

    Schlick, T; Olson, W K

    1992-02-20

    A new formulation is presented for investigating supercoiled DNA configurations by deterministic techniques. Thus far, the computational difficulties involved in applying deterministic methods to supercoiled DNA studies have generally limited computer simulations to stochastic approaches. While stochastic methods, such as simulated annealing and Metropolis-Monte Carlo sampling, are successful at generating a large number of configurations and estimating thermodynamic properties of topoisomer ensembles, deterministic methods offer an accurate characterization of the minima and a systematic following of their dynamics. To make this feasible, we model circular duplex DNA compactly by a B-spline ribbon-like model in terms of a small number of control vertices. We associate an elastic deformation energy composed of bending and twisting integrals and represent intrachain contact by a 6-12 Lennard-Jones potential. The latter is parameterized to yield an energy minimum at the observed DNA-helix diameter inclusive of a hydration shell. A penalty term to ensure fixed contour length is also included. First and second partial derivatives of the energy function have been derived by using various mathematical simplifications. First derivatives are essential for Newton-type minimization as well as molecular dynamics, and partial second-derivative information can significantly accelerate minimization convergence through preconditioning. Here we apply a new large-scale truncated-Newton algorithm for minimization and a Langevin/implicit-Euler scheme for molecular dynamics. Our truncated-Newton method exploits the separability of potential energy functions into terms of differing complexity. It relies on a preconditioned conjugate gradient method that is efficient for large-scale problems to solve approximately for the search direction at every step. Our dynamics algorithm is numerically stable over large time steps. It also introduces a frequency-discriminating mechanism so that
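
    The parameterization of the 6-12 Lennard-Jones contact term mentioned above is a one-liner: the standard form has its minimum at r = 2^(1/6)·σ, so choosing σ from the observed hydrated helix diameter places the minimum exactly there. A sketch with an illustrative diameter and well depth:

    ```python
    import numpy as np

    D_EFF = 2.5  # illustrative effective DNA diameter incl. hydration shell, nm
    EPS = 1.0    # well depth, arbitrary energy units (assumption)
    SIGMA = D_EFF / 2.0 ** (1.0 / 6.0)  # puts the energy minimum at r = D_EFF

    def lj_contact(r):
        """6-12 Lennard-Jones intrachain contact energy."""
        sr6 = (SIGMA / r) ** 6
        return 4.0 * EPS * (sr6 * sr6 - sr6)

    # The minimum value -EPS is attained exactly at the target diameter.
    assert abs(lj_contact(D_EFF) + EPS) < 1e-12
    ```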

  1. "Simulated molecular evolution" or computer-generated artifacts?

    Science.gov (United States)

    Darius, F; Rojas, R

    1994-11-01

    1. The authors define a function with value 1 for the positive examples and 0 for the negative ones. They fit a continuous function but do not deal at all with the error margin of the fit, which is almost as large as the function values they compute. 2. The term "quality" for the value of the fitted function gives the impression that some biological significance is associated with values of the fitted function strictly between 0 and 1, but there is no justification for this kind of interpretation, and finding the point where the fit achieves its maximum does not make sense. 3. By neglecting the error margin, the authors try to optimize the fitted function using differences in the second, third, fourth, and even fifth decimal places, which have no statistical significance. 4. Even if such a fit could profit from more data points, the authors should first prove that the region of interest has some kind of smoothness, that is, that a continuous fit makes any sense at all. 5. "Simulated molecular evolution" is a misnomer. We are dealing here with random search. Since the margin of error is so large, the fitted function does not provide statistically significant information about the points in search space where strings with cleavage sites could be found. This implies that the method is a highly unreliable stochastic search in the space of strings, even if the neural network is capable of learning some simple correlations. 6. For these kinds of problems with so few data points, classical statistical methods are clearly superior to the neural networks used as a "black box" by the authors, which, in the way they are structured, provide a model with an error margin as large as the numbers being computed. 7. And finally, even if someone were to provide us with a function which separates strings with cleavage sites from strings without them perfectly, so-called simulated molecular evolution would not be better than random selection. Since a perfect fit would only produce exactly ones or

  2. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language

    National Research Council Canada - National Science Library

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-01-01

    .... In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments...

  3. Subglacial sediment mechanics investigated by computer simulation of granular material

    Science.gov (United States)

    Damsgaard, A.; Egholm, D. L.; Tulaczyk, S. M.; Piotrowski, J. A.; Larsen, N. K.; Siegfried, M. R.; Beem, L.; Suckale, J.

    2016-12-01

    The mechanical properties of subglacial sediments are known to directly influence the stability of ice streams and fast-moving glaciers, but existing models of granular sediment deformation are poorly constrained. In addition, upscaling to generalized mathematical models is difficult due to the mechanical nonlinearity of the sediment, internal porosity changes during deformation, and associated structural and kinematic phase transitions. In this presentation, we introduce the Discrete Element Method (DEM) for particle-scale granular simulation. The DEM is fully coupled with fluid dynamics. The numerical method is applied to better understand the mechanical properties of the subglacial sediment and its interaction with meltwater. The computational approach allows full experimental control and offers insights into the internal kinematics, stress distribution, and mechanical stability. During confined shear with variable pore-water pressure, the sediment changes mechanical behavior, from stick, to non-linear creep, and unconstrained failure during slip. These results are contrary to more conventional models of plastic or (non-)linear viscous subglacial soft-bed sliding. Advection of sediment downstream is pressure dependent, which is consistent with theories of unstable bed bump growth. Granular mechanics prove to significantly influence the geometry and hydraulic properties of meltwater channels incised into the subglacial bed. Current models assume that channel bed erosion is balanced by linear-viscous sediment movement. We demonstrate how channel flanks are stabilized by the sediment frictional strength. Additionally, sediment liquefaction proves to be a possible mechanism for causing large and episodic sediment transport by water flow. Though computationally intense, our coupled numerical method provides a framework for quantifying a wide range of subglacial sediment-water processes, which are a key unknown in our ability to model the future evolution of ice
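
    At the core of any DEM of this kind is a per-contact force law. A minimal linear spring-dashpot sketch for the normal direction is given below; the stiffness and damping values are placeholders, and a production code would add tangential friction, particle rotation, and the pore-fluid coupling discussed above.

    ```python
    import numpy as np

    def normal_contact_force(xi, xj, ri, rj, vi, vj, kn=1.0e4, eta=1.0):
        """Linear spring-dashpot normal contact force on grain i from grain j.

        Elastic repulsion grows with the overlap of the two grains; viscous
        damping opposes the normal relative velocity. Returns zero when the
        grains are not in contact.
        """
        d = xj - xi
        dist = np.linalg.norm(d)
        overlap = (ri + rj) - dist
        if overlap <= 0.0:
            return np.zeros_like(xi)
        n = d / dist                 # unit normal pointing from i to j
        vn = np.dot(vj - vi, n)      # normal relative velocity (negative = approaching)
        return -(kn * overlap - eta * vn) * n
    ```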

  4. Computational model for simulation small testing launcher, technical solution

    Energy Technology Data Exchange (ETDEWEB)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro [University POLITEHNICA of Bucharest - Research Center for Aeronautics and Space, Str. Ghe Polizu, nr. 1, Bucharest, Sector 1 (Romania); Cristian, Barbu, E-mail: barbucr@mta.ro [Military Technical Academy, Romania, B-dul. George Coşbuc, nr. 81-83, Bucharest, Sector 5 (Romania); Chelaru, Adrian, E-mail: achelaru@incas.ro [INCAS -National Institute for Aerospace Research Elie Carafoli, B-dul Iuliu Maniu 220, 061126, Bucharest, Sector 6 (Romania)

    2014-12-10

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT) used to test spatial equipment and scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different start conditions. The launcher model has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical feasibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. Therefore, while classical suborbital sounding rockets are unguided, use solid-fuel motors for propulsion, and follow an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing the creation of a guided suborbital launcher, which is basically a satellite launcher at a smaller scale, containing its main subsystems. This is why the project itself can be considered an intermediary step in the development of a wider range of launching systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as shown in the title, has two major objectives: first, a short-term objective, which consists in obtaining a suborbital launching system that will be able to go into service in a predictable period of time, and a long-term objective that consists in the development and testing of some unconventional subsystems which will be integrated later into the satellite launcher as part of the European space program. This is why the technical content of the project must be carried out beyond the range of the existing suborbital

  5. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    Scientific investigation can be broadly grouped into three domains: experimental, theoretical, and computational. Modern computational science involves the use of digital computers to solve mathematical models of various physical systems. Experimental science involves direct...

  6. Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators

    National Research Council Canada - National Science Library

    Ling, Hao

    2000-01-01

    This report summarizes the scientific progress on the research grant "Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators" during the period 1 December...

  7. Annual Report on Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators

    National Research Council Canada - National Science Library

    Ling, Hao

    1998-01-01

    This report summarizes the scientific progress on the research grant "Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators" during the period 1 December...

  8. Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators

    National Research Council Canada - National Science Library

    Ling, Hao

    1999-01-01

    This report summarizes the scientific progress on the research grant "Application of Model-Based Signal Processing Methods to Computational Electromagnetics Simulators" during the period 1 December...

  9. An Instructional Theory for the Design of Computer-Based Simulations.

    Science.gov (United States)

    Reigeluth, Charles M.; Schwartz, Ellen

    1989-01-01

    Description of computer-based simulations focuses on the instructional functions of simulations and phases in the learning processes that should be activated by educational simulations. Highlights include a general model for simulation design; degree of learner control; complexity of the content; the role of the learner; and motivational…

  10. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    Science.gov (United States)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  11. 3D Computational Simulation of Calcium Leaching in Cement Matrices

    Directory of Open Access Journals (Sweden)

    Gaitero, J. J.

    2014-12-01

    Full Text Available Calcium leaching is a degradation process consisting of the progressive dissolution of the cement paste through migration of calcium atoms to the aggressive solution. It is therefore a complex phenomenon involving several phases and simultaneous dissolution and diffusion processes. In this work, a new computational scheme for the simulation of the degradation process in three dimensions was developed and tested. The toolkit was used to simulate accelerated calcium leaching by a 6M ammonium nitrate solution in cement matrices. The outputs were the three-dimensional representation of the matrix and the physicochemical properties of individual phases over the course of the degradation process. This makes it possible to study the evolution of such properties not only as a function of time but also as a function of the position within the matrix. The results obtained are in good agreement with experimental values of the elastic modulus in degraded and undegraded samples.

  12. Scheduling of a computer integrated manufacturing system: A simulation study

    Directory of Open Access Journals (Sweden)

    Nadia Bhuiyan

    2011-12-01

    Full Text Available Purpose: The purpose of this paper is to study the effect of selected scheduling dispatching rules on the performance of an actual CIM system using different performance measures, and to compare the results with the literature. Design/methodology/approach: To achieve this objective, a computer simulation model of the existing CIM system was developed to test the performance of different scheduling rules with respect to mean flow time, machine efficiency, and total run time as performance measures. Findings: Results suggest that the system performs much better with respect to machine efficiency when the initial number of parts released is maximal and the buffer size is minimal. Furthermore, considering the average flow time, the system performs much better when the selected dispatching rule is either Earliest Due Date (EDD) or Shortest Process Time (SPT), with a buffer size of five and eight parts initially released. Research limitations/implications: A limited number of factors and levels were considered for the experimental set-up; however, the flexibility of the model allows experimenting with additional factors and levels. In the simulation experiments of this research, three scheduling dispatching rules (First In/First Out (FIFO), EDD, SPT) were used. In future research, the effect of other dispatching rules on system performance can be compared, and some assumptions can be relaxed. Practical implications: This research helps to identify the potential effect of a selected number of dispatching rules and two other factors, the number of buffers and the initial number of parts released, on the performance of existing CIM systems with different part types where the machines are the major resource constraints. Originality/value: This research is among the few to study the effect of dispatching rules on the performance of CIM systems with the use of terminating simulation analysis. This is
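
    Dispatching rules of the kind compared here are simply orderings of the job queue. A toy single-machine sketch contrasting FIFO and SPT mean flow time (job data made up; the paper's CIM model is far richer):

    ```python
    def mean_flow_time(proc_times, rule):
        """Mean flow time on one machine with all jobs available at t = 0.

        FIFO keeps the arrival order; SPT sorts by processing time, which
        provably minimises mean flow time in this static single-machine case.
        """
        order = sorted(proc_times) if rule == "SPT" else list(proc_times)
        t = total = 0.0
        for p in order:
            t += p        # completion time of this job
            total += t    # with a common release time, flow time = completion time
        return total / len(order)

    jobs = [5.0, 2.0, 8.0, 1.0]          # made-up processing times
    print(mean_flow_time(jobs, "FIFO"))  # 10.75
    print(mean_flow_time(jobs, "SPT"))   # 7.0
    ```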

  13. Nonuniform Moving Boundary Method for Computational Fluid Dynamics Simulation of Intrathecal Cerebrospinal Flow Distribution in a Cynomolgus Monkey.

    Science.gov (United States)

    Khani, Mohammadreza; Xing, Tao; Gibbs, Christina; Oshinski, John N; Stewart, Gregory R; Zeller, Jillynne R; Martin, Bryn A

    2017-08-01

    A detailed quantification and understanding of cerebrospinal fluid (CSF) dynamics may improve detection and treatment of central nervous system (CNS) diseases and help optimize CSF system-based delivery of CNS therapeutics. This study presents a computational fluid dynamics (CFD) model that utilizes a nonuniform moving boundary approach to accurately reproduce the nonuniform distribution of CSF flow along the spinal subarachnoid space (SAS) of a single cynomolgus monkey. A magnetic resonance imaging (MRI) protocol was developed and applied to quantify subject-specific CSF space geometry and flow and define the CFD domain and boundary conditions. An algorithm was implemented to reproduce the axial distribution of unsteady CSF flow by nonuniform deformation of the dura surface. Results showed that the maximum difference between the MRI measurements and the CFD simulation of CSF flow rates was small, and CSF flow along the entire spine was laminar with a peak Reynolds number of ∼150 and an average Womersley number of ∼5.4. Maximum CSF flow rate was present at the C4-C5 vertebral level. Deformation of the dura ranged up to a maximum of 134 μm. Geometric analysis indicated that total spinal CSF space volume was ∼8.7 ml. Average hydraulic diameter, wetted perimeter, and SAS area were 2.9 mm, 37.3 mm, and 27.24 mm², respectively. CSF pulse wave velocity (PWV) along the spine was quantified to be 1.2 m/s.

  14. Simulations of an Optical Tactile Sensor Based on Computer Tomography

    Science.gov (United States)

    Ohka, Masahiro; Sawamoto, Yasuhiro; Zhu, Ning

    In order to create a robotic tactile sensor of thin shape, a new optical tactile sensor was developed by applying a CT (Computer Tomography) algorithm. The sensor comprises infrared-emitting diode arrays, receiving phototransistor arrays, a transparent acrylic plate, and a black rubber sheet with projections. Infrared rays emitted from the diode array are directed into one end of the plate and their intensity distribution is measured by the phototransistor array mounted on the other end. If the CT algorithm is applied directly to the tactile sensor, there are two shortcomings: the shape of the sensing area is limited to a circular region, and the calculation time is long. Thus, a new CT algorithm oriented to tactile sensing is proposed to overcome these problems. In the present algorithm, a square sensing area is divided into an N-by-N array and algebraic equations are derived from the relationship between the input and output light intensities along the assumed light projections. Several reconstruction methods were considered for obtaining the pressure values arising in the squares. In the present study, the ART (Algebraic Reconstruction Technique) and LU decomposition methods were employed and compared to select the best reconstruction method. In a series of simulations, it was found that the LU decomposition method held an advantage for the present type of tactile sensor because of its robustness against disturbance and its short calculation time.
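
    The ART variant referred to above is essentially the Kaczmarz iteration: each light path contributes one algebraic equation, and the pressure estimate is projected onto each equation in turn. A compact, generic sketch (not the authors' implementation):

    ```python
    import numpy as np

    def art(A, b, n_sweeps=50, relax=1.0):
        """Algebraic Reconstruction Technique (Kaczmarz sweeps) for A @ x = b.

        A : (m, n) system matrix, one row per light path across the N-by-N grid
        b : (m,) measured attenuations; returns the reconstructed pressure map x.
        """
        x = np.zeros(A.shape[1])
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                ai = A[i]
                denom = ai @ ai
                if denom > 0.0:
                    x += relax * (b[i] - ai @ x) / denom * ai
        return x

    # Tiny consistent test system: recovers x = [1, 2, 3].
    A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]])
    print(np.round(art(A, A @ np.array([1.0, 2.0, 3.0])), 3))
    ```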

  15. Experimental validation of a computer simulation of radiographic film

    Energy Technology Data Exchange (ETDEWEB)

    Goncalves, Elicardo A. de S., E-mail: elicardo.goncalves@ifrj.edu.br [Instituto Federal do Rio de Janeiro (IFRJ), Paracambi, RJ (Brazil). Laboratorio de Instrumentacao e Simulacao Computacional Cientificas Aplicadas; Azeredo, Raphaela, E-mail: raphaelaazeredo@yahoo.com.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica Armando Dias Tavares. Programa de Pos-Graduacao em Fisica; Assis, Joaquim T., E-mail: joaquim@iprj.uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Instituto Politecnico; Anjos, Marcelino J. dos; Oliveira, Davi F.; Oliveira, Luis F. de, E-mail: marcelin@uerj.br, E-mail: davi.oliveira@uerj.br, E-mail: lfolive@uerj.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Instituto de Fisica Armando Dias Tavares. Departamento de Fisica Aplicada e Termodinamica

    2015-07-01

    In radiographic films, the behavior of the characteristic curve is very important for image quality. Digitization and visualization are performed by light transmission, and the characteristic curve describes the behavior of optical density as a function of exposure. In a first approach, a Monte Carlo computer simulation was used to build a Hurter-Driffield curve from a stochastic model; the results showed the familiar shape, but some behaviors, such as the influence of silver grain size, were not expected. A real H and D curve was built by exposing films, developing them, and measuring the optical density. When comparing the model results with the real curve, fitting them and estimating some parameters, a difference in the high-exposure region shows a divergence between the models and the experimental data. Since the optical density is a function of the metallic silver generated by chemical development, direct proportionality was assumed, but the results suggest a limitation of this proportionality. In fact, when the optical density was replaced by another way to measure silver concentration, such as X-ray fluorescence, the new results agree with the models. Therefore, overexposed films can contain areas with different silver concentrations that cannot be seen, because the optical density measurement is limited. Mapping the silver concentration over the film area can be a solution to reveal these dark images, and X-ray fluorescence has been shown to be the best way to perform this new form of film digitization. (author)

  16. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  17. Computer simulation of ECG manifestations of left ventricular electrical remodeling.

    Science.gov (United States)

    Bacharova, Ljuba; Szathmary, Vavrinec; Potse, Mark; Mateasik, Anton

    2012-01-01

    An increased QRS voltage is considered to be specific for the electrocardiogram (ECG) diagnosis of left ventricular hypertrophy (LVH). However, the QRS-complex patterns in patients with LVH cover a broader spectrum: increased QRS voltage, prolonged QRS duration, left axis deviation, and left anterior fascicular block- and left bundle branch block-like patterns, as well as pseudo-normal QRS patterns. The classical interpretation of the QRS patterns in LVH relates these changes to increased left ventricular mass (LVM) per se, while tending to neglect the modified active and passive electrical properties of the myocardium. However, it has been well documented that both active and passive electrical properties in LVH are altered. Using computer simulations, we have shown that an increased LVM is not the only determinant of QRS complex changes in LVH, as these changes could also be produced without changing the left ventricular mass, implying that these QRS patterns can be present in patients before their LVM exceeds the arbitrary upper normal limits. Our results link the experimental evidence on electrical remodeling with clinical interpretation of ECG changes in patients with LVH and stress the necessity of a complex interpretation of the QRS patterns considering both spatial and nonspatial determinants in terms of the spatial angle theory. We assume that hypertrophic electrical remodeling in combination with changes in left ventricular size and shape explains the variety of ECG patterns as well as the discrepancies between ECG and left ventricular mass. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Three-dimensional analysis of talar trochlea morphology: Implications for subject-specific kinematics of the talocrural joint.

    Science.gov (United States)

    Nozaki, Shuhei; Watanabe, Kota; Katayose, Masaki

    2016-11-01

    Three-dimensional (3D) behavior of the talocrural joint is primarily determined by the articular surface morphology of the talar trochlea and tibiofibular mortise. However, morphological features of the anterior and posterior regions of the talar trochlea remain unclear. The objectives of this study were to evaluate the anterior and posterior radii of the medial and lateral talar trochlea and to estimate subject-specific kinematics of the talocrural joint. Fifty dry tali were scanned using computed tomography to create 3D bone models. Radii of curvature of the anterior and posterior regions at both the medial and lateral trochlea were calculated. Orientations of the dorsiflexion and plantarflexion axes, passing through the centers of the circles fitted to the anterior region of the medial and lateral trochlea and through the centers of the circles fitted to the posterior region of the medial and lateral trochlea, respectively, were evaluated. The anterior radius of the medial trochlea was significantly smaller than that of the lateral trochlea by a mean of 7.8 mm (P < 0.05), suggesting a consistent axis orientation during ankle dorsiflexion, whereas the bilaterally asymmetric shape of the posterior trochlea would induce opposite axial rotations among subjects during ankle plantarflexion; this knowledge would help physical therapists restore talocrural joint motions to an ideal state for patients with ankle injuries. Clin. Anat. 29:1066-1074, 2016. © 2016 Wiley Periodicals, Inc.

  19. Computers in engineering 1983; Proceedings of the International Conference and Exhibit, Chicago, IL, August 7-11, 1983. Volume 1 - Computer-aided design, manufacturing, and simulation

    Science.gov (United States)

    Cokonis, T. J.

    The papers presented in this volume provide examples of the impact of computers on present engineering practice and indicate some future trends in computer-aided design, manufacturing, and simulation. Topics discussed include computer-aided design of turbine cycle configuration, managing and development of engineering computer systems, computer-aided manufacturing with robots in the automotive industry, and computer-aided design/analysis techniques of composite materials in the cure phase. Papers are also presented on computer simulation of vehicular propulsion systems, the performance of a hydraulic system simulator in a CAD environment, and computer simulation of hovercraft heave dynamics and control.

  20. The relationship between porosity and specific surface in human cortical bone is subject specific.

    Science.gov (United States)

    Lerebours, C; Thomas, C D L; Clement, J G; Buenzli, P R; Pivonka, P

    2015-03-01

    A characteristic relationship for bone between bone volume fraction (BV/TV) and specific surface (BS/TV) has previously been proposed based on 2D histological measurements. This relationship has been suggested to be bone intrinsic, i.e., to not depend on bone type, bone site and health state. In these studies, only limited data comes from cortical bone. The aim of this paper was to investigate the relationship between BV/TV and BS/TV in human cortical bone using high-resolution micro-CT imaging, and its correlations with subject-specific biometric data such as height, weight, age and sex. Images from femoral cortical bone samples of the Melbourne Femur Collection were obtained using synchrotron radiation micro-CT (SPring-8, Japan). Sixteen bone samples from thirteen individuals were analysed, covering bone volume fraction values ranging from 0.20 to 1. Finally, morphological models of the tissue microstructure were developed to help explain the relationship between BV/TV and BS/TV. Our experimental findings indicate that the BV/TV vs BS/TV relationship is subject specific rather than intrinsic. Sex and pore density were statistically correlated with the individual curves. However, no correlation was found with body height, weight or age. Experimental cortical data points deviate from interpolating curves previously proposed in the literature. However, these curves are largely based on data points from trabecular bone samples. This finding challenges the universality of the curve: highly porous cortical bone is significantly different to trabecular bone of the same porosity. Finally, our morphological models suggest that changes in BV/TV within the same sample can be explained by an increase in pore area rather than in pore density. This is consistent with the proposed mechanisms of age-related endocortical bone loss. In addition, these morphological models highlight that the relationship between BV/TV and BS/TV is not linear at high BV/TV as suggested in the
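
    The morphological models invoked here can be made concrete by idealising cortical pores as parallel cylinders: with pore density ρ (pores per unit area) and pore radius r, BV/TV = 1 − ρπr² and BS/TV = 2πρr. Growing r at fixed ρ then traces one curve, and two subjects with different pore densities follow different curves. A sketch with assumed values:

    ```python
    import numpy as np

    def cortical_curve(pore_density, radii):
        """BV/TV and BS/TV for a parallel cylindrical-pore model of cortical bone."""
        radii = np.asarray(radii)
        bvtv = 1.0 - pore_density * np.pi * radii ** 2   # bone volume fraction
        bstv = 2.0 * np.pi * pore_density * radii        # specific surface
        return bvtv, bstv

    # Two hypothetical subjects differing only in pore density (pores per mm^2)
    # trace distinct BV/TV-BS/TV curves, consistent with a subject-specific
    # rather than universal relationship.
    r = np.linspace(0.01, 0.10, 20)  # pore radii, mm
    for rho in (5.0, 15.0):
        bvtv, bstv = cortical_curve(rho, r)
        print(rho, bvtv[-1], bstv[-1])
    ```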

  1. In-human subject-specific evaluation of a control-theoretic plasma volume regulation model.

    Science.gov (United States)

    Bighamian, Ramin; Kinsky, Michael; Kramer, George; Hahn, Jin-Oh

    2017-12-01

    The goal of this study was to conduct a subject-specific evaluation of a control-theoretic plasma volume regulation model in humans. We employed a set of clinical data collected from nine human subjects receiving fluid bolus with and without co-administration of an inotrope agent, including fluid infusion rate, plasma volume, and urine output. Once fitted to the data associated with each subject, the model accurately reproduced the fractional plasma volume change responses in all subjects: the error between actual versus model-reproduced fractional plasma volume change responses was only 1.4 ± 1.6% and 1.2 ± 0.3% of the average fractional plasma volume change responses in the absence and presence of inotrope co-administration. In addition, the model parameters determined by the subject-specific fitting assumed physiologically plausible values: (i) initial plasma volume was estimated to be 36 ± 11 mL/kg and 37 ± 10 mL/kg in the absence and presence of inotrope infusion, respectively, which was comparable to its actual counterpart of 37 ± 4 mL/kg and 43 ± 6 mL/kg; (ii) volume distribution ratio, specifying the ratio with which the inputted fluid is distributed in the intra- and extra-vascular spaces, was estimated to be 3.5 ± 2.4 and 1.9 ± 0.5 in the absence and presence of inotrope infusion, respectively, which accorded with the experimental observation that inotrope could enhance plasma volume expansion in response to fluid infusion. We concluded that the model was equipped with the ability to reproduce plasma volume response to fluid infusion in humans with physiologically plausible model parameters, and its validity may persist even under co-administration of inotropic agents. Copyright © 2017 Elsevier Ltd. All rights reserved.
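
    A volume-regulation model of this class can be sketched as a single balance equation for retained intravascular fluid, with the volume distribution ratio α splitting each infused increment between the intra- and extravascular spaces. This is a schematic reading of the model family, not the authors' exact equations; the retention rule 1/(1 + α) is an assumption for illustration.

    ```python
    def fractional_plasma_volume(infusion, urine, v0, alpha, dt=1.0):
        """Fractional plasma volume change under fluid infusion (schematic).

        infusion, urine : per-step fluid input and urine output rates, mL/min
        v0              : initial plasma volume, mL
        alpha           : volume distribution ratio; 1/(1 + alpha) of infused
                          fluid is assumed to remain intravascular
        """
        dv, trace = 0.0, []
        for u_in, u_out in zip(infusion, urine):
            dv += dt * (u_in / (1.0 + alpha) - u_out)  # net retained in plasma
            trace.append(dv / v0)                      # fractional volume change
        return trace

    # Toy 30-minute bolus: 20 mL/min in, 1 mL/min urine output, alpha = 2.
    print(fractional_plasma_volume([20.0] * 30, [1.0] * 30, v0=2500.0, alpha=2.0)[-1])
    ```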

  2. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVI

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  3. Computational Fluid Dynamics Simulation of Fluidized Bed Polymerization Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Rong [Iowa State Univ., Ames, IA (United States)

    2006-01-01

    Fluidized bed (FB) reactors are widely used in the polymerization industry due to their superior heat- and mass-transfer characteristics. Nevertheless, problems associated with local overheating of polymer particles and excessive agglomeration leading to FB reactor defluidization still persist and limit the range of operating temperatures that can be safely achieved in plant-scale reactors. Much work has been done on the modeling of FB polymerization reactors, and quite a few models are available in the open literature, such as the well-mixed model developed by McAuley, Talbot, and Harris (1994), the constant bubble size model (Choi and Ray, 1985) and the heterogeneous three-phase model (Fernandes and Lona, 2002). Most of these research works focus on the kinetic aspects, but from an industrial viewpoint, the behavior of FB reactors should be modeled by considering the particle and fluid dynamics in the reactor. Computational fluid dynamics (CFD) is a powerful tool for understanding the effect of fluid dynamics on chemical reactor performance. For single-phase flows, CFD models for turbulent reacting flows are now well understood and routinely applied to investigate complex flows with detailed chemistry. For multiphase flows, the state-of-the-art in CFD models is changing rapidly and it is now possible to predict reasonably well the flow characteristics of gas-solid FB reactors with mono-dispersed, non-cohesive solids. This thesis is organized into seven chapters. In Chapter 2, an overview of fluidized bed polymerization reactors is given, and a simplified two-site kinetic mechanism is discussed. Some basic theories used in our work are given in detail in Chapter 3. First, the governing equations and other constitutive equations for the multi-fluid model are summarized, and the kinetic theory for describing the solid stress tensor is discussed. The detailed derivation of DQMOM for the population balance equation is given as the second section. In this section

  4. Quality assurance for computed-tomography simulators and the computed-tomography-simulation process: report of the AAPM Radiation Therapy Committee Task Group No. 66.

    Science.gov (United States)

    Mutic, Sasa; Palta, Jatinder R; Butker, Elizabeth K; Das, Indra J; Huq, M Saiful; Loo, Leh-Nien Dick; Salter, Bill J; McCollough, Cynthia H; Van Dyk, Jacob

    2003-10-01

    This document presents recommendations of the American Association of Physicists in Medicine (AAPM) for quality assurance of computed-tomography- (CT) simulators and CT-simulation process. This report was prepared by Task Group No. 66 of the AAPM Radiation Therapy Committee. It was approved by the Radiation Therapy Committee and by the AAPM Science Council.

  5. Monte Carlo computer simulation of sedimentation of charged hard spherocylinders

    Energy Technology Data Exchange (ETDEWEB)

    Viveros-Méndez, P. X., E-mail: xviveros@fisica.uaz.edu.mx; Aranda-Espinoza, S. [Unidad Académica de Física, Universidad Autónoma de Zacatecas, Calzada Solidaridad esq. Paseo, La Bufa s/n, 98060 Zacatecas, Zacatecas, México (Mexico); Gil-Villegas, Alejandro [Departamento de Ingeniería Física, División de Ciencias e Ingenierías, Campus León, Universidad de Guanajuato, Loma del Bosque 103, Lomas del Campestre, 37150 León, Guanajuato, México (Mexico)

    2014-07-28

    In this article we present an NVT Monte Carlo computer simulation study of sedimentation of an electroneutral mixture of oppositely charged hard spherocylinders (CHSC) with aspect ratio L/σ = 5, where L and σ are the length and diameter of the cylinder and hemispherical caps, respectively, for each particle. This system is an extension of the restricted primitive model for spherical particles, where L/σ = 0, and it is assumed that the ions are immersed in a structureless solvent, i.e., a continuum with dielectric constant D. The system consisted of N = 2000 particles, and the Wolf method was implemented to handle the coulombic interactions of the inhomogeneous system. Results are presented for different values of the strength ratio between the gravitational and electrostatic interactions, Γ = (mgσ)/(e²/Dσ), where m is the mass per particle, e is the electron's charge and g is the gravitational acceleration. A semi-infinite simulation cell was used with dimensions L_x ≈ L_y and L_z = 5L_x, where L_x, L_y, and L_z are the box dimensions in Cartesian coordinates, and the gravitational force acts along the z-direction. Sedimentation effects were studied by looking at every layer formed by the CHSC along the gravitational field. By increasing Γ, particles tend to get more packed at each layer and to arrange in local domains with an orientational ordering along two perpendicular axes, a feature not observed in the uncharged system with the same hard-body geometry. This type of arrangement, known as a tetratic phase, has been observed in two-dimensional systems of hard rectangles and rounded hard squares. In this way, the coupling of gravitational and electric interactions in the CHSC system induces the arrangement of particles in layers, with the formation of quasi-two-dimensional tetratic phases near the surface.
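
    The coupling of gravity and electrostatics enters such a simulation only through the energy difference in the Metropolis acceptance test. The fragment below sketches a single vertical trial move with only the gravitational term, U_g/kT = Γ·(z/σ), written out; the electrostatic contribution (handled with the Wolf method in the paper) would simply add to dU. All values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    GAMMA = 1.0  # gravitational-to-electrostatic strength ratio (assumed value)

    def accept(dU_over_kT):
        """Standard Metropolis acceptance criterion."""
        return dU_over_kT <= 0.0 or rng.random() < np.exp(-dU_over_kT)

    # Reduced particle heights z/sigma in a tall box with a hard floor at z = 0.
    z = rng.uniform(0.0, 50.0, size=2000)

    # One trial vertical displacement of a randomly chosen particle.
    i = rng.integers(len(z))
    dz = rng.uniform(-0.5, 0.5)
    if z[i] + dz >= 0.0:          # respect the hard floor
        dU = GAMMA * dz           # gravitational energy change in units of kT
        if accept(dU):            # electrostatic dU omitted in this sketch
            z[i] += dz
    ```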

  6. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    Science.gov (United States)

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
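
    The core operation, representing each subject patch as a sparse combination of learned atlas patches, can be sketched with a generic sparse coder. The example below uses scikit-learn's sparse_encode as a stand-in; the paper's dictionary construction and tissue-membership computation are more involved, and all shapes and the random data here are hypothetical:

```python
import numpy as np
from sklearn.decomposition import sparse_encode

# Hypothetical shapes: 500 atlas patches (dictionary atoms) of 27 voxels (3x3x3),
# and 100 subject patches to encode.
atlas_patches = np.random.rand(500, 27)    # stands in for patches learned from the atlas
subject_patches = np.random.rand(100, 27)  # stands in for patches from the subject image

# Each subject patch is approximated as a sparse combination of atlas patches;
# the nonzero coefficients indicate which atlas patches (and hence which
# tissue labels) best explain the subject patch.
codes = sparse_encode(subject_patches, atlas_patches,
                      algorithm='omp', n_nonzero_coefs=5)
print(codes.shape)  # (100, 500), mostly zeros
```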

  7. The FachRef-Assistant: Personalised, subject specific, and transparent stock management

    Directory of Open Access Journals (Sweden)

    Eike T. Spielberg

    2017-07-01

    We present in this paper a personalized web application for the weeding of printed resources: the FachRef-Assistant. It offers an extensive range of tools for evidence-based stock management, based on the thorough analysis of usage statistics. Special attention is paid to criteria individualization, transparency of the parameters used, and generic functions. Currently, it is designed to work with the Aleph-System from ExLibris, but effort was spent to keep the application as generic as possible. For example, all procedures specific to the local library system have been collected in one Java package. The inclusion of library-specific properties such as collections and systematics has been designed to be highly generic as well, by mapping the individual entries onto an in-memory database. Hence, simple adaptation of the package and the mappings would render the FachRef-Assistant compatible with other library systems. The personalization of the application allows for the inclusion of subject-specific usage properties as well as of variations between different collections within one subject area. The parameter sets used to analyse the stock and to prepare weeding and purchase proposal lists are included in the output XML files to facilitate a high degree of transparency, objectivity and reproducibility.

  8. Subject Specific Sparse Dictionary Learning for Atlas Based Brain MRI Segmentation

    Science.gov (United States)

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S.; Prince, Jerry L.; Pham, Dzung L.

    2015-01-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available, state-of-the-art approaches. PMID:26340685

  9. A corpus-based lexical analysis of subject-specific university textbooks for English majors

    Directory of Open Access Journals (Sweden)

    Konul Hajiyeva

    2015-01-01

    This study is a corpus-based lexical analysis of subject-specific university textbooks which explores lexical text coverage and the frequency distribution of words from the Academic Word List and the British National Corpus frequency-based word families. For this study a 508,802-word corpus was created. The findings show that the Academic Word List word families provide only small coverage (6.5%) of the words in the entire corpus, whereas the first two thousand high-frequency word families provide coverage of 88.92%. In terms of text coverage, the results reveal that if 98% coverage of a text is needed for unassisted comprehension, then a vocabulary size of 9000 word families is required. The results also substantiate the claims that the Academic Word List is not as general an academic vocabulary as it was initially intended to be and, more importantly, support the assumption that students need a more restricted core academic vocabulary. It is therefore argued that the 127 academic word families which are relatively frequent in the overall university textbook corpus can be used as part of a university word list for second-year English majors who have to read and comprehend university textbooks.

  10. Obesity and Obesity Shape Markedly Influence Spine Biomechanics: A Subject-Specific Risk Assessment Model.

    Science.gov (United States)

    Ghezelbash, Farshid; Shirazi-Adl, Aboulfazl; Plamondon, André; Arjmand, Navid; Parnianpour, Mohamad

    2017-10-01

    Underlying mechanisms of obesity-related back pain remain unexplored. We therefore aim to determine the effect of obesity and its shape on spinal loads and the associated risks of injury. Obesity shapes were initially constructed by principal component analysis based on datasets of 5852 obese individuals. Spinal loads, cycles to vertebral failure and trunk stability margin were estimated in a subject-specific trunk model taking account of personalized musculature, passive ligamentous spine, obesity shapes, segmental weights, spine kinematics and bone mineral density. Three obesity shapes (mean and extreme abdominal circumferences) at three body weights (BWs) of 86, 98 and 109 kg were analyzed. Additional BW (12 kg) increased spinal loads by ~11.8%. Higher waist circumferences at identical BW increased spinal forces by an amount equivalent to ~20 kg of additional BW, and increased the risk of vertebral fatigue compression fracture by 3-7 times, when compared with smaller waist circumferences. Forward flexion, greater BW and load in hands increased the trunk stability margin. Spinal loads markedly increased with BW, especially at greater waist circumferences. The risk of vertebral fatigue fracture also substantially increased at greater waist circumferences, though not at smaller ones. Obesity and its shape should be considered in spine biomechanics.

  11. Subject-specific body segment parameter estimation using 3D photogrammetry with multiple cameras

    Science.gov (United States)

    Morris, Mark; Sellers, William I.

    2015-01-01

    Inertial properties of body segments, such as mass, centre of mass or moments of inertia, are important parameters when studying movements of the human body. However, these quantities are not directly measurable. Current approaches include regression models, which have limited accuracy; geometric models, which involve lengthy measuring procedures; or acquiring and post-processing MRI scans of participants. We propose a geometric methodology based on 3D photogrammetry using multiple cameras to provide subject-specific body segment parameters while minimizing the interaction time with the participants. A low-cost body scanner was built using multiple cameras, and 3D point cloud data were generated using structure-from-motion photogrammetric reconstruction algorithms. The point cloud was manually separated into body segments, and convex hulling was applied to each segment to produce the required geometric outlines. The accuracy of the method can be adjusted by choosing the number of subdivisions of the body segments. The body segment parameters of six participants (four male and two female) are presented using the proposed method. The multi-camera photogrammetric approach is expected to be particularly suited for studies including populations for which regression models are not available in the literature and where other geometric techniques or MRI scanning are not applicable due to time or ethical constraints. PMID:25780778
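
    The convex-hulling step maps directly onto standard computational-geometry tooling. The sketch below estimates a segment's volume, mass and a crude centroid from a point cloud under a uniform-density assumption; the density value, the random point data and the single-hull simplification are illustrative, and the paper additionally subdivides segments to control accuracy:

```python
import numpy as np
from scipy.spatial import ConvexHull

segment_points = np.random.rand(2000, 3)   # stand-in for one body segment's 3D points (metres)
hull = ConvexHull(segment_points)

density = 1050.0                           # kg/m^3, a typical soft-tissue assumption
volume = hull.volume                       # hull volume in m^3
mass = density * volume
centroid = segment_points[hull.vertices].mean(axis=0)  # crude estimate from hull vertices
print(mass, centroid)
```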

  12. Subject-specific increases in serum S-100B distinguish sports-related concussion from sports-related exertion

    National Research Council Canada - National Science Library

    Kiechle, Karin; Bazarian, Jeffrey J; Merchant-Borna, Kian; Stoecklein, Veit; Rozen, Eric; Blyth, Brian; Huang, Jason H; Dayawansa, Samantha; Kanz, Karl; Biberthaler, Peter

    2014-01-01

    .... To compare subject-specific changes in the astroglial protein, S100B, before and after SRC among collegiate and semi-professional contact sport athletes, and compare these changes to differences...

  13. Professors' and students' perceptions and experiences of computational simulations as learning tools

    Science.gov (United States)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indication of which instructors should be selected for Study Two. Study Two used a phenomenographic research design resulting in a two-dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  14. DYNAMICS OF DETECTED FIRE FACTORS IN CLOSED COMPARTMENT: COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    V. V. Nevdakh

    2015-01-01

    Computer simulation of the initial fire stages in a closed compartment with a volume of ≈ 60 m³, with a burner on the floor and 2 m above the floor, has been carried out using FDS software. Fires with different t²-power-law heat release rates have been modeled. Fires whose growth times to reach 1055 kW were 100 s and 500 s were considered fast and slow fires, respectively. The dynamics of heat release rates and of detected fire factors such as spatial distributions of air temperature, smoke obscuration and variations of indoor pressure have been studied. It was found that the dynamics of heat release rates in the initial fire stages in a closed compartment consists of two stages. During the first stage the heat release rate is proportional to the mass burning rate and flaming occurs only above the burner. At the second stage the dynamics of heat release rates takes the form of pulsations irregular in amplitude and duration, which are caused by self-ignition in the smoke layer. It is shown that the compartment air volume may become stratified with height, each layer having its own temperature, smoke obscuration and self-ignition areas. It is concluded that the layer thickness and the gradients of temperature and obscuration depend on the fire growth rate and on the burner height above the floor. The spatial distributions of air temperature and of pressure variation were found to have opposite gradients with height. The maximal pressure variation and its gradient occur for the fast fire with the burner on the floor.
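
    For reference, a t²-fire prescribes Q(t) = α t² up to a cap, so the record's growth times fix the growth coefficients directly. A small worked example under that standard definition (the 60 s evaluation time is arbitrary):

```python
# t^2 fire growth: Q(t) = alpha * t^2, with alpha fixed by the time t_g
# at which the fire reaches a reference heat release rate Q_ref.
Q_REF_KW = 1055.0

for label, t_g in (("fast", 100.0), ("slow", 500.0)):
    alpha = Q_REF_KW / t_g**2          # growth coefficient, kW/s^2
    q_at_60s = alpha * 60.0**2         # HRR one minute after ignition
    print(f"{label}: alpha = {alpha:.4f} kW/s^2, Q(60 s) = {q_at_60s:.1f} kW")
# -> fast: alpha = 0.1055 kW/s^2, Q(60 s) = 379.8 kW
# -> slow: alpha = 0.0042 kW/s^2, Q(60 s) = 15.2 kW
```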

  15. The Effect of Computer Simulations on Acquisition of Knowledge and Cognitive Load: A Gender Perspective

    Science.gov (United States)

    Kaheru, Sam J.; Kriek, Jeanne

    2016-01-01

    A study on the effect of the use of computer simulations (CS) on the acquisition of knowledge and cognitive load was undertaken with 104 Grade 11 learners in four schools in rural South Africa on the physics topic geometrical optics. Owing to the lack of resources a teacher-centred approach was followed in the use of computer simulations. The…

  16. An Evaluation of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    Science.gov (United States)

    Neumann, David L.; Hood, Michelle; Neumann, Michelle M.

    2012-01-01

    In a previous report, Neumann (2010) described the use of interactive computer-based simulations in the assessment of statistical concepts. This assessment approach combined declarative knowledge of statistics with experiences in interacting with computer-based simulations. The aim of the present study was to conduct a systematic evaluation of the…

  17. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    Science.gov (United States)

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  18. Introducing Molecular Life Science Students to Model Building Using Computer Simulations

    Science.gov (United States)

    Aegerter-Wilmsen, Tinri; Kettenis, Dik; Sessink, Olivier; Hartog, Rob; Bisseling, Ton; Janssen, Fred

    2006-01-01

    Computer simulations can facilitate the building of models of natural phenomena in research, such as in the molecular life sciences. In order to introduce molecular life science students to the use of computer simulations for model building, a digital case was developed in which students build a model of a pattern formation process in…

  19. Possibilities and importance of using computer games and simulations in educational process

    OpenAIRE

    Danilović Mirčeta S.

    2003-01-01

    The paper discusses if it is possible and appropriate to use simulations (simulation games) and traditional games in the process of education. It is stressed that the terms "game" and "simulation" can and should be taken in a broader sense, although they are chiefly investigated herein as video-computer games and simulations. Any activity combining the properties of game (competition, rules, players) and the properties of simulation (i.e. operational presentation of reality) should be underst...

  20. Interferences and events on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  1. Classical and quantum computing with C++ and Java simulations

    CERN Document Server

    Hardy, Y

    2001-01-01

    Classical and Quantum computing provides a self-contained, systematic and comprehensive introduction to all the subjects and techniques important in scientific computing. The style and presentation are readily accessible to undergraduates and graduates. A large number of examples, accompanied by complete C++ and Java code wherever possible, cover every topic. Features and benefits: - Comprehensive coverage of the theory with many examples - Topics in classical computing include boolean algebra, gates, circuits, latches, error detection and correction, neural networks, Turing machines, cryptography, genetic algorithms - For the first time, genetic expression programming is presented in a textbook - Topics in quantum computing include mathematical foundations, quantum algorithms, quantum information theory, hardware used in quantum computing This book serves as a textbook for courses in scientific computing and is also very suitable for self-study. Students, professionals and practitioners in computer...

  2. Computer simulations as tools for teaching and learning: Using a simulation environment in optics

    Science.gov (United States)

    Eylon, Bat-Sheva; Ronen, Miky; Ganiel, Uri

    1996-06-01

    RAY is a learning environment that includes a flexible ray tracing simulation, graphic tools, and task authoring facilities. This study explores RAY's potential to improve optics learning in high school. In study 1, the teacher used RAY as a "smart blackboard" with a single computer in the classroom to explore, explain, and predict optical phenomena; to introduce concepts; to interpret experiments and to represent theoretical exercises. A comparative study shows a significant effect on the spontaneous and correct use of the model by students in solving problems and a limited effect on conceptual understanding. In study 2, students, guided by written materials, used the simulation individually. Students considered in a systematic manner the relationship between image formation and image observation—a major conceptual stumbling block. They reflected on the problem-solving activity and explicitly reformulated their knowledge in the domain. Case studies describe the interplay between the various aspects of the learning process in the development of conceptual understanding. A comparative study shows the importance of three factors to students' understanding of concepts and their ability to use the ray model: the computerized environment (versus written instruction of a similar kind); a task design that addresses directly conceptual difficulties; and the explicit reformulation of ideas.

  3. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.
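
    A minimal illustration of the staged, time-stamped workflow idea (far simpler than NeuroManager's 22-stage MATLAB implementation, and with hypothetical stage names) might look like this:

```python
import time
from typing import Callable, Dict, List

def run_workflow(stages: List[Callable[[Dict], None]], job: Dict) -> None:
    """Run each stage in order, recording a time-stamped log entry per stage."""
    for stage in stages:
        t0 = time.time()
        stage(job)
        job.setdefault("log", []).append(
            (stage.__name__, time.strftime("%Y-%m-%d %H:%M:%S"), time.time() - t0))

# Hypothetical stages standing in for a real submission pipeline.
def prepare_inputs(job): job["inputs_ready"] = True
def submit_to_resource(job): job["submitted"] = True
def collect_results(job): job["results"] = "..."

job = {"name": "example-simulation"}
run_workflow([prepare_inputs, submit_to_resource, collect_results], job)
print(job["log"])
```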

  4. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  5. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    OpenAIRE

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source three additional factors are varied, namely the velocity, details of the computer simulated person, and the CFD model of the wind channel. The personal exposure is found to be highly dependent on the r...

  6. Computer modeling and simulators as part of university training for NPP operating personnel

    Science.gov (United States)

    Volman, M.

    2017-01-01

    This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to simulate neutron-physical reactor measurements and the start-up and shutdown processes.
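
    As an illustration of the kind of reactor-kinetics model such numerical experiments involve, the textbook one-delayed-group point-kinetics equations can be integrated in a few lines (a generic sketch with illustrative constants, not the course's actual Mathcad worksheets):

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics:
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C
#   dC/dt = (beta/Lambda) * n - lam * C
beta, lam, Lambda = 0.0065, 0.08, 1e-4   # illustrative kinetic constants
rho = 0.001                               # assumed small positive step reactivity

def rhs(t, y):
    n, C = y
    return [((rho - beta) / Lambda) * n + lam * C,
            (beta / Lambda) * n - lam * C]

y0 = [1.0, beta / (lam * Lambda)]         # start from equilibrium precursor level
sol = solve_ivp(rhs, (0.0, 10.0), y0, max_step=0.01)
print(sol.y[0][-1])                       # relative power after 10 s
```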

  7. Heuristics for the Buffer Allocation Problem with Collision Probability Using Computer Simulation

    National Research Council Canada - National Science Library

    Chiba, Eishi

    2015-01-01

    .... In this in-line system, we present a computer simulation method for the computation of the probability of a collision occurring. Based on this method, we try to find a buffer allocation that achieves the smallest total number of buffers under an arbitrarily specified collision probability. We also implement our proposed method and present some computational results.

  8. Personal and Simulated Computer-Aided Counseling: Perceived versus Measured Counseling Outcomes for College Students.

    Science.gov (United States)

    Fernandez, Eileen; And Others

    1986-01-01

    Examined computer-aided counseling, using a simulated computer-aided model of cognitive counseling and clients' perceived outcomes. Results indicated a group that received counseling using computers viewed their experience as less effective than did a group counseled personally; however, no differences were found on outcome measures. (Author/BL)

  9. BeeSim: Leveraging Wearable Computers in Participatory Simulations with Young Children

    Science.gov (United States)

    Peppler, Kylie; Danish, Joshua; Zaitlen, Benjamin; Glosson, Diane; Jacobs, Alexander; Phelps, David

    2010-01-01

    New technologies have enabled students to become active participants in computational simulations of dynamic and complex systems (called Participatory Simulations), providing a "first-person"perspective on complex systems. However, most existing Participatory Simulations have targeted older children, teens, and adults assuming that such concepts…

  10. Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics

    Science.gov (United States)

    Rutten, Nico; van der Veen, Jan T.; van Joolingen, Wouter R.

    2015-01-01

    In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed…

  11. Math modeling and computer mechanization for real time simulation of rotary-wing aircraft

    Science.gov (United States)

    Howe, R. M.

    1979-01-01

    Mathematical modeling and computer mechanization for real-time simulation of rotary-wing aircraft are discussed. Error analysis in the digital simulation of dynamic systems, such as rotary-wing aircraft, is described. The method for digital simulation of nonlinearities with discontinuities, such as exist in typical flight control systems and rotor blade hinges, is discussed.

  12. The Effects of 3D Computer Simulation on Biology Students' Achievement and Memory Retention

    Science.gov (United States)

    Elangovan, Tavasuria; Ismail, Zurida

    2014-01-01

    A quasi experimental study was conducted for six weeks to determine the effectiveness of two different 3D computer simulation based teaching methods, that is, realistic simulation and non-realistic simulation on Form Four Biology students' achievement and memory retention in Perak, Malaysia. A sample of 136 Form Four Biology students in Perak,…

  13. The Effects of Computer-Simulation Game Training on Participants' Opinions on Leadership Styles

    Science.gov (United States)

    Siewiorek, Anna; Gegenfurtner, Andreas; Lainema, Timo; Saarinen, Eeli; Lehtinen, Erno

    2013-01-01

    The objective of this study is to elucidate new information on the possibility of leadership training through business computer-simulation gaming in a virtual working context. In the study, a business-simulation gaming session was organised for graduate students ("n"?=?26). The participants played the simulation game in virtual teams…

  14. Integrated Computational System for Electrochemical Device Design and Simulation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Illinois Rocstar LLC proposes to develop and demonstrate the use of an integrated computational environment and infrastructure for electrochemical device design and...

  15. Simulation model of load balancing in distributed computing systems

    Science.gov (United States)

    Botygin, I. A.; Popov, V. N.; Frolov, S. G.

    2017-02-01

    The availability of high-performance computing, high-speed data transfer over the network and the widespread use of software for design and pre-production in mechanical engineering mean that, at present, large industrial enterprises and small engineering companies alike implement complex computer systems for the efficient solution of production and management tasks. Such computer systems are generally built on the basis of distributed heterogeneous computer systems. The analytical problems solved by such systems are the key models of research, but the system-wide problems of efficient distribution (balancing) of the computational load and of the placement of input, intermediate and output databases are no less important. The main tasks of this balancing system are load and condition monitoring of compute nodes, and the selection of a node to which a user's request is forwarded in accordance with a predetermined algorithm. Load balancing is one of the most widely used methods of increasing the productivity of distributed computing systems through the optimal allocation of tasks between the computer system nodes. Therefore, the development of methods and algorithms for computing an optimal schedule in a distributed system that dynamically changes its infrastructure is an important task.
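
    The monitoring-plus-selection loop described above reduces, in its simplest form, to dispatching each request to the least-loaded available node. The sketch below is a generic illustration of that policy; the load metric and node fields are assumptions, not the paper's algorithm:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Node:
    name: str
    cpu_load: float       # e.g. fraction of CPU in use, reported by monitoring
    available: bool = True

def select_node(nodes: List[Node]) -> Node:
    """Pick the least-loaded node that is currently available."""
    candidates = [n for n in nodes if n.available]
    if not candidates:
        raise RuntimeError("no available compute nodes")
    return min(candidates, key=lambda n: n.cpu_load)

nodes = [Node("n1", 0.72), Node("n2", 0.31), Node("n3", 0.55, available=False)]
print(select_node(nodes).name)   # -> "n2"
```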

  16. Simulating quantum systems on classical computers with matrix product states

    Energy Technology Data Exchange (ETDEWEB)

    Kleine, Adrian

    2010-11-08

    In this thesis, the numerical simulation of strongly-interacting many-body quantum-mechanical systems using matrix product states (MPS) is considered. Matrix-Product-States are a novel representation of arbitrary quantum many-body states. Using quantum information theory, it is possible to show that Matrix-Product-States provide a polynomial-sized representation of one-dimensional quantum systems, thus allowing an efficient simulation of one-dimensional quantum system on classical computers. Matrix-Product-States form the conceptual framework of the density-matrix renormalization group (DMRG). After a general introduction in the first chapter of this thesis, the second chapter deals with Matrix-Product-States, focusing on the development of fast and stable algorithms. To obtain algorithms to efficiently calculate ground states, the density-matrix renormalization group is reformulated using the Matrix-Product-States framework. Further, time-dependent problems are considered. Two different algorithms are presented, one based on a Trotter decomposition of the time-evolution operator, the other one on Krylov subspaces. Finally, the evaluation of dynamical spectral functions is discussed, and a correction vector-based method is presented. In the following chapters, the methods presented in the second chapter, are applied to a number of different physical problems. The third chapter deals with the existence of chiral phases in isotropic one-dimensional quantum spin systems. A preceding analytical study based on a mean-field approach indicated the possible existence of those phases in an isotropic Heisenberg model with a frustrating zig-zag interaction and a magnetic field. In this thesis, the existence of the chiral phases is shown numerically by using Matrix-Product-States-based algorithms. In the fourth chapter, we propose an experiment using ultracold atomic gases in optical lattices, which allows a well controlled observation of the spin-charge separation (of
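
    As background, the matrix product state representation referred to throughout writes a state of N sites with local basis states |s_i⟩ in the standard form (textbook notation, not specific to this thesis):

```latex
|\psi\rangle \;=\; \sum_{s_1,\dots,s_N} A^{s_1} A^{s_2} \cdots A^{s_N} \, |s_1, s_2, \dots, s_N\rangle
```

    where each A^{s_i} is a matrix (a row or column vector at the boundaries for open boundary conditions); the maximal matrix dimension, the bond dimension, controls both the accuracy of the representation and the computational cost.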

  17. Performance evaluation of telecommunication systems using computer simulation

    Science.gov (United States)

    Tranter, W. H.

    1979-01-01

    The SYSTID time domain digital simulation technique is reviewed. A technique is illustrated which estimates the signal to noise ratio at a point in the simulation of a communication system. Signals having a lowpass or bandpass spectra are utilized. Simulation results show the technique to be accurate over a wide range of signal to noise ratios. Examples which illustrate application to both analog and digital communication systems are provided.

  18. Computer simulation of electronic excitation in atomic collision cascades

    Energy Technology Data Exchange (ETDEWEB)

    Duvenbeck, A.

    2007-04-05

    The impact of a keV atomic particle onto a solid surface initiates a complex sequence of collisions among target atoms in a near-surface region. The temporal and spatial evolution of this atomic collision cascade leads to the emission of particles from the surface - a process usually called sputtering. In modern surface analysis the so-called SIMS technology uses the flux of sputtered particles as a source of information on the microscopic stoichiometric structure in the proximity of the bombarded surface spots. By laterally varying the bombarding spot on the surface, the entire target can be scanned and chemically analyzed. However, the particle detection, which is based upon deflection in electric fields, is limited to those species that leave the surface in an ionized state. Due to the fact that the ionized fraction of the total flux of sputtered atoms often only amounts to a few percent or even less, the detection is often hampered by rather low signals. Moreover, it is well known that the ionization probability of emitted particles depends not only on the elemental species, but also on the local environment from which a particle leaves the surface. Therefore, the measured signals for different sputtered species do not necessarily represent the stoichiometric composition of the sample. In the literature, this phenomenon is known as the Matrix Effect in SIMS. In order to circumvent this principal shortcoming of SIMS, the present thesis develops an alternative computer simulation concept, which treats the electronic energy losses of all moving atoms as excitation sources feeding energy into the electronic sub-system of the solid. The particle kinetics determining the excitation sources are delivered by classical molecular dynamics. The excitation energy calculations are combined with a diffusive transport model to describe the spread of excitation energy from the initial point of generation. Calculation results yield a space- and time-resolved excitation

  19. Simulating an aerospace multiprocessor. [for space guidance computers

    Science.gov (United States)

    Mallach, E. G.

    1976-01-01

    The paper describes a simulator which was used to evaluate the architecture of an aerospace multiprocessor. The simulator models interactions among the processors, memories, the central data bus, and a possible 'job stack'. Special features of the simulator are discussed, including the use of explicitly coded and individually distinguishable 'job models' instead of a statistically defined 'job mix' and a specialized Job Model Definition Language to automate the detailed coding of the models. Some results are presented which show that when the simulator was employed in conjunction with queuing theory and Markov-process analysis, more insight into system behavior was obtained than would have been with any one technique alone.

  20. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved to an increasingly reliable and accessible technique and are today implemented in many areas of biomedical sciences. We present a generally applicable method to study dehydration of hydrates based on MD simulations and apply this approach to the de...

  1. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    Science.gov (United States)

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-10-01

    In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. A commonly implemented attenuation correction (AC) method for brain imaging is the single-atlas approach, which estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram-normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone was established using a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by assigning a fixed linear attenuation coefficient (LAC) to air and to soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE-derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue-only AC map and, finally, the CT-derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm, corresponding to the PET scanner's intrinsic resolution. As expected, TOF reduces AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR-derived AC methods, reducing the quantification error between the MRAC-corrected PET image and the reference CTAC-corrected PET image.
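
    The continuous bone-mapping step can be sketched schematically as below. The segmentation thresholds, the ZTE-to-HU slope and intercept, and the HU-to-LAC conversion are all placeholders rather than the values fitted from the paper's 16-patient database:

```python
import numpy as np

AIR_T, SOFT_T = 0.2, 0.75        # normalized-intensity thresholds (assumed):
                                 # air < AIR_T, bone in [AIR_T, SOFT_T], soft tissue above
BONE_SLOPE, BONE_ICPT = -2000.0, 2000.0  # linear ZTE->HU relation in bone (placeholder fit)
MU_WATER = 0.096                 # LAC of water at 511 keV, cm^-1
MU_PER_HU_BONE = 6.0e-5          # extra LAC per HU above water (illustrative)

def zte_to_mumap(zte_norm):
    """Build a continuous attenuation map from a histogram-normalized ZTE image."""
    mu = np.full_like(zte_norm, MU_WATER)            # soft-tissue default
    mu[zte_norm < AIR_T] = 0.0                       # air
    bone = (zte_norm >= AIR_T) & (zte_norm <= SOFT_T)
    hu_bone = BONE_SLOPE * zte_norm[bone] + BONE_ICPT  # continuous bone HU values
    mu[bone] = MU_WATER + MU_PER_HU_BONE * np.clip(hu_bone, 0.0, None)
    return mu

mu_map = zte_to_mumap(np.random.rand(64, 64, 64).astype(np.float32))
```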

  2. Prediction of the structural response of the femoral shaft under dynamic loading using subject-specific finite element models.

    Science.gov (United States)

    Park, Gwansik; Kim, Taewung; Forman, Jason; Panzer, Matthew B; Crandall, Jeff R

    2017-08-01

    The goal of this study was to predict the structural response of the femoral shaft under dynamic loading conditions using subject-specific finite element (SS-FE) models and to evaluate the prediction accuracy of the models in relation to the model complexity. In total, SS-FE models of 31 femur specimens were developed. Using those models, dynamic three-point bending and combined loading tests (bending with four different levels of axial compression) of bare femurs were simulated, and the prediction capabilities of five different levels of model complexity were evaluated based on the impact force time histories: baseline, mass-based scaled, structure-based scaled, geometric SS-FE, and heterogenized SS-FE models. Among the five levels of model complexity, the geometric SS-FE and the heterogenized SS-FE models showed statistically significant improvement in response prediction capability compared to the other model formulations, whereas the difference between the two SS-FE models was negligible. This result indicated that the geometric SS-FE models, containing detailed geometric information from CT images with homogeneous linear isotropic elastic material properties, would be an optimal model complexity for prediction of the structural response of the femoral shafts under dynamic loading conditions. The average and standard deviation of the RMS errors of the geometric SS-FE models for all 31 cases were 0.46 kN and 0.66 kN, respectively. This study highlights the contribution of geometric variability to the structural response variation of the femoral shafts subjected to dynamic loading conditions and the potential of geometric SS-FE models to capture that variation.

  3. Evaluation of SHABERTH: A bearing simulation computer program

    Science.gov (United States)

    1978-01-01

    To investigate lubrication effects on bearing thermal performance, a study was performed to determine the feasibility of using the SKF program SHABERTH for simulating the performance of cryogenically lubricated ball bearings. As a part of this study, the particular application chosen for SHABERTH was to simulate the performance of the Space Shuttle main engine turbo-pump and pre-burner bearing system.

  4. Computer simulation of metal-on-metal epitaxy

    NARCIS (Netherlands)

    Breeman, M; Barkema, G.T.; Langelaar, M.H; Boerma, D.O

    1996-01-01

    Atom-embedding calculations and Monte Carlo simulations are presented on various topics which are important for a better understanding of homoepitaxial growth of metals. Results of realistic simulations of the homoepitaxial growth on Cu(100) are presented for the submonolayer regime as well as for

  5. Computer simulation of white pine blister rust epidemics

    Science.gov (United States)

    Geral I. McDonald; Raymond J. Hoff; William R. Wykoff

    1981-01-01

    A simulation of white pine blister rust is described in both word and mathematical models. The objective of this first generation simulation was to organize and analyze the available epidemiological knowledge to produce a foundation for integrated management of this destructive rust of 5-needle pines. Verification procedures and additional research needs are also...

  6. Development of Integrated Science Subject Specific Pedagogy (SSP) to Improve Student Learning Outcomes

    Directory of Open Access Journals (Sweden)

    Fitri Yuliawati

    2016-05-01

    The background of this research is an analysis of interviews conducted at several junior secondary schools in Yogyakarta, which concluded that teachers do not use integrated science teaching and still find it difficult to apply science learning in an integrated manner. Several factors may contribute to this: the lack of references that teachers can use to present integrated science material in a relevant way, and the fact that most science teachers come from backgrounds in chemistry, physics, or biology rather than science education, so they find it difficult to create integrated science lessons. In addition, teachers find it difficult to determine the depth of the material and the limits of integration in integrated science teaching, and are unfamiliar with the concept of integrated science teaching itself. This is development research. The learning tools developed include student books, lesson plans, student activity sheets, and evaluation tools. Development followed the 4D model, which comprises Define (at this stage a needs analysis was conducted), Design (the stage at which the Subject Specific Pedagogy (SSP) package is designed), and Develop (the stage, after the draft was produced, of validating the learning tools with experts; this stage also gathers responses, reactions, and comments from teachers, students, and observers, which can be used to further improve science teaching later). The Disseminate stage, consisting of wide field testing, was not carried out. Data collection instruments used in this study included test items and a questionnaire. The conclusions of this development research are as follows: the validation of the integrated science SSP by learning tools experts, subject-matter experts, and media experts indicates a category of Very Good (SB), so that SSP integrated

  7. A subject-specific framework for in vivo myeloarchitectonic analysis using high resolution quantitative MRI.

    Science.gov (United States)

    Waehnert, Miriam D; Dinse, Juliane; Schäfer, Andreas; Geyer, Stefan; Bazin, Pierre-Louis; Turner, Robert; Tardif, Christine Lucas

    2016-01-15

    Structural magnetic resonance imaging can now resolve laminar features within the cerebral cortex in vivo. A variety of intracortical contrasts have been used to study the cortical myeloarchitecture with the purpose of mapping cortical areas in individual subjects. In this article, we first briefly review recent advances in MRI analysis of cortical microstructure to portray the potential and limitations of the current state-of-the-art. We then present an integrated framework for the analysis of intracortical structure, composed of novel image processing tools designed for high resolution cortical images. The main features of our framework are the segmentation of quantitative T1 maps to delineate the cortical boundaries (Bazin et al., 2014), and the use of an equivolume layering model to define an intracortical coordinate system that follows the anatomical layers of the cortex (Waehnert et al., 2014). We evaluate the framework with 150 μm isotropic post mortem T2*-weighted images and 0.5 mm isotropic in vivo T1 maps, a quantitative index of myelin content. We study the laminar structure of the primary visual cortex (Brodmann area 17) in the post mortem and in vivo data, as well as the central sulcus region in vivo, in particular Brodmann areas 1, 3b and 4. We also investigate the impact of the layering models on the relationship between T1 and cortical curvature. Our experiments demonstrate that the equivolume intracortical surfaces and transcortical profiles best reflect the laminar structure of the cortex in areas of curvature in comparison to the state-of-the-art equidistant and Laplace implementations. This framework generates a subject-specific intracortical coordinate system, the basis for subsequent architectonic analyses of the cortex. Any structural or functional contrast co-registered to the T1 maps, used to segment the cortex, can be sampled on the curved grid for analysis. This work represents an important step towards in vivo structural brain mapping

  8. Efficiency using computer simulation of Reverse Threshold Model Theory on assessing a “One Laptop Per Child” computer versus desktop computer

    OpenAIRE

    Supat Faarungsang; Sasithon Nakthong

    2017-01-01

    Reverse Threshold Model Theory (RTMT) was introduced based on limiting-factor concepts, but its efficiency compared to the Conventional Model (CM) has not been published. This investigation assessed the efficiency of RTMT compared to CM using computer simulation on the "One Laptop Per Child" computer and a desktop computer. Based on probability values, it was found that RTMT was more efficient than CM among eight treatment combinations and an earlier study verified that RTMT gives c...

  9. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhixin Li

    2017-01-01

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large-scale computing to achieve higher acceleration.

  10. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    Science.gov (United States)

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
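
    The MapReduce decomposition described above can be caricatured in a few lines: map tasks simulate echoes for disjoint pulse blocks, and the reduce step accumulates them into the raw data matrix. The sketch below is a local stand-in for the Hadoop pipeline; the data dimensions, block sizes and toy echo model are placeholders:

```python
import numpy as np
from functools import reduce

N_PULSES, N_RANGE = 1024, 2048   # placeholder raw-data dimensions

def map_simulate_block(pulse_ids):
    """Map task: simulate raw echoes for one block of pulses (toy echo model)."""
    block = np.zeros((N_PULSES, N_RANGE), dtype=np.complex64)
    for p in pulse_ids:
        phase = 2 * np.pi * p * np.arange(N_RANGE) / N_RANGE  # stand-in for target returns
        block[p, :] = np.exp(1j * phase).astype(np.complex64)
    return block

def reduce_accumulate(a, b):
    """Reduce step: accumulate partial raw-data contributions."""
    return a + b

blocks = [range(i, i + 256) for i in range(0, N_PULSES, 256)]   # 4 "map" partitions
raw = reduce(reduce_accumulate, (map_simulate_block(b) for b in blocks))
print(raw.shape)
```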

  11. Computer Simulation and Operating Characteristics of a Three-Phase Brushless Synchronous Generator

    OpenAIRE

    Cingoski, Vlatko; Mikami, Mitsuru; Inoue, Kenji; Kaneda, Kazufumi; Yamashita, Hideo

    1998-01-01

    This paper deals with numerical computation and simulation of the operating conditions of a three-phase brushless synchronous generator. A voltage driven nonlinear time-periodic finite element analysis is utilized to compute accurately the magnetic field distribution and the induced voltage and currents. The computation procedure is briefly addressed followed by the computed results and their comparison with experimental ones. The agreement between results is very good verifying the computati...

  12. Computer-simulated fluid dynamics of arterial perfusion in extracorporeal circulation: From reality to virtual simulation.

    Science.gov (United States)

    Fukuda, Ikuo; Osanai, Satoshi; Shirota, Minori; Inamura, Takao; Yanaoka, Hideki; Minakawa, Masahito; Fukui, Kozo

    2009-06-01

    Atheroembolism due to aortic manipulation remains an unsolved problem in surgery for thoracic aortic aneurysm. The goal of the present study is to create a computer simulation (CS) model with which to analyze blood flow in the diseased aorta. A three-dimensional glass model of the aortic arch was constructed from CT images of a normal, healthy person and a patient with transverse aortic arch aneurysm. Separately, a CS model of the curved end-hole cannula was created, and flow from the aortic cannula was recreated using a numerical simulation. Comparison of the data obtained by the glass model analyses revealed that the flow velocity and the vector of the flow around the exit of the cannula were similar to those in the CS model. A high-velocity area was observed around the cannula exit in both the glass model and the CS model. The maximum flow velocity was as large as 1.0 m/s at 20 mm from the cannula exit and remained as large as 0.5 to 0.6 m/s within 50 mm of the exit. In the aortic arch aneurysm models, the rapid jet flow from the cannula moved straight toward the lesser curvature of the transverse aortic arch. The locations and intensities of the calculated vortices were slightly different from those obtained for the glass model. The proposed CS method for the analysis of blood flow from the aortic cannulae during extracorporeal circulation can reproduce the flow velocity and flow pattern in the proximal and transverse aortic arches.

  13. A computational method to model radar return range in a polygonally based, computer-generated-imagery simulation

    Science.gov (United States)

    Moran, F. J.; Phillips, J. D.

    1986-01-01

    Described is a method for modeling a ground-mapping radar system for use in simulations where the terrain is in a polygonal form commonly used with computer generated imagery (CGI). The method employs a unique approach for rapidly rejecting polygons not visible to the radar to facilitate the real-time simulation of the radar return. This rapid rejection of the nonvisible polygons requires the precalculation and storage of a set of parameters that do not vary during the simulation. The calculation of a radar range as a function of the radar forward-looking angle to the CGI terrain is carried out only for the visible polygons. This method was used as part of a simulation for terrain-following helicopter operations on the vertical motion simulator at the NASA Ames Research Center. It proved to be an efficient means for returning real-time simulated radar range data.
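
    A minimal version of the visibility-rejection idea is a precomputed back-face test: store each polygon's plane equation once, then discard polygons whose outward normal faces away from the sensor before doing any range work. The sketch below is an illustrative reconstruction under that interpretation, not the paper's exact parameter set; the toy terrain data are placeholders:

```python
import numpy as np

# Precomputed, simulation-invariant data: one outward unit normal n and plane
# offset d per terrain polygon (n . x = d for points x on the polygon's plane).
normals = np.array([[0.0, 0.0, 1.0], [0.707, 0.0, 0.707]])   # toy terrain, 2 polygons
offsets = np.array([0.0, 10.0])
centroids = np.array([[0.0, 0.0, 0.0], [20.0, 0.0, 5.0]])

def visible_polygons(sensor_pos):
    """Reject polygons facing away from the sensor (back-face cull)."""
    # A polygon can face the sensor only if the sensor lies on its front side,
    # i.e. n . sensor_pos - d > 0.
    front = normals @ sensor_pos - offsets > 0.0
    return np.nonzero(front)[0]

def slant_ranges(sensor_pos):
    """Compute ranges only for polygons that survive the rejection test."""
    idx = visible_polygons(sensor_pos)
    return idx, np.linalg.norm(centroids[idx] - sensor_pos, axis=1)

print(slant_ranges(np.array([0.0, 0.0, 100.0])))
```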

  14. Developmental Dynamics of General and School-Subject-Specific Components of Academic Self-Concept, Academic Interest, and Academic Anxiety

    Science.gov (United States)

    Gogol, Katarzyna; Brunner, Martin; Preckel, Franzis; Goetz, Thomas; Martin, Romain

    2016-01-01

    The present study investigated the developmental dynamics of general and subject-specific (i.e., mathematics, French, and German) components of students' academic self-concept, anxiety, and interest. To this end, the authors integrated three lines of research: (a) hierarchical and multidimensional approaches to the conceptualization of each construct, (b) longitudinal analyses of bottom-up and top-down developmental processes across hierarchical levels, and (c) developmental processes across subjects. The data stemmed from two longitudinal large-scale samples (N = 3498 and N = 3863) of students attending Grades 7 and 9 in Luxembourgish schools. Nested-factor models were applied to represent each construct at each grade level. The analyses demonstrated that several characteristics were shared across constructs. All constructs were multidimensional in nature with respect to the different subjects, showed a hierarchical organization with a general component at the apex of the hierarchy, and had a strong separation between the subject-specific components at both grade levels. Further, all constructs showed moderate differential stabilities at both the general (0.42 < r < 0.55) and subject-specific levels (0.45 < r < 0.73). Further, little evidence was found for top-down or bottom-up developmental processes. Rather, general and subject-specific components in Grade 9 proved to be primarily a function of the corresponding components in Grade 7. Finally, change in several subject-specific components could be explained by negative effects across subjects. PMID:27014162

  15. Developmental Dynamics of General and School-Subject-Specific Components of Academic Self-Concept, Academic Interest, and Academic Anxiety

    Directory of Open Access Journals (Sweden)

    Katarzyna eGogol

    2016-03-01

    Full Text Available The present study investigated the developmental dynamics of general and subject-specific (i.e., mathematics, French, and German) components of students’ academic self-concept, anxiety, and interest. To this end, the authors integrated three lines of research: (a) hierarchical and multidimensional approaches to the conceptualization of each construct, (b) longitudinal analyses of bottom-up and top-down developmental processes across hierarchical levels, and (c) ipsative developmental processes across subjects. The data stemmed from two longitudinal large-scale samples (N = 3,498 and N = 3,863) of students attending Grades 7 and 9 in Luxembourgish schools. Nested-factor models were applied to represent each construct at each grade level. The analyses demonstrated that several characteristics were shared across constructs. All constructs were multidimensional in nature with respect to the different subjects, showed a hierarchical organization with a general component at the apex of the hierarchy, and had a strong separation between the subject-specific components at both grade levels. Further, all constructs showed moderate differential stabilities at both the general (.42 < r < .55) and subject-specific levels (.45 < r < .73). Further, little evidence was found for top-down or bottom-up developmental processes. Rather, general and subject-specific components in Grade 9 proved to be primarily a function of the corresponding components in Grade 7. Finally, change in several subject-specific components could be explained by negative, ipsative effects across subjects.

  16. Product Representation to support validation of simulation models in Computer aided engineering

    OpenAIRE

    Kain, Andreas; Gaag, Andreas; Lindemann, Udo

    2017-01-01

    Computer aided engineering (CAE) provides proper means to support New Product Development (NPD) by simulation tools. Simulation furthers early identification of product characteristics to reduce costs and time. The applicability of simulation models in NPD strongly depends on their validity; validating a simulation is thus a major prerequisite for obtaining correct experimental results. The authors propose a matrix based approach to combine solution neutral system representation, solution specifi...

  17. Reduction of artifacts in computer simulation of breast Cooper's ligaments

    Science.gov (United States)

    Pokrajac, David D.; Kuperavage, Adam; Maidment, Andrew D. A.; Bakic, Predrag R.

    2016-03-01

    Anthropomorphic software breast phantoms have been introduced as a tool for quantitative validation of breast imaging systems. The efficacy of the validation results depends on the realism of phantom images. The recursive partitioning algorithm based upon octree simulation has been demonstrated to be versatile and capable of efficiently generating large numbers of phantoms to support virtual clinical trials of breast imaging. Previously, we have observed specific artifacts (here labeled "dents") on the boundaries of simulated Cooper's ligaments. In this work, we have demonstrated that these "dents" result from the approximate determination of the closest simulated ligament to an examined subvolume (i.e., octree node) of the phantom. We propose a modification of the algorithm that determines the closest ligament by considering a pre-specified number of neighboring ligaments selected based upon the functions that govern the shape of ligaments simulated in the subvolume. We have qualitatively and quantitatively demonstrated that the modified algorithm can lead to elimination or reduction of dent artifacts in software phantoms. In a proof-of-concept example, we simulated a 450 ml phantom with 333 compartments at 100 micrometer resolution. After the proposed modification, we corrected 148,105 dents, with an average size of 5.27 voxels (5.27 nl). We have also qualitatively analyzed the corresponding improvement in the appearance of simulated mammographic images. The proposed algorithm leads to reduction of linear and star-like artifacts in simulated phantom projections, which can be attributed to dents. Analysis of a larger number of phantoms is ongoing.
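
    The proposed correction can be summarized in a short sketch: rather than accepting a single approximate nearest ligament, the subvolume is assigned to whichever of a pre-specified number of neighboring seeds minimizes a shape function. The distance-based shape function below is an assumed stand-in for the functions used in the actual phantom code.

```python
import numpy as np

def closest_ligament(p, seeds, radii, k=8):
    """Pick the compartment whose shape function is smallest at point p,
    testing the k nearest candidate seeds instead of a single guess."""
    d2 = np.sum((seeds - p) ** 2, axis=1)
    candidates = np.argpartition(d2, min(k, len(d2) - 1))[:k]
    # illustrative shape function: distance scaled by compartment size
    phi = np.sqrt(d2[candidates]) / radii[candidates]
    return candidates[int(np.argmin(phi))]
```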

  18. A scalable parallel black oil simulator on distributed memory parallel computers

    Science.gov (United States)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
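
    The skeleton of such an inexact Newton iteration is short. In the sketch below the inner linear solve is a truncated Jacobi sweep purely to keep the example self-contained; the simulator itself uses preconditioned Krylov methods with the multi-stage preconditioner and algebraic multigrid mentioned above.

```python
import numpy as np

def inexact_newton(F, J, x, eta=0.1, tol=1e-8, max_outer=50, max_inner=30):
    """Outer Newton loop; J(x) dx = -F(x) is solved only approximately,
    to a relative residual of eta (the 'forcing term')."""
    for _ in range(max_outer):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        A, b = J(x), -r
        dx = np.zeros_like(b)
        D = np.diag(A)
        for _ in range(max_inner):               # truncated inner (Jacobi) solve
            dx += (b - A @ dx) / D
            if np.linalg.norm(b - A @ dx) <= eta * np.linalg.norm(b):
                break
        x = x + dx
    return x
```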

  19. Enabling Breakthrough Kinetic Simulations of the Magnetosphere Using Petascale Computing

    Science.gov (United States)

    Vu, H. X.; Karimabadi, H.; Omelchenko, Y.; Tatineni, M.; Majumdar, A.; Krauss-Varban, D.; Dorelli, J.

    2009-12-01

    Currently, global magnetospheric simulations are predominantly based on single-fluid magnetohydrodynamics (MHD). MHD simulations have proven useful in studies of the global dynamics of the magnetosphere, with the goal of predicting the salient features of substorms and other global events. But it is well known that the magnetosphere is dominated by ion kinetic effects, which are ignored in MHD simulations, and many key aspects of the magnetosphere relating to transport and the structure of boundaries await global kinetic simulations. We are using our recent innovations in hybrid (electron fluid, kinetic ions) simulations, as developed in our Hybrid3D (H3D) code, and the power of massively parallel machines to make breakthrough 3D global kinetic simulations of the magnetosphere. The innovations include (i) a multi-zone (asynchronous) algorithm, (ii) dynamic load balancing, and (iii) code adaptation and optimization for large numbers of processors. In this presentation we will show preliminary results of our progress to date using from 512 to over 8192 cores. In particular, we focus on what we believe to be the first demonstration of the formation of a flux rope in 3D global hybrid simulations. As in the MHD simulations, the resulting flux rope has a very complex structure, wrapping up field lines from different regions, and appears to be connected on at least one end to Earth. The magnetic topology of the FTE is examined to reveal the existence of several separators (3D X-lines). The formation and growth of this structure will be discussed, and spatial profiles of the magnetic and plasma variables will be compared with those from MHD simulations.

  20. Computer modeling and simulation of human movement. Applications in sport and rehabilitation.

    Science.gov (United States)

    Neptune, R R

    2000-05-01

    Computer modeling and simulation of human movement plays an increasingly important role in sport and rehabilitation, with applications ranging from sport equipment design to understanding pathologic gait. The complex dynamic interactions within the musculoskeletal and neuromuscular systems make analyzing human movement with existing experimental techniques difficult, but computer modeling and simulation allow for the identification of these complex interactions and of causal relationships between input and output variables. This article provides an overview of computer modeling and simulation and presents an example application in the field of rehabilitation.

  1. Computational models of protein kinematics and dynamics: beyond simulation.

    Science.gov (United States)

    Gipson, Bryant; Hsu, David; Kavraki, Lydia E; Latombe, Jean-Claude

    2012-01-01

    Physics-based simulation represents a powerful method for investigating the time-varying behavior of dynamic protein systems at high spatial and temporal resolution. Such simulations, however, can be prohibitively difficult or lengthy for large proteins or when probing the lower-resolution, long-timescale behaviors of proteins generally. Importantly, not all questions about a protein system require full space and time resolution to produce an informative answer. For instance, by avoiding the simulation of uncorrelated, high-frequency atomic movements, a larger, domain-level picture of protein dynamics can be revealed. The purpose of this review is to highlight the growing body of complementary work that goes beyond simulation. In particular, this review focuses on methods that address kinematics and dynamics, as well as those that address larger organizational questions and can quickly yield useful information about the long-timescale behavior of a protein.

  2. Strategic Implications of Cloud Computing for Modeling and Simulation (Briefing)

    Science.gov (United States)

    2016-04-01

    latter category than the former. In contrast to classic IT applications, M&S applications tend to use the central processing unit (CPU) more intensely... SCS M&S Magazine, 2010. Henninger, A., Scrudder, R., Riggs, W., Wall, J., and Williams, K. (2016). "A Functional Deep Dive on Two Simulations... Methodology, Results and Lessons Learned." To be published in the Proceedings of the '16 Interservice/Industry Training, Simulation and Education

  3. Computer simulation of the fire-tube boiler hydrodynamics

    Directory of Open Access Journals (Sweden)

    Khaustov Sergei A.

    2015-01-01

    Full Text Available The finite element method was used for simulating the hydrodynamics of a fire-tube boiler with the ANSYS Fluent 12.1.4 engineering simulation software. The hydrodynamic structure and volumetric temperature distribution were calculated. The results are presented in graphical form. A complete geometric model of the fire-tube boiler based on boiler drawings was considered. The obtained results are suitable for qualitative analysis of the hydrodynamics and for identification of singularities in the fire-tube boiler water shell.

  4. La Granja: A Beowulf type Computer for Numerical Simulations in Stellar and Galactic Dynamics

    Science.gov (United States)

    Velázquez, H.; Aguilar, L. A.

    We present a Beowulf-type computer built using off-the-shelf hardware and freely available software. Its performance in raw computational power and parallel efficiency is compared with an SGI Origin-2000 computer using two different N-body codes. The impact of this technology in opening up the possibility of making routine N ~ 10^6 particle simulations with a "home-made" computer is discussed. The effect of higher numerical resolution is shown with simulations of a cold dissipationless collapse and of the vertical heating of the disk component of a spiral galaxy evolving in isolation.

  5. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    Science.gov (United States)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  6. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
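
    The master/worker pattern described above maps naturally onto a few lines of MPI code. The following mpi4py sketch is illustrative, not the EGS5 deployment itself; `simulate_histories` is a placeholder for the actual particle-transport kernel.

```python
import numpy as np
from mpi4py import MPI

def simulate_histories(n, seed):
    """Placeholder physics: each history contributes one energy sample."""
    rng = np.random.default_rng(seed)
    return rng.exponential(1.0, n).sum()

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_total = 1_000_000
local = simulate_histories(n_total // size, seed=rank)   # independent workers
total = comm.reduce(local, op=MPI.SUM, root=0)           # aggregate on master

if rank == 0:
    print("mean deposited energy:", total / n_total)
```

    Launched with, e.g., `mpiexec -n 100 python mc_cloud.py`, the wall time scales inversely with the node count, consistent with the scaling reported above.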

  7. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.

  8. Computer simulations of the atmospheric composition climate of Bulgaria

    Energy Technology Data Exchange (ETDEWEB)

    Gadzhev, G.; Ganev, K.; Syrakov, D.; Prodanova, M.; Georgieva, I.; Georgiev, G.

    2015-07-01

    Some extensive numerical simulations of the atmospheric composition fields in Bulgaria have recently been performed. The US EPA Models-3 system was chosen as the modelling tool. As the NCEP Global Analysis Data with 1 degree resolution was used as meteorological background, the MM5 and CMAQ nesting capabilities were applied for downscaling the simulations to a 3 km resolution over Bulgaria. The TNO emission inventory was used as emission input. Special pre-processing procedures were created for introducing temporal profiles and speciation of the emissions. The biogenic emissions of VOC are estimated by the model SMOKE. The simulations were carried out for the years 2000-2007. The numerical experiments were carried out for different emission scenarios, which makes it possible to evaluate the contribution of emissions from different source categories. The Models-3 “Integrated Process Rate Analysis” option is applied to discriminate the role of different dynamic and chemical processes in the formation of air pollution. The obtained ensemble of numerical simulation results is extensive enough to allow statistical treatment – calculating not only the mean concentrations and the mean fields of the different source categories’ contributions, but also standard deviations, skewness, etc., with their dominant temporal modes (seasonal and/or diurnal variations). Thus some basic facts about the atmospheric composition climate of Bulgaria can be retrieved from the simulation ensemble. (Author)

  9. Computer simulations of the atmospheric composition climate of Bulgaria

    Energy Technology Data Exchange (ETDEWEB)

    Gadzhev, G.; Ganev, K.; Syrakov, D.; Prodanova, M.; Georgieva, I.; Georgiev, G.

    2015-07-01

    Some extensive numerical simulations of the atmospheric composition fields in Bulgaria have recently been performed. The US EPA Models-3 system was chosen as the modelling tool. As the NCEP Global Analysis Data with 1 degree resolution was used as meteorological background, the MM5 and CMAQ nesting capabilities were applied for downscaling the simulations to a 3 km resolution over Bulgaria. The TNO emission inventory was used as emission input. Special pre-processing procedures were created for introducing temporal profiles and speciation of the emissions. The biogenic emissions of VOC are estimated by the model SMOKE. The simulations were carried out for the years 2000-2007. The numerical experiments were carried out for different emission scenarios, which makes it possible to evaluate the contribution of emissions from different source categories. The Models-3 "Integrated Process Rate Analysis" option is applied to discriminate the role of different dynamic and chemical processes in the formation of air pollution. The obtained ensemble of numerical simulation results is extensive enough to allow statistical treatment – calculating not only the mean concentrations and the mean fields of the different source categories' contributions, but also standard deviations, skewness, etc., with their dominant temporal modes (seasonal and/or diurnal variations). Thus some basic facts about the atmospheric composition climate of Bulgaria can be retrieved from the simulation ensemble. (Author)

  10. Simultaneous computation within a sequential process simulation tool

    Directory of Open Access Journals (Sweden)

    G. Endrestøl

    1989-10-01

    Full Text Available The paper describes an equation solver superstructure developed for a sequential modular dynamic process simulation system as part of a Eureka project with Norwegian and British participation. The purpose of the development was combining some of the advantages of equation based and purely sequential systems, enabling implicit treatment of key variables independent of module boundaries, and use of numerical integration techniques suitable for each individual type of variable. For training simulator applications the main advantages are gains in speed due to increased stability limits on time steps and improved consistency of simulation results. The system is split into an off-line analysis phase and an on-line equation solver. The off-line processing consists of automatic determination of the topological structure of the system connectivity from standard process description files and derivation of an optimized sparse matrix solution procedure for the resulting set of equations. The on-line routine collects equation coefficients from involved modules, solves the combined sets of structured equations, and stores the results appropriately. This method minimizes the processing cost during the actual simulation. The solver has been applied in the Veslefrikk training simulator project.
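
    The split between a one-off structural analysis and cheap repeated on-line solves can be sketched with SciPy's sparse tools (a toy five-variable system, not the simulator's actual solver):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import splu

# --- off-line phase: run once, from the process connectivity alone ---
# toy coupling pattern of five implicit variables across module boundaries
A = sp.csr_matrix(np.array([[4., 1., 0., 0., 1.],
                            [1., 4., 1., 0., 0.],
                            [0., 1., 4., 1., 0.],
                            [0., 0., 1., 4., 1.],
                            [1., 0., 0., 1., 4.]]))
perm = reverse_cuthill_mckee(A, symmetric_mode=True)  # bandwidth-reducing order

# --- on-line phase: each time step, collect coefficients and solve ---
b = np.ones(5)                                # coefficients gathered from modules
lu = splu(A[perm][:, perm].tocsc())           # optimized sparse factorization
x = np.empty(5)
x[perm] = lu.solve(b[perm])
```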

  11. Criteria for Appraising Computer-Based Simulations for Teaching Arabic as a Foreign Language

    National Research Council Canada - National Science Library

    Dabrowski, Richard

    2005-01-01

    This was an exploratory study aimed at defining more sharply the pedagogical and practical challenges entailed in designing and creating computer-based game-type simulations for learning Arabic as a foreign language...

  12. Computer-Based Virtual Reality Colonoscopy Simulation Improves Patient-Based Colonoscopy Performance

    Directory of Open Access Journals (Sweden)

    Keith S McIntosh

    2014-01-01

    Full Text Available BACKGROUND: Colonoscopy simulators that enable one to perform computer-based virtual colonoscopy now exist. However, data regarding the effectiveness of this virtual training are limited.

  13. Computational and Simulation Modeling of Political Attitudes: The 'Tiger' Area of Political Culture Research

    National Research Council Canada - National Science Library

    Voinea, Camelia Florela

    2016-01-01

    ...” – of political culture modeling research. This paper reviews the research literature on the conceptual, computational and simulation modeling of political attitudes developed starting with the beginning of the 20th century until the present times...

  14. Purtscher retinopathy: an alternative etiology supported by computer fluid dynamic simulations.

    Science.gov (United States)

    Harrison, Thomas J; Abbasi, Cyrus O; Khraishi, Tariq A

    2011-10-11

    To explore an alternative etiology for Purtscher retinopathy by literature review and computational fluid dynamic simulations of wall shear stress (WSS) profiles. Computer simulations were developed, incorporating posterior pole retinal microvascular flow parameters, to demonstrate WSS profiles at 90° and 45° artery/arteriolar branchings. The computer simulations reveal WSS profiles dependent on artery/arteriolar branching angles. At high flow rates an area of changed WSS and of flow swirling and reversal was noted at the proximal fillet of the 90° arteriolar branching. These changes did not appear at the 45° arteriolar branching until the flow rate was increased by an additional 30%. The computer simulation data, as well as review of the history and clinical findings of Purtscher and Purtscher-like retinopathy, present evidence that an additional etiology for Purtscher retinopathy may be a rheological event at a retinal posterior pole focus of vascular endothelial dysregulation, followed by downstream endothelin-induced vasculopathy.
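
    Branch-angle effects need the full CFD runs reported above, but the magnitude of the WSS at each branch inlet is set by the Poiseuille relation tau = 4*mu*Q/(pi*r^3). A sketch with assumed retinal-arteriole values (not taken from the paper):

```python
import math

def poiseuille_wss(mu, q, r):
    """Wall shear stress of fully developed laminar tube flow, in Pa:
    tau = 4 * mu * Q / (pi * r^3); branching effects require full CFD."""
    return 4.0 * mu * q / (math.pi * r ** 3)

# illustrative values (assumed): blood viscosity, arteriolar flow, lumen radius
mu = 3.5e-3    # Pa.s
q = 8.3e-12    # m^3/s (about 0.5 uL/min)
r = 25e-6      # m (50 um diameter arteriole)
print(f"baseline WSS ~ {poiseuille_wss(mu, q, r):.2f} Pa")   # ~2.4 Pa
```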

  15. Enhanced Computer Aided Simulation of Meshing and Contact With Application for Spiral Bevel Gear Drives

    National Research Council Canada - National Science Library

    Litvin, F

    1999-01-01

    An integrated tooth contact analysis (TCA) computer program for the simulation of meshing and contact of gear drives that calculates transmission errors and the shift of bearing contact for misaligned gear drives has been developed...

  16. Simulation of partially coherent light propagation using parallel computing devices

    Science.gov (United States)

    Magalhães, Tiago C.; Rebordão, José M.

    2017-08-01

    Light acquires or loses coherence, and coherence is one of the few optical observables. Spectra can be derived from coherence functions, and the understanding of any interferometric experiment also relies upon coherence functions. Beyond the two limiting cases (full coherence or incoherence) the coherence of light is always partial, and it changes with propagation. We have implemented a code to compute the propagation of partially coherent light from the source plane to the observation plane using parallel computing devices (PCDs). In this paper, we restrict the propagation to free space only. To this end, we used the Open Computing Language (OpenCL) and the open-source toolkit PyOpenCL, which gives access to OpenCL parallel computation through Python. To test our code, we chose two coherence source models: an incoherent source and a Gaussian Schell-model source. In the former case, we considered two different source shapes: circular and rectangular. The results were compared to the theoretical values. Our implemented code allows one to choose between the PyOpenCL implementation and a standard one, i.e. using the CPU only. To test the computation time for each implementation (PyOpenCL and standard), we used several computer systems with different CPUs and GPUs. We used powers of two for the dimensions of the cross-spectral density matrix (e.g. 32^4, 64^4), and a significant speed increase is observed in the PyOpenCL implementation when compared to the standard one. This can be an important tool for studying new source models.
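
    A one-dimensional NumPy analogue (without the OpenCL acceleration) shows the structure of the computation: a Gaussian Schell-model cross-spectral density is propagated as W_z = K W_0 K†, where K is the discretized Fresnel kernel. All parameter values below are assumed for illustration.

```python
import numpy as np

lam, z = 632.8e-9, 0.5                 # wavelength (m), propagation distance (m)
k = 2.0 * np.pi / lam
N, L = 256, 4e-3                       # samples and source window (m)
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Gaussian Schell-model source: beam width sigma_s, coherence width sigma_g
sigma_s, sigma_g = 0.5e-3, 0.2e-3
X1, X2 = np.meshgrid(x, x, indexing='ij')
W0 = (np.exp(-(X1**2 + X2**2) / (4 * sigma_s**2))
      * np.exp(-(X1 - X2)**2 / (2 * sigma_g**2)))

# discretized 1-D Fresnel kernel (observation grid = source grid)
K = (np.sqrt(1.0 / (1j * lam * z)) * dx
     * np.exp(1j * k * (x[:, None] - x[None, :])**2 / (2 * z)))
Wz = K @ W0 @ K.conj().T               # propagated cross-spectral density
S = np.real(np.diag(Wz))               # spectral density at the screen
```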

  17. Computational framework for simulating fluorescence microscope images with cell populations.

    Science.gov (United States)

    Lehmussola, Antti; Ruusuvuori, Pekka; Selinummi, Jyrki; Huttunen, Heikki; Yli-Harja, Olli

    2007-07-01

    Fluorescence microscopy combined with digital imaging constructs a basic platform for numerous biomedical studies in the field of cellular imaging. As the studies relying on analysis of digital images have become popular, the validation of image processing methods used in automated image cytometry has become an important topic. Especially, the need for efficient validation has arisen from emerging high-throughput microscopy systems where manual validation is impractical. We present a simulation platform for generating synthetic images of fluorescence-stained cell populations with realistic properties. Moreover, we show that the synthetic images enable the validation of analysis methods for automated image cytometry and comparison of their performance. Finally, we suggest additional usage scenarios for the simulator. The presented simulation framework, with several user-controllable parameters, forms a versatile tool for many kinds of validation tasks, and is freely available at http://www.cs.tut.fi/sgn/csb/simcep.
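
    A toy generator in the spirit of the framework (far simpler than the SIMCEP tool itself) illustrates the idea: cells as Gaussian spots on a uniform background, degraded with photon-counting noise, so the ground truth behind each synthetic image is known exactly.

```python
import numpy as np

def synthetic_cell_image(size=256, n_cells=40, radius=5.0, background=10.0,
                         seed=0):
    """Toy synthetic fluorescence image: Gaussian 'cells' plus Poisson noise."""
    rng = np.random.default_rng(seed)
    yy, xx = np.mgrid[:size, :size]
    img = np.full((size, size), background, dtype=float)  # autofluorescence
    for cx, cy in rng.uniform(radius, size - radius, (n_cells, 2)):
        img += 100.0 * np.exp(-((xx - cx)**2 + (yy - cy)**2) / (2 * radius**2))
    return rng.poisson(img).astype(float)                 # shot noise

image = synthetic_cell_image()
```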

  18. From Architectural Acoustics to Acoustical Architecture Using Computer Simulation

    DEFF Research Database (Denmark)

    Schmidt, Anne Marie Due; Kirkegaard, Poul Henning

    2005-01-01

    ... properties prior to the actual construction of a building. With the right tools applied, acoustic design can become an integral part of the architectural design process. The aim of this paper is to investigate the field of application that an acoustic simulation programme can have during an architectural acoustic design process and to set up a strategy to develop future programmes. The emphasis is put on the first three out of four phases in the working process of the architect, and a case study is carried out in which each phase is represented by typical results – as exemplified with reference to the design of Bagsvaerd Church by Jørn Utzon. The paper discusses the advantages and disadvantages of the programme in each phase compared to the works of architects not using acoustic simulation programmes. The conclusion of the paper points towards the need to apply the acoustic simulation programmes...

  19. Computer simulation of RBS spectra from samples with surface roughness

    Energy Technology Data Exchange (ETDEWEB)

    Malinský, P., E-mail: malinsky@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic); Hnatowicz, V., E-mail: hnatowicz@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Macková, A., E-mail: mackova@ujf.cas.cz [Nuclear Physics Institute of the Academy of Sciences of the Czech Republic, v. v. i., 250 68 Rez (Czech Republic); Department of Physics, Faculty of Science, J. E. Purkinje University, Ceske mladeze 8, 400 96 Usti nad Labem (Czech Republic)

    2016-03-15

    A fast code for the simulation of common RBS spectra including surface roughness effects has been written and tested on virtual samples comprising either a rough layer deposited on a smooth substrate or a smooth layer deposited on a rough substrate, simulated at different geometries. The sample surface or interface relief has been described by a polyline, and the simulated RBS spectrum has been obtained as the sum of many particular spectra from randomly chosen particle trajectories. The code includes several procedures generating virtual samples with random and regular (periodical) roughness. The shape of the RBS spectra has been found to change strongly with increasing sample roughness and an increasing angle of the incoming ion beam.

  20. Computational simulations of copper complexes relevant to Alzheimer's disease

    Science.gov (United States)

    Alí-Torres, Jorge; Marechal, Jean-Didier; Mirats, Andrea; Rodríguez-Rodríguez, Cristina; Rodríguez-Santiago, Luis; Sodupe, Mariona

    2014-10-01

    Metal cations such as Cu2+ have been shown to induce amyloid aggregation and formation of reactive oxygen species. Elucidation of the structural features of Cu2+-Aβ is thus essential to understand their role in the aggregation of Aβ and the formation of ROS, and to rationally design new chelators with potential therapeutic applications. The present contribution reviews our computational studies in this field. First, the computational strategies used to determine three-dimensional structures for Cu2+-Aβ(1-16) and the redox properties of these complexes will be discussed; second, we will summarize our recent studies on Cu2+ chelators.

  1. Role Of Computational Simulations In The Design Of Piston Rings

    Directory of Open Access Journals (Sweden)

    Novotný Pavel

    2015-06-01

    Full Text Available The paper presents computational approaches using modern strategies for treating piston ring dynamics as a coupled fluid-structural problem. Computational model outputs can be used to understand the influence of design parameters on defined results of a primarily integral character. Piston ring dynamics incorporates mixed lubrication conditions, the influence of surface roughness on oil film lubrication, the influence of ring movement on gas dynamics, oil film formation on a cylinder liner, and other significant influences. The solution results are presented for several parameters of SI engine piston rings.

  2. Improved three-dimensional nonlinear computer simulation for TWTs

    CERN Document Server

    Xu, Lin; Mo Yuan Long

    1999-01-01

    The paper covers 3D nonlinear analysis for TWTs. Based on a macro-particle model, the electron beam can be subdivided into 3D macro-particles to calculate space-charge forces using Green's function methods, and 3D large-signal working equations are obtained. The numerical results for a uniform magnetic focusing field indicate that, in 3D numerical analysis, 3D space-charge forces can be substituted by 2D forces with little influence on the numerical results, which greatly decreases computing time so that a 3D computer program can be easily used. (7 refs).

  3. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - The Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly-sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G̲ for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G̲, which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient

  4. A study of workstation computational performance for real-time flight simulation

    Science.gov (United States)

    Maddalon, Jeffrey M.; Cleveland, Jeff I., II

    1995-01-01

    With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.

  5. Research of Computer Simulation of Reverberation Time in Classroom

    Science.gov (United States)

    Daheng, Yang; Qi, Li

    Reverberation time is a major factor affecting the auditory environment in the classroom. The acoustic solution and shape of the classroom can be determined with simulation software so as to meet reverberation-time regulations. Simulations can be made with the Sabine, Norris-Eyring, and Millington-Sette reverberation formulas under different conditions. For the conditions of empty, two-thirds, and full occupation of a medium-sized classroom, the reverberation time was calculated with these three formulas. The results of the calculations were compared with earlier experimental data, and the differences between and applicability of the reverberation formulas are analyzed. The auditory environment of the classroom can be improved by such results at the design stage.
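
    The classical formulas mentioned above are one-liners; a sketch with assumed 1 kHz absorption data for a medium classroom (Sabine and Norris-Eyring shown; Millington-Sette differs only in how the absorption coefficients are averaged):

```python
import math

def rt_sabine(volume, areas, alphas):
    """Sabine: RT60 = 0.161 V / sum(S_i * a_i), V in m^3, S in m^2."""
    return 0.161 * volume / sum(s * a for s, a in zip(areas, alphas))

def rt_eyring(volume, areas, alphas):
    """Norris-Eyring: RT60 = 0.161 V / (-S_total * ln(1 - a_mean))."""
    s_total = sum(areas)
    a_mean = sum(s * a for s, a in zip(areas, alphas)) / s_total
    return 0.161 * volume / (-s_total * math.log(1.0 - a_mean))

# illustrative classroom (assumed values): floor, ceiling, walls, occupants
V = 200.0                            # m^3
areas = [60.0, 60.0, 126.0, 20.0]    # m^2
alphas = [0.05, 0.60, 0.10, 0.45]    # absorption coefficients at 1 kHz
print(rt_sabine(V, areas, alphas), rt_eyring(V, areas, alphas))  # ~0.53, ~0.47 s
```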

  6. Computer simulation of ion beam analysis of laterally inhomogeneous materials

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, M.

    2016-03-15

    The program STRUCTNRA for the simulation of ion beam analysis charged particle spectra from arbitrary two-dimensional distributions of materials is described. The code is validated by comparison to experimental backscattering data from a silicon grating on tantalum at different orientations and incident angles. Simulated spectra for several types of rough thin layers and a chessboard-like arrangement of materials as example for a multi-phase agglomerate material are presented. Ambiguities between back-scattering spectra from two-dimensional and one-dimensional sample structures are discussed.

  7. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist.

    Directory of Open Access Journals (Sweden)

    Brian Drawert

    2016-12-01

    Full Text Available We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.
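
    The discrete stochastic engine at the core of such a tool is Gillespie's stochastic simulation algorithm (SSA). A self-contained sketch for a birth-death model (illustrative only, not StochSS code):

```python
import numpy as np

def gillespie_birth_death(k_on=10.0, k_off=0.1, x0=0, t_end=100.0, seed=0):
    """Exact SSA for production (0 -> X at rate k_on) and decay
    (X -> 0 at rate k_off * x)."""
    rng = np.random.default_rng(seed)
    t, x, times, states = 0.0, x0, [0.0], [x0]
    while t < t_end:
        a = (k_on, k_off * x)            # reaction propensities
        a0 = a[0] + a[1]
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)   # waiting time to the next event
        x += 1 if rng.uniform() < a[0] / a0 else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

t, x = gillespie_birth_death()           # x fluctuates around k_on/k_off = 100
```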

  8. SNOW: a digital computer program for the simulation of ion beam devices

    Energy Technology Data Exchange (ETDEWEB)

    Boers, J.E.

    1980-08-01

    A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are simulated on a large rectangular matrix array which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories combined with background electron and/or ion distributions. The simulation methods are described in some detail along with examples of both axially-symmetric and rectangular beams. A detailed description of the input data is presented.
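
    The field solve at the heart of such codes — Poisson's equation iterated on a rectangular grid with the space-charge density accumulated from the trajectories — can be sketched as follows (plain Jacobi iteration for brevity; production codes use faster relaxation schemes):

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def solve_poisson(rho, h, v_boundary=0.0, iters=5000):
    """Jacobi iteration for laplacian(V) = -rho/eps0 on a uniform grid
    with spacing h and fixed (Dirichlet) boundary potential."""
    v = np.full(rho.shape, v_boundary, dtype=float)
    for _ in range(iters):
        v[1:-1, 1:-1] = 0.25 * (v[:-2, 1:-1] + v[2:, 1:-1] +
                                v[1:-1, :-2] + v[1:-1, 2:] +
                                h * h * rho[1:-1, 1:-1] / EPS0)
    return v
```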

  9. Comparisons between Computer Simulations of Room Acoustical Parameters and those Measured in Concert Halls

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Shiokawa, Hiroyoshi; Christensen, Claus Lynge

    1999-01-01

    A number of European concert halls were surveyed in 1989. In this paper comparisons are made between measured room acoustical parameters and those obtained from computer simulations using the ODEON program version 3.1 on two concert halls. One is the Musikverein in Vienna and the other is the Concertgebouw in Amsterdam. Comparisons are also made between the results obtained from computer simulations using models with high geometrical fidelity and those from models with simplifications to geometry on both concert halls.

  10. Application of High Performance Computing for Simulations of N-Dodecane Jet Spray with Evaporation

    Science.gov (United States)

    2016-11-01

    multicomponent fluids such as n-dodecane. The long-term goal of this research is to incorporate these models into future simulations of turbulent jet... performance computing, fuel, spray, large eddy simulation, computational fluid dynamics... Prior to this summer's internship, I held 2 Master's degrees in Mechanical and Nuclear Engineering. I had focused on plasma physics at

  11. Subject-specific left ventricular dysfunction modeling using composite material mechanics approach

    Science.gov (United States)

    Haddad, Seyed Mohammad Hassan; Karami, Elham; Samani, Abbas

    2017-03-01

    Diverse cardiac conditions such as myocardial infarction and hypertension can lead to diastolic dysfunction as a prevalent cardiac condition. Diastolic dysfunctions can be diagnosed through different adverse mechanisms such as abnormal left ventricle (LV) relaxation, filling, and diastolic stiffness. This paper is geared towards evaluating diastolic stiffness and measuring the LV blood pressure non-invasively. Diastolic stiffness is an important parameter which can be exploited for more accurate diagnosis of diastolic dysfunction. For this purpose, a finite element (FE) LV mechanical model, which works based on a novel composite material model of the cardiac tissue, was utilized. Here, this model was tested for inversion-based applications where it was applied for estimating the cardiac tissue passive stiffness mechanical properties as well as diastolic LV blood pressure. To this end, the model was applied to simulate diastolic inflation of the human LV. The start-diastolic LV geometry was obtained from MR image data segmentation of a healthy human volunteer. The obtained LV geometry was discretized into a FE mesh before FE simulation was conducted. The LV tissue stiffness and diastolic LV blood pressure were adjusted through optimization to achieve the best match between the calculated LV geometry and the one obtained from imaging data. The performance of the LV mechanical simulations using the optimal values of tissue stiffness and blood pressure was validated by comparing the geometrical parameters of the dilated LV model as well as the stress and strain distributions through the LV model with available measurements reported on the LV dilation.
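
    The inversion loop has a simple skeleton: adjust the (stiffness, pressure) pair until the simulated end of diastole matches the segmented geometry. In the sketch below `simulate_lv` is a cheap algebraic placeholder for the actual finite element model, and the measured values are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def simulate_lv(stiffness, pressure):
    """Placeholder for the FE diastolic-inflation model: returns predicted
    geometric parameters (e.g. cavity volume, mean wall thickness)."""
    return np.array([60.0 * pressure / stiffness, 9.0 * stiffness / pressure])

measured = np.array([110.0, 8.0])        # from MR image segmentation (assumed)

def mismatch(theta):
    stiffness, pressure = np.maximum(theta, 1e-6)   # keep parameters positive
    return np.sum((simulate_lv(stiffness, pressure) - measured) ** 2)

res = minimize(mismatch, x0=[1.0, 1.0], method='Nelder-Mead')
print("estimated stiffness and diastolic pressure:", res.x)
```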

  12. Computer Simulation of the Impact of Cigarette Smoking On Humans

    African Journals Online (AJOL)

  13. Defect Detection in Composite Coatings by Computational Simulation Aided Thermography

    Science.gov (United States)

    Almeida, R. M.; Souza, M. P. V.; Rebello, J. M. A.

    2010-02-01

    Thermography is based on the measurement of the surface temperature distribution of an inspected object subjected to excitation, normally thermal heating. This measurement is performed with a thermographic camera that detects the infrared radiation emitted by every object. In this work, thermography was simulated with the COMSOL software to optimize experimental parameters in the inspection of composite material coatings.

  14. Computer Simulation Studies of CTG Triplet Repeat Sequences

    Science.gov (United States)

    Rasaiah, Jayendran. C.; Lynch, Joshua

    1998-03-01

    Long segments of CTG trinucleotide repeats in human DNA are correlated with a class of neurological diseases (myotonic dystrophy, fragile-X syndrome, and Kennedy's disease). These diseases are characterized by genetic anticipation and are thought to arise from replication errors caused by unusual conformations of CTG repeat segments. We have studied the properties of a single short segment of double-stranded DNA with CTG repeats in 0.5 M sodium chloride solution with molecular dynamics simulations. The simulations are carried out in the microcanonical ensemble using an all-atom force field with CHARMM parameters. The TIP3P water model is used to simulate a molecular solvent. Electrostatic interactions are calculated by Ewald summation and the equations of motion are integrated using a Verlet algorithm in conjunction with SHAKE constrained dynamics to maintain bond lengths. The simulation of the CTG repeat sequence is compared with a control system containing CAG triplet repeats to determine possible differences in the conformation and elasticity of the two sequences.
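
    The integrator at the heart of such runs is the velocity form of the Verlet algorithm; a minimal sketch (the SHAKE constraint step and Ewald electrostatics are omitted here):

```python
import numpy as np

def velocity_verlet(pos, vel, forces, masses, dt, n_steps):
    """Velocity-Verlet integration; pos/vel are (N, 3) arrays and forces(pos)
    returns the (N, 3) force array for the current configuration."""
    f = forces(pos)
    for _ in range(n_steps):
        vel += 0.5 * dt * f / masses[:, None]    # half-kick
        pos += dt * vel                          # drift
        f = forces(pos)                          # new forces
        vel += 0.5 * dt * f / masses[:, None]    # second half-kick
    return pos, vel
```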

  15. Electron wave collimation by conical horns : computer simulation

    NARCIS (Netherlands)

    Michielsen, K.; de Raedt, H.

    1991-01-01

    Results are presented of extensive numerical simulations of electron wave packets transmitted by horns. A detailed quantitative analysis is given of the collimation of the electron wave by horn-like devices. It is demonstrated that the electron wave collimation effect cannot be described in terms of

  16. Computer simulation of quantum phenomena in nano-scale devices

    NARCIS (Netherlands)

    Raedt, Hans De

    1996-01-01

    This paper reviews the general concepts for building algorithms to solve the time-dependent Schrödinger equation and to discuss ways of turning these concepts into unconditionally stable, accurate and efficient simulation algorithms. Applications to focussed electron emission from nano-scale

  17. Modeling and computational simulation of the osmotic evaporation process

    Directory of Open Access Journals (Sweden)

    Freddy Forero Longas

    2016-09-01

    Conclusions: It was found that, for the conditions studied, the Knudsen diffusion model is the most suitable to describe the transfer of water vapor through the hydrophobic membrane. The simulations developed describe the osmotic evaporation process adequately, becoming a tool for the faster and more economical development of this technology.

  18. Computer Simulations Imply Forelimb-Dominated Underwater Flight in Plesiosaurs.

    Science.gov (United States)

    Liu, Shiqiu; Smith, Adam S; Gu, Yuting; Tan, Jie; Liu, C Karen; Turk, Greg

    2015-12-01

    Plesiosaurians are an extinct group of highly derived Mesozoic marine reptiles with a global distribution that spans 135 million years from the Early Jurassic to the Late Cretaceous. During their long evolutionary history they maintained a unique body plan with two pairs of large wing-like flippers, but their locomotion has been a topic of debate for almost 200 years. Key areas of controversy have concerned the most efficient biologically possible limb stroke, e.g. whether it consisted of rowing, underwater flight, or modified underwater flight, and how the four limbs moved in relation to each other: did they move in or out of phase? Previous studies have investigated plesiosaur swimming using a variety of methods, including skeletal analysis, human swimmers, and robotics. We adopt a novel approach using a digital, three-dimensional, articulated, free-swimming plesiosaur in a simulated fluid. We generated a large number of simulations under various joint degrees of freedom to investigate how the locomotory repertoire changes under different parameters. Within the biologically possible range of limb motion, the simulated plesiosaur swims primarily with its forelimbs using an unmodified underwater flight stroke, essentially the same as turtles and penguins. In contrast, the hindlimbs provide relatively weak thrust in all simulations. We conclude that plesiosaurs were forelimb-dominated swimmers that used their hind limbs mainly for maneuverability and stability.

  19. Computer Simulations Imply Forelimb-Dominated Underwater Flight in Plesiosaurs.

    Directory of Open Access Journals (Sweden)

    Shiqiu Liu

    2015-12-01

    Full Text Available Plesiosaurians are an extinct group of highly derived Mesozoic marine reptiles with a global distribution that spans 135 million years from the Early Jurassic to the Late Cretaceous. During their long evolutionary history they maintained a unique body plan with two pairs of large wing-like flippers, but their locomotion has been a topic of debate for almost 200 years. Key areas of controversy have concerned the most efficient biologically possible limb stroke, e.g. whether it consisted of rowing, underwater flight, or modified underwater flight, and how the four limbs moved in relation to each other: did they move in or out of phase? Previous studies have investigated plesiosaur swimming using a variety of methods, including skeletal analysis, human swimmers, and robotics. We adopt a novel approach using a digital, three-dimensional, articulated, free-swimming plesiosaur in a simulated fluid. We generated a large number of simulations under various joint degrees of freedom to investigate how the locomotory repertoire changes under different parameters. Within the biologically possible range of limb motion, the simulated plesiosaur swims primarily with its forelimbs using an unmodified underwater flight stroke, essentially the same as turtles and penguins. In contrast, the hindlimbs provide relatively weak thrust in all simulations. We conclude that plesiosaurs were forelimb-dominated swimmers that used their hind limbs mainly for maneuverability and stability.

  20. Computer simulations of the mechanical properties of metals

    DEFF Research Database (Denmark)

    Schiøtz, Jakob; Vegge, Tejs

    1999-01-01

    ... Nanocrystalline metals are metals with grain sizes in the nanometre range; they have a number of technologically interesting properties, such as much increased hardness and yield strength. Our simulations show that the deformation mechanisms are different in these materials than in coarse-grained materials...

  1. Computer Simulation in the Teaching of Translation and International Studies.

    Science.gov (United States)

    Brecht, Richard D.; And Others

    1984-01-01

    Describes the National Simulation in International Studies and Translation Program which links international studies and foreign languages programs at a number of universities. This program provides a natural context for the exercise of translation for the language student and an authenticity of experience for students of international politics.…

  2. Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics

    Science.gov (United States)

    Rutten, Nico; van der Veen, Jan T.; van Joolingen, Wouter R.

    2015-05-01

    In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed lessons presented by 24 physics teachers in which they used computer simulations. Students completed questionnaires about the lesson, and each teacher was interviewed afterwards. These three data sources captured implementation by the teacher, and the learning goals and attitudes of students and their teachers regarding teaching with computer simulations. For each teacher, we calculated an Inquiry-Cycle-Score (ICS) based on the occurrence and order of the inquiry activities of predicting, observing and explaining during teaching, and a Student-Response-Rate (SRR) reflecting the level of active student participation. Statistical analyses revealed positive correlations between the inquiry-based character of the teaching approach and students' attitudes regarding its contribution to their motivation and insight, a negative correlation between the SRR and the ICS, and a positive correlation between teachers' attitudes about inquiry-based teaching with computer simulations and learning goal congruence between the teacher and his/her students. This means that active student participation is likely to be lower when the instruction more closely resembles the inquiry cycle, and that teachers with a positive attitude about inquiry-based teaching with computer simulations realize the importance of learning goal congruence.

  3. Economics of Scholarly Publishing: Exploring the Causes of Subscription Price Variations of Scholarly Journals in Business Subject-Specific Areas

    Science.gov (United States)

    Liu, Lewis G.

    2011-01-01

    This empirical research investigates subscription price variations of scholarly journals in five business subject-specific areas using the semilogarithmic regression model. It has two main purposes. The first is to address the unsettled debate over whether or not and to what extent commercial publishers reap monopoly profits by overcharging…

  4. Parallel computing with graphics processing units for high-speed Monte Carlo simulation of photon migration.

    Science.gov (United States)

    Alerstam, Erik; Svensson, Tomas; Andersson-Engels, Stefan

    2008-01-01

    General-purpose computing on graphics processing units (GPGPU) is shown to dramatically increase the speed of Monte Carlo simulations of photon migration. In a standard simulation of time-resolved photon migration in a semi-infinite geometry, the proposed methodology executed on a low-cost graphics processing unit (GPU) is a factor of 1000 faster than the same simulation performed on a single standard processor. In addition, we address important technical aspects of GPU-based simulations of photon migration. The technique is expected to become a standard method in Monte Carlo simulations of photon migration.
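
    The speed-up rests on photon histories being mutually independent, so each GPU thread can run the same short loop. An illustrative CPU-side version of that core loop (coefficients assumed; geometry and direction sampling omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_a, mu_s = 0.1, 10.0      # absorption/scattering coefficients, 1/mm (assumed)
mu_t = mu_a + mu_s

def propagate_photon(max_steps=10_000):
    """One photon history: exponential free paths, albedo weighting,
    termination by Russian roulette."""
    weight, path = 1.0, 0.0
    for _ in range(max_steps):
        path += -np.log(rng.uniform()) / mu_t    # sample free path length
        weight *= mu_s / mu_t                    # survival (albedo) weighting
        if weight < 1e-4:                        # Russian roulette
            if rng.uniform() > 0.1:
                break
            weight /= 0.1
    return path
```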

  5. Subject-specific tendon-aponeurosis definition in Hill-type model predicts higher muscle forces in dynamic tasks.

    Directory of Open Access Journals (Sweden)

    Pauline Gerus

    Full Text Available Neuromusculoskeletal models are a common method to estimate muscle forces. Developing accurate neuromusculoskeletal models is a challenging task due to the complexity of the system and large inter-subject variability. The estimation of muscle forces is based on the mechanical properties of the tendon-aponeurosis complex. Most neuromusculoskeletal models use a generic definition of the tendon-aponeurosis complex based on in vitro tests, perhaps limiting their validity. Ultrasonography allows subject-specific estimates of the tendon-aponeurosis complex's mechanical properties. The aim of this study was to investigate the influence of subject-specific mechanical properties of the tendon-aponeurosis complex on a neuromusculoskeletal model of the ankle joint. Seven subjects performed isometric contractions from which the tendon-aponeurosis force-strain relationship was estimated. Hopping and running tasks were performed and muscle forces were estimated using subject-specific tendon-aponeurosis properties and generic tendon properties. Two ultrasound probes positioned over the muscle-tendon junction and the mid-belly were combined with motion capture to estimate the in vivo tendon and aponeurosis strain of the medial head of the gastrocnemius muscle. The tendon-aponeurosis force-strain relationship was scaled for the other ankle muscles based on the tendon and aponeurosis length of each muscle measured by ultrasonography. The EMG-driven model was calibrated twice - using the generic tendon definition and a subject-specific tendon-aponeurosis force-strain definition. The use of the subject-specific tendon-aponeurosis definition leads to a higher muscle force estimate for the soleus muscle and the plantar-flexor group, and to a better model prediction of the ankle joint moment compared to the model estimate which used a generic definition. Furthermore, the subject-specific tendon-aponeurosis definition leads to a decoupling behaviour between the muscle fibre and muscle-tendon unit
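
    The subject-specific ingredient is the tendon-aponeurosis force-strain curve itself. A common parameterization — an exponential toe region joined continuously to a linear region — is sketched below; the parameter values are illustrative, with ultrasonography supplying the per-subject fits.

```python
import numpy as np

def tendon_force(strain, f_max, strain_ref=0.04, toe_exp=3.0):
    """Normalized tendon-aponeurosis curve: exponential 'toe' up to strain_ref,
    then linear with a C1-continuous slope. strain_ref and toe_exp are the
    kind of parameters one would fit per subject from ultrasound data."""
    strain = np.asarray(strain, dtype=float)
    k = toe_exp / strain_ref
    denom = np.exp(toe_exp) - 1.0
    f_toe = (np.exp(k * strain) - 1.0) / denom
    slope = k * np.exp(toe_exp) / denom
    f = np.where(strain <= strain_ref, f_toe,
                 1.0 + slope * (strain - strain_ref))
    return f_max * np.clip(f, 0.0, None)

# generic vs subject-specific definitions differ only in the fitted parameters
generic  = tendon_force(0.03, f_max=1000.0)
specific = tendon_force(0.03, f_max=1000.0, strain_ref=0.06, toe_exp=2.5)
```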

  6. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique.

    Science.gov (United States)

    Higaki, Toru; Tatsugami, Fuminari; Fujioka, Chikako; Sakane, Hiroaki; Nakamura, Yuko; Baba, Yasutaka; Iida, Makoto; Awai, Kazuo

    2017-08-01

    This article describes a quantitative evaluation of visualizing small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm, made with a 3D printer, were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.

  7. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    OpenAIRE

    Toru Higaki; Fuminari Tatsugami; Chikako Fujioka; Hiroaki Sakane; Yuko Nakamura; Yasutaka Baba; Makoto Iida; Kazuo Awai

    2017-01-01

    This article describes a quantitative evaluation of visualizing small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm, made with a 3D printer, were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.

  8. Computer Simulation of a Three-phase Brushless Self-Excited Synchronous Generator

    OpenAIRE

    Cingoski, Vlatko; Mikami, Mitsuru; Yamashita, Hideo

    1999-01-01

    Computer simulation of the operating characteristics of a three-phase brushless self-excited synchronous generator is presented. A voltage-driven, nonlinear, time-periodic finite element analysis (FEA) is used to accurately compute the magnetic field distribution and the induced voltages and currents simultaneously.

  9. Visualization of simulated small vessels on computed tomography using a model-based iterative reconstruction technique

    Directory of Open Access Journals (Sweden)

    Toru Higaki

    2017-08-01

    Full Text Available This article describes a quantitative evaluation of visualizing small vessels using several image reconstruction methods in computed tomography. Simulated vessels with diameters of 1–6 mm, made with a 3D printer, were scanned using 320-row detector computed tomography (CT). Hybrid iterative reconstruction (hybrid IR) and model-based iterative reconstruction (MBIR) were performed for the image reconstruction.
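
    As a toy illustration of the idea behind model-based iterative reconstruction (nothing here reproduces the scanner's actual system model, noise model, or regularizer), the sketch below iterates a reconstruction against an explicit forward model instead of applying a one-shot analytic inverse:

    ```python
    import numpy as np

    # Landweber iterations x <- x + lam * A^T (b - A x) against a toy forward
    # model A; a random stand-in for the CT system matrix, purely illustrative.
    rng = np.random.default_rng(1)
    n = 64
    x_true = np.zeros(n)
    x_true[20:26] = 1.0                          # a small "vessel" profile
    A = rng.normal(size=(128, n)) / np.sqrt(n)   # assumed toy forward model
    b = A @ x_true + 0.01 * rng.normal(size=128) # noisy measurements

    x = np.zeros(n)
    lam = 0.1                                    # step size, < 2 / ||A||^2
    for _ in range(500):
        x += lam * A.T @ (b - A @ x)

    print(f"max reconstruction error: {np.abs(x - x_true).max():.4f}")
    ```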

  10. The Simulation of an Oxidation-Reduction Titration Curve with Computer Algebra

    Science.gov (United States)

    Whiteley, Richard V., Jr.

    2015-01-01

    Although the simulation of an oxidation/reduction titration curve is an important exercise in an undergraduate course in quantitative analysis, that exercise is frequently simplified to accommodate computational limitations. With the use of readily available computer algebra systems, however, such curves for complicated systems can be generated…
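
    One way to sidestep those computational limitations numerically (the paper instead tackles the direct computation symbolically with a computer algebra system) is to treat the electrode potential as the independent variable and solve for titrant volume. The sketch below does this for Fe²⁺ titrated with Ce⁴⁺; the concentrations and formal potentials are illustrative assumptions:

    ```python
    import numpy as np

    # Sweep the electrode potential E and solve the electron balance for the
    # titrant volume V(E), tracing out the titration curve without root-finding.
    E0_fe, E0_ce = 0.771, 1.70      # formal potentials [V] (assumed)
    C_fe, V_fe = 0.10, 50.0         # analyte molarity and volume [mL]
    C_ce = 0.10                     # titrant molarity
    k = 1 / 0.05916                 # 1/(Nernst slope) at 25 C, one-electron couples

    E = np.linspace(0.60, 1.70, 400)
    r_fe = 10.0 ** ((E - E0_fe) * k)   # [Fe3+]/[Fe2+] from the Nernst equation
    r_ce = 10.0 ** ((E - E0_ce) * k)   # [Ce4+]/[Ce3+]
    x_fe = r_fe / (1 + r_fe)           # fraction of iron oxidized
    x_ce = r_ce / (1 + r_ce)           # fraction of cerium remaining as Ce4+

    # Electron balance: moles of Fe3+ formed equal moles of Ce3+ formed.
    V = C_fe * V_fe * x_fe / (C_ce * (1 - x_ce))

    for e, v in zip(E[::80], V[::80]):
        print(f"V = {v:7.2f} mL  ->  E = {e:.3f} V")
    ```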

  11. Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games

    Science.gov (United States)

    Nilsson, Elisabet M.; Jakobsson, Anders

    2011-01-01

    The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…

  12. Simulation of mixed bond graphs and block diagrams on personal computers using TUTSIM

    NARCIS (Netherlands)

    Beukeboom, J.J.A.J.; van Dixhoorn, J.J.; Meerman, J.W.

    1985-01-01

    The TUTSIM simulation program for continuous dynamic systems accepts (nonlinear) block diagrams, bond graphs, or a free mix of both. The simulation is "hands-on" interactive, providing direct contact with the model. The implementation of the program on existing personal computers (Apple II, IBM PC)
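
    To give a flavour of what such a simulator evaluates at each time step (TUTSIM's block language and integration methods are not reproduced here), a hand-rolled sketch of a mass-spring-damper model expressed as summing, gain, and integrator blocks:

    ```python
    # Fixed-step integration of a block-diagram model; parameters are assumed.
    m, c, k = 1.0, 0.4, 4.0          # mass, damping, stiffness
    dt, t_end = 0.001, 10.0
    x, v, t = 1.0, 0.0, 0.0          # integrator states and simulation clock

    while t < t_end:
        a = (-c * v - k * x) / m     # summing junction plus gain blocks
        v += a * dt                  # first integrator block
        x += v * dt                  # second integrator block
        t += dt

    print(f"x({t_end:.0f} s) = {x:.4f}")   # a decaying oscillation
    ```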

  13. Computer simulation of viscous fingering in a lifting Hele-Shaw cell ...

    Indian Academy of Sciences (India)

    We simulate viscous fingering generated by separating two plates with a constant force in a lifting Hele-Shaw cell. Variation in the patterns for different fluid viscosities and lifting forces is studied. Viscous fingering is strongly affected by anisotropy. We report a computer simulation study of fingering patterns, where circular or ...

  14. Computer simulation of viscous fingering in a lifting Hele-Shaw cell ...

    Indian Academy of Sciences (India)

    We report a computer simulation study of fingering patterns, where circular or square grooves are etched onto the lower plate. Results are compared with experiments. Keywords: viscous fingering; Hele-Shaw cell; simulation. PACS Nos: 47.15.gp; 47.20.Gv; 07.05.Tp.
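
    The excerpt does not name the growth algorithm, so purely as a generic illustration: ramified, finger-like Laplacian-growth patterns of this family are often modelled with diffusion-limited aggregation (DLA). A deliberately small sketch, with no attempt to match the paper's actual method:

    ```python
    import numpy as np

    # DLA on a small lattice: walkers wander until they touch the cluster and
    # stick, growing ramified fingers. Anisotropy (such as etched grooves)
    # could be mimicked by biasing the sticking rule.
    rng = np.random.default_rng(0)
    L = 41
    grid = np.zeros((L, L), dtype=bool)
    grid[L // 2, L // 2] = True                  # seed particle in the centre

    for _ in range(120):                         # grow 120 particles
        x, y = (int(v) for v in rng.integers(1, L - 1, size=2))
        while not grid[x - 1:x + 2, y - 1:y + 2].any():
            x = min(max(x + int(rng.integers(-1, 2)), 1), L - 2)
            y = min(max(y + int(rng.integers(-1, 2)), 1), L - 2)
        grid[x, y] = True                        # stick next to the cluster

    print("cluster size:", int(grid.sum()))
    ```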

  15. Performance evaluation using SYSTID time domain simulation. [computer-aid design and analysis for communication systems

    Science.gov (United States)

    Tranter, W. H.; Ziemer, R. E.; Fashano, M. J.

    1975-01-01

    This paper reviews the SYSTID technique for performance evaluation of communication systems using time-domain computer simulation. An example program illustrates the language. The inclusion of both Gaussian and impulse noise models makes accurate simulation possible in a wide variety of environments. A very flexible postprocessor enables accurate and efficient performance evaluation.
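
    A generic sketch of the workflow described, assuming BPSK modulation and illustrative noise parameters (SYSTID's own block language and postprocessor are not reproduced): drive symbols through a channel with both Gaussian and impulse noise, then post-process the received samples into an error-rate estimate:

    ```python
    import numpy as np

    # Time-domain link simulation with Gaussian plus rare impulsive noise.
    rng = np.random.default_rng(42)
    n = 200_000
    bits = rng.integers(0, 2, n)
    tx = 2.0 * bits - 1.0                                 # BPSK mapping
    awgn = 0.5 * rng.normal(size=n)                       # Gaussian noise
    hits = (rng.random(n) < 1e-3) * 5.0 * rng.normal(size=n)  # impulse noise
    rx = tx + awgn + hits

    ber = np.mean((rx > 0).astype(int) != bits)           # threshold detection
    print(f"simulated bit error rate = {ber:.2e}")
    ```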

  16. YASS: A System Simulator for Operating System and Computer Architecture Teaching and Learning

    Science.gov (United States)

    Mustafa, Besim

    2013-01-01

    A highly interactive, integrated and multi-level simulator has been developed specifically to support both the teachers and the learners of modern computer technologies at undergraduate level. The simulator provides a highly visual and user configurable environment with many pedagogical features aimed at facilitating deep understanding of concepts…

  17. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    DEFF Research Database (Denmark)

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source t...

  18. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  19. Development of a Computer Simulation Game Using a Reverse Engineering Approach

    Science.gov (United States)

    Ozkul, Ahmet

    2012-01-01

    Business simulation games are widely used in the classroom to provide students with experiential learning opportunities on business situations in a dynamic fashion. When properly designed and implemented, the computer simulation game can be a useful educational tool by integrating separate theoretical concepts and demonstrating the nature of…

  20. ADAM: A computer program to simulate selective-breeding schemes for animals

    DEFF Research Database (Denmark)

    Pedersen, L D; Sørensen, A C; Henryon, M

    2009-01-01

    ADAM is a computer program that models selective breeding schemes for animals using stochastic simulation. The program simulates a population of animals and traces the genetic changes in the population under different selective breeding scenarios. It caters to different population structures, genetic models, selection strategies, and mating designs. ADAM can be used to evaluate breeding schemes and generate genetic data to test statistical tools...
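
    A minimal sketch of this kind of stochastic breeding simulation, assuming an infinitesimal genetic model and simple truncation selection on phenotype; ADAM's actual population structures, genetic models, and mating designs are far more general:

    ```python
    import numpy as np

    # Truncation selection on a quantitative trait; all parameters assumed.
    rng = np.random.default_rng(7)
    n, n_sel = 1000, 100                   # population size, parents kept
    h2, sigma_a = 0.3, 1.0                 # heritability, additive genetic SD
    sigma_e = sigma_a * np.sqrt((1 - h2) / h2)
    a = rng.normal(0.0, sigma_a, n)        # breeding values, generation 0

    for gen in range(1, 6):
        phen = a + rng.normal(0.0, sigma_e, n)       # phenotypes
        parents = a[np.argsort(phen)[-n_sel:]]       # truncation selection
        sires = rng.choice(parents, n)               # random mating of the
        dams = rng.choice(parents, n)                # selected group
        mendelian = rng.normal(0.0, sigma_a / np.sqrt(2), n)
        a = 0.5 * (sires + dams) + mendelian         # offspring breeding values

        print(f"generation {gen}: mean breeding value = {a.mean():+.3f}")
    ```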