WorldWideScience

Sample records for computer simulated heart

  1. Xinyinqin: a computer-based heart sound simulator.

    Science.gov (United States)

    Zhan, X X; Pei, J H; Xiao, Y H

    1995-01-01

    "Xinyinqin" is the Chinese phoneticized name of the Heart Sound Simulator (HSS). The "qin" in "Xinyinqin" is the Chinese name of a category of musical instruments, which means that the operation of HSS is very convenient--like playing an electric piano with the keys. HSS is connected to the GAME I/O of an Apple microcomputer. The generation of sound is controlled by a program. Xinyinqin is used as a teaching aid of Diagnostics. It has been applied in teaching for three years. In this demonstration we will introduce the following functions of HSS: 1) The main program has two modules. The first one is the heart auscultation training module. HSS can output a heart sound selected by the student. Another program module is used to test the student's learning condition. The computer can randomly simulate a certain heart sound and ask the student to name it. The computer gives the student's answer an assessment: "correct" or "incorrect." When the answer is incorrect, the computer will output that heart sound again for the student to listen to; this process is repeated until she correctly identifies it. 2) The program is convenient to use and easy to control. By pressing the S key, it is able to output a slow heart rate until the student can clearly identify the rhythm. The heart rate, like the actual rate of a patient, can then be restored by hitting any key. By pressing the SPACE BAR, the heart sound output can be stopped to allow the teacher to explain something to the student. The teacher can resume playing the heart sound again by hitting any key; she can also change the content of the training by hitting RETURN key. In the future, we plan to simulate more heart sounds and incorporate relevant graphs.

  2. Computed Flow Through An Artificial Heart Valve

    Science.gov (United States)

    Rogers, Stuart E.; Kwak, Dochan; Kiris, Cetin; Chang, I-Dee

    1994-01-01

    Report discusses computations of blood flow through prosthetic tilting-disk valve. Computational procedure developed in simulation used to design better artificial hearts and valves by reducing or eliminating following adverse flow characteristics: large pressure losses, which prevent hearts from working efficiently; separated and secondary flows, which cause clotting; and high turbulent shear stresses, which damage red blood cells. Report reiterates and expands upon part of NASA technical memorandum "Computed Flow Through an Artificial Heart and Valve" (ARC-12983). Also based partly on research described in "Numerical Simulation of Flow Through an Artificial Heart" (ARC-12478).

  3. Computed Flow Through An Artificial Heart And Valve

    Science.gov (United States)

    Rogers, Stuart E.; Kwak, Dochan; Kiris, Cetin; Chang, I-Dee

    1994-01-01

    NASA technical memorandum discusses computations of flow of blood through artificial heart and through tilting-disk artificial heart valve. Represents further progress in research described in "Numerical Simulation of Flow Through an Artificial Heart" (ARC-12478). One purpose of research is to exploit advanced techniques of computational fluid dynamics and capabilities of supercomputers to gain understanding of complicated internal flows of viscous, essentially incompressible fluids like blood. Another is to use that understanding to design better artificial hearts and valves.

  4. Simulation of Blood flow in Artificial Heart Valve Design through Left heart

    Science.gov (United States)

    Hafizah Mokhtar, N.; Abas, Aizat

    2018-05-01

    In this work, an artificial heart valve is designed for use in a real heart, with further consideration of the effects of thrombosis, vorticity, and stress. The artificial heart valve model is constructed through computer-aided design (CAD) modelling and simulated using computational fluid dynamics (CFD) software. The blood flow pattern, velocity, and vorticity of the artificial heart valve design are analysed in this work. Based on the results, the artificial heart valve design has a Doppler velocity index within the allowable standards for the left heart, with values of more than 0.30 and less than 2.2. These values are safe for use as a replacement for the human heart valve.

  5. Numerical Simulation Of Flow Through An Artificial Heart

    Science.gov (United States)

    Rogers, Stuart; Kutler, Paul; Kwak, Dochan; Kiris, Cetin

    1991-01-01

    Research in both artificial hearts and fluid dynamics benefits from computational studies. Algorithm that implements Navier-Stokes equations of flow extended to simulate flow of viscous, incompressible blood through artificial heart. Ability to compute details of such flow important for two reasons: internal flows with moving boundaries of academic interest in their own right, and many of deficiencies of artificial hearts attributable to dynamics of flow.

  6. Hemodynamic simulation of the heart using a 2D model and MR data

    DEFF Research Database (Denmark)

    Adeler, Pernille Thorup

    2002-01-01

    Computational models of the blood flow in the heart are a useful tool for studying the functioning of the heart. The purpose of this thesis is to achieve a better understanding of hemodynamics of the normal and diseased hearts through the use of a computational model and magnetic resonance (MR......) data. We present a 2D computational model of the blood flow in the left side of the heart. The work is based on Peskin and McQueen's 2D model dimensioned to data on the dog heart, which we improve and adjust using physiological knowledge and MR velocity data to achieve a model of the human heart...... by letting the apical region be inactive. In both of these cases the simulation results compare well with clinically observed data on dogs and humans. We present Peskin and McQueen's 3D model of the entire human heart and the nearby great vessels. We perform a simulation with the model, where we adjust...

  7. Computational high-resolution heart phantoms for medical imaging and dosimetry simulations

    Energy Technology Data Exchange (ETDEWEB)

    Gu Songxiang; Kyprianou, Iacovos [Center for Devices and Radiological Health, US Food and Drug Administration, Silver Spring, MD (United States); Gupta, Rajiv [Massachusetts General Hospital, Boston, MA (United States)

    2011-09-21

    Cardiovascular disease in general, and coronary artery disease (CAD) in particular, are the leading causes of death worldwide. They are principally diagnosed using either invasive percutaneous transluminal coronary angiograms or non-invasive computed tomography angiograms (CTA). Minimally invasive therapies for CAD such as angioplasty and stenting are rendered under fluoroscopic guidance. Both invasive and non-invasive imaging modalities employ ionizing radiation, and there is concern for the deterministic and stochastic effects of radiation. Accurate simulation to optimize image quality with minimal radiation dose requires detailed, gender-specific anthropomorphic phantoms with an anatomically correct heart and associated vasculature. Such phantoms are currently unavailable. This paper describes an open source heart phantom development platform based on a graphical user interface. Using this platform, we have developed seven high-resolution cardiac/coronary artery phantoms for imaging and dosimetry from seven high-quality CTA datasets. To extract a phantom from a coronary CTA, the relationship between the intensity distribution of the myocardium, the ventricles, and the coronary arteries is identified via histogram analysis of the CTA images. By further refining the segmentation using anatomy-specific criteria such as vesselness, connectivity criteria required by the coronary tree, and image operations such as active contours, we are able to capture excellent detail within our phantoms. For example, in one of the female heart phantoms, as many as 100 coronary artery branches could be identified. Triangular meshes are fitted to the segmented high-resolution CTA data. We have also developed a visualization tool for adding stenotic lesions to the coronaries. The male and female heart phantoms generated so far have been cross-registered and entered in the mesh-based Virtual Family of phantoms with matched age/gender information. Any phantom in this family, along with user

  8. Computational high-resolution heart phantoms for medical imaging and dosimetry simulations

    International Nuclear Information System (INIS)

    Gu Songxiang; Kyprianou, Iacovos; Gupta, Rajiv

    2011-01-01

    Cardiovascular disease in general, and coronary artery disease (CAD) in particular, are the leading causes of death worldwide. They are principally diagnosed using either invasive percutaneous transluminal coronary angiograms or non-invasive computed tomography angiograms (CTA). Minimally invasive therapies for CAD such as angioplasty and stenting are rendered under fluoroscopic guidance. Both invasive and non-invasive imaging modalities employ ionizing radiation, and there is concern for the deterministic and stochastic effects of radiation. Accurate simulation to optimize image quality with minimal radiation dose requires detailed, gender-specific anthropomorphic phantoms with an anatomically correct heart and associated vasculature. Such phantoms are currently unavailable. This paper describes an open source heart phantom development platform based on a graphical user interface. Using this platform, we have developed seven high-resolution cardiac/coronary artery phantoms for imaging and dosimetry from seven high-quality CTA datasets. To extract a phantom from a coronary CTA, the relationship between the intensity distribution of the myocardium, the ventricles, and the coronary arteries is identified via histogram analysis of the CTA images. By further refining the segmentation using anatomy-specific criteria such as vesselness, connectivity criteria required by the coronary tree, and image operations such as active contours, we are able to capture excellent detail within our phantoms. For example, in one of the female heart phantoms, as many as 100 coronary artery branches could be identified. Triangular meshes are fitted to the segmented high-resolution CTA data. We have also developed a visualization tool for adding stenotic lesions to the coronaries. The male and female heart phantoms generated so far have been cross-registered and entered in the mesh-based Virtual Family of phantoms with matched age/gender information. Any phantom in this family, along with user

  9. Computational modeling and engineering in pediatric and congenital heart disease.

    Science.gov (United States)

    Marsden, Alison L; Feinstein, Jeffrey A

    2015-10-01

    Recent methodological advances in computational simulations are enabling increasingly realistic simulations of hemodynamics and physiology, driving increased clinical utility. We review recent developments in the use of computational simulations in pediatric and congenital heart disease, describe the clinical impact of modeling in single-ventricle patients, and provide an overview of emerging areas. Multiscale modeling combining patient-specific hemodynamics with reduced-order (i.e., mathematically and computationally simplified) circulatory models has become the de facto standard for modeling local hemodynamics and 'global' circulatory physiology. We review recent advances that have enabled faster solutions, discuss new methods (e.g., fluid-structure interaction and uncertainty quantification) that lend computational and clinical realism to results, highlight novel computationally derived surgical methods for single-ventricle patients, and discuss areas in which modeling has begun to exert its influence, including Kawasaki disease, fetal circulation, tetralogy of Fallot (and pulmonary tree), and circulatory support. Computational modeling is emerging as a crucial tool for clinical decision-making and evaluation of novel surgical methods and interventions in pediatric cardiology and beyond. Continued development of modeling methods, with an eye towards clinical needs, will enable clinical adoption in a wide range of pediatric and congenital heart diseases.

  10. Acoustic heart. Interpretation of Phonocardiograms by computer

    International Nuclear Information System (INIS)

    Granados, J; Tavera, F; Velázquez, J M; Hernández, R T; Morales, A; López, G

    2015-01-01

    In the field of cardiology, several heart pathologies have been identified that are associated with problems in the valves and narrowing of the veins. Each case is associated with a specific sound emitted by the heart, detected on cardiac auscultation. On the phonocardiogram, the sound is visualized as a peak in the wave. In the Optics Laboratory of the Universidad Autonoma Metropolitana - Azcapotzalco, we have developed a simulation of the phonocardiograms of heart sounds associated with the main pathologies, together with an image-recognition computer program that allows the respective diseases to be identified quickly. This is a novel way to analyze phonocardiograms and the foundation for building a portable, non-invasive, computerized cardiac diagnostic analyzer system.

  11. Training auscultatory skills: computer simulated heart sounds or additional bedside training? A randomized trial on third-year medical students

    Science.gov (United States)

    2010-01-01

    Background The present study compares the value of adding computer simulated heart sounds to conventional bedside auscultation training for the cardiac auscultation skills of 3rd year medical students at Oslo University Medical School. Methods In addition to their usual curriculum courses, groups of seven students each were randomized to receive four hours of additional auscultation training, either employing a computer simulator system or adding more conventional bedside training. Cardiac auscultation skills were afterwards tested using live patients. Each student gave a written description of the auscultation findings in four selected patients, and was awarded 0-10 points for each patient. Differences between the two study groups were evaluated using Student's t-test. Results At the auscultation test, no significant difference in mean score was found between the students who had received additional computer-based sound simulation and those who had received additional bedside training. Conclusions Students at an early stage of their cardiology training demonstrated equal performance in cardiac auscultation whether they had received an additional short auscultation course based on computer simulated training or additional bedside training. PMID:20082701
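
    The statistical comparison described (two independent groups, each student scored 0-10 per patient across four patients, Student's t-test) can be reproduced with standard tools; a sketch with hypothetical score totals, not the study's data:

    ```python
    from scipy import stats

    # Hypothetical total scores (0-40 over four patients) for the two groups of seven.
    simulator_group = [24, 28, 22, 30, 26, 25, 27]
    bedside_group = [25, 27, 23, 29, 28, 24, 26]

    # Two-sample (independent) Student's t-test, as named in the abstract.
    t_stat, p_value = stats.ttest_ind(simulator_group, bedside_group)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 would match the reported null finding
    ```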

  12. Non-conforming finite-element formulation for cardiac electrophysiology: an effective approach to reduce the computation time of heart simulations without compromising accuracy

    Science.gov (United States)

    Hurtado, Daniel E.; Rojas, Guillermo

    2018-04-01

    Computer simulations constitute a powerful tool for studying the electrical activity of the human heart, but computational effort remains prohibitively high. In order to recover accurate conduction velocities and wavefront shapes, the mesh size in linear element (Q1) formulations cannot exceed 0.1 mm. Here we propose a novel non-conforming finite-element formulation for the non-linear cardiac electrophysiology problem that results in accurate wavefront shapes and lower mesh-dependence in the conduction velocity, while retaining the same number of global degrees of freedom as Q1 formulations. As a result, coarser discretizations of cardiac domains can be employed in simulations without significant loss of accuracy, thus reducing the overall computational effort. We demonstrate the applicability of our formulation in biventricular simulations using a coarse mesh size of ~1 mm, and show that the activation wave pattern closely follows that obtained in fine-mesh simulations at a fraction of the computation time, thus improving the accuracy-efficiency trade-off of cardiac simulations.
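
    For orientation, the Q1 and non-conforming elements discretize the same underlying problem, commonly written as the monodomain reaction-diffusion model (a standard formulation; the specific ionic model used in the paper is not stated in this record):

    ```latex
    C_m \frac{\partial V_m}{\partial t}
      = \nabla \cdot \left( \mathbf{D}\, \nabla V_m \right) - I_{\mathrm{ion}}(V_m, \mathbf{w}),
    \qquad
    \frac{d\mathbf{w}}{dt} = \mathbf{g}(V_m, \mathbf{w}),
    ```

    where V_m is the transmembrane potential, D the fibre-aligned conductivity tensor, and w the gating and concentration state of the ionic model. The stiff I_ion term produces the sharp depolarization wavefront that forces the sub-0.1 mm meshes in Q1 discretizations.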

  13. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  14. Integrative computational models of cardiac arrhythmias -- simulating the structurally realistic heart

    Science.gov (United States)

    Trayanova, Natalia A; Tice, Brock M

    2009-01-01

    Simulation of cardiac electrical function, and specifically, simulation aimed at understanding the mechanisms of cardiac rhythm disorders, represents an example of a successful integrative multiscale modeling approach, uncovering emergent behavior at the successive scales in the hierarchy of structural complexity. The goal of this article is to present a review of the integrative multiscale models of realistic ventricular structure used in the quest to understand and treat ventricular arrhythmias. It concludes with the new advances in image-based modeling of the heart and the promise it holds for the development of individualized models of ventricular function in health and disease. PMID:20628585

  15. Simulation of Exercise-Induced Syncope in a Heart Model with Severe Aortic Valve Stenosis

    Directory of Open Access Journals (Sweden)

    Matjaž Sever

    2012-01-01

    Severe aortic valve stenosis (AVS) can cause an exercise-induced reflex syncope (RS). The precise mechanism of this syncope is not known. The changes in hemodynamics are variable, including arrhythmias and myocardial ischemia, and one of the few consistent changes is a sudden fall in systemic and pulmonary arterial pressures (suggesting a reduced vascular resistance) followed by a decline in heart rate. The contribution of the cardioinhibitory and vasodepressor components of the RS to hemodynamics was evaluated by a computer model. This lumped-parameter computer simulation was based on equivalent electronic circuits (EECs) that reflect the hemodynamic conditions of a heart with severe AVS and a concomitantly decreased contractility as a long-term detrimental consequence of compensatory left ventricular hypertrophy. In addition, the EEC model simulated the resetting of the sympathetic nervous tone in the heart and systemic circuit during exercise and exercise-induced syncope, the fluctuating intra-thoracic pressure during respiration, and the passive relaxation of the ventricle during diastole. The results of this simulation were consistent with published case reports of exertional syncope in patients with AVS. The value of the EEC model is its ability to quantify the effect of a selective and gradable change in heart rate, ventricular contractility, or systemic vascular resistance on the hemodynamics during an exertional syncope in patients with severe AVS.
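
    Equivalent-electronic-circuit models of this kind map hemodynamic quantities onto circuit elements: pressure to voltage, flow to current, vascular resistance and compliance to resistors and capacitors. As a minimal illustration of the approach, not the authors' full AVS circuit, a two-element Windkessel for systemic arterial pressure P(t) driven by aortic flow Q(t) reads:

    ```latex
    C \,\frac{dP}{dt} = Q(t) - \frac{P}{R},
    ```

    where R is the systemic vascular resistance and C the arterial compliance. In such a model a vasodepressor episode can be mimicked by an abrupt drop in R, and a cardioinhibitory one by slowing the heart rate that shapes Q(t), which is exactly the kind of selective, gradable change the abstract describes.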

  16. Characterization of cardiac flow in heart disease patients by computational fluid dynamics and 4D flow MRI

    Science.gov (United States)

    Lantz, Jonas; Gupta, Vikas; Henriksson, Lilian; Karlsson, Matts; Persson, Ander; Carhall, Carljohan; Ebbers, Tino

    2017-11-01

    In this study, cardiac blood flow was simulated using computational fluid dynamics and compared to in vivo flow measurements by 4D Flow MRI. In total, nine patients with various heart diseases were studied. Geometry and heart wall motion for the simulations were obtained from clinical CT measurements, with 0.3x0.3x0.3 mm spatial resolution and 20 time frames covering one heartbeat. The CFD simulations included the pulmonary veins, left atrium and ventricle, mitral and aortic valves, and ascending aorta. Mesh sizes were on the order of 6-16 million cells, depending on the size of the heart, in order to resolve both papillary muscles and trabeculae. The computed flow field agreed visually very well with 4D Flow MRI, with characteristic vortices and flow structures seen in both techniques. Regression analysis showed that peak flow rate as well as stroke volume had excellent agreement between the two techniques. We demonstrated the feasibility, and more importantly, the fidelity of cardiac flow simulations by comparing CFD results to in vivo measurements. Both qualitative and quantitative results agreed well with the 4D Flow MRI measurements. Also, the developed simulation methodology enables "what if" scenarios, such as optimization of valve replacement and other surgical procedures. Funded by the Wallenberg Foundation.
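
    The agreement analysis reported (regression of CFD-derived peak flow rate and stroke volume against 4D Flow MRI) is a simple paired regression; a sketch with hypothetical paired values, not the study's data:

    ```python
    from scipy import stats

    # Hypothetical paired stroke volumes (ml) from CFD and 4D Flow MRI for nine patients.
    mri = [60, 77, 59, 88, 73, 65, 82, 57, 70]
    cfd = [62, 75, 58, 90, 71, 66, 80, 55, 69]

    # Ordinary least-squares regression of CFD against MRI values.
    slope, intercept, r, p, stderr = stats.linregress(mri, cfd)
    print(f"slope = {slope:.2f}, intercept = {intercept:.1f} ml, r^2 = {r**2:.3f}")
    ```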

  17. Improving auscultatory proficiency using computer simulated heart sounds

    Directory of Open Access Journals (Sweden)

    Hanan Salah EL-Deen Mohamed EL-Halawany

    2016-09-01

    This study examined the effects of 'Heart Sounds', a web-based program, on improving fifth-year medical students' auscultation skills in a medical school in Egypt. The program was designed for medical students to master cardiac auscultation skills in addition to their usual clinical medical courses. Pre- and post-tests were performed to assess students' improvement in auscultation skill. Upon completing the training, students were required to complete a questionnaire reflecting on the learning experience they developed through the 'Heart Sounds' program. Results from pre- and post-tests revealed a significant improvement in students' auscultation skills. In examining male and female students' pre- and post-test results, we found that both male and female students achieved a remarkable improvement in their auscultation skills. Students also stated clearly that the learning experience they had with the 'Heart Sounds' program was different from any other traditional way of teaching. They stressed that the program had significantly improved their auscultation skills and enhanced their self-confidence in their ability to practice those skills. It is recommended that the 'Heart Sounds' learning experience be extended by assessing students' practical improvement in real-life situations.

  18. Simulation of Blood flow in Different Configurations Design of Bi-leaflet Mechanical Heart Valve

    Science.gov (United States)

    Hafizah Mokhtar, N.; Abas, Aizat

    2018-05-01

    In this work, two different designs of artificial heart valve were devised and then compared by considering the thrombosis, wear, and valve-orifice-to-anatomical-orifice ratio of each mechanical heart valve. The different bi-leaflet mechanical heart valve design configurations are created through computer-aided design (CAD) modelling and simulated using computational fluid dynamics (CFD) software. Design 1 is based on an existing conventional bi-leaflet valve and design 2 on a modified bi-leaflet valve. Flow pattern, velocity, vorticity, and stress analyses have been done to justify the best design. Based on the results, both designs show a Doppler velocity index below the allowable standard of 2.2, which is safe for use as a replacement for the human heart valve. However, design 2 shows a lower possibility of cavitation, which leads to a lower risk of thrombosis, and provides a good central flow area of blood compared to design 1.

  19. Fast Simulation of Mechanical Heterogeneity in the Electrically Asynchronous Heart Using the MultiPatch Module.

    Directory of Open Access Journals (Sweden)

    John Walmsley

    2015-07-01

    Cardiac electrical asynchrony occurs as a result of cardiac pacing or conduction disorders such as left bundle-branch block (LBBB). Electrically asynchronous activation causes myocardial contraction heterogeneity that can be detrimental for cardiac function. Computational models provide a tool for understanding the pathological consequences of dyssynchronous contraction. Simulations of mechanical dyssynchrony within the heart are typically performed using the finite element method, whose computational intensity may present an obstacle to clinical deployment of patient-specific models. We present an alternative based on the CircAdapt lumped-parameter model of the heart and circulatory system, called the MultiPatch module. Cardiac walls are subdivided into an arbitrary number of patches of homogeneous tissue. Tissue properties and activation time can differ between patches. All patches within a wall share a common wall tension and curvature. Consequently, spatial location within the wall is not required to calculate deformation in a patch. We test the hypothesis that activation time is more important than tissue location for determining mechanical deformation in asynchronous hearts. We perform simulations representing an experimental study of myocardial deformation induced by ventricular pacing, and of a patient with LBBB and heart failure, using endocardial recordings of electrical activation, wall volumes, and end-diastolic volumes. Direct comparison between simulated and experimental strain patterns shows both qualitative and quantitative agreement between model fibre strain and experimental circumferential strain in terms of shortening and rebound stretch during ejection. Local myofibre strain in the patient simulation shows qualitative agreement with circumferential strain patterns observed in the patient using tagged MRI. We conclude that the MultiPatch module produces realistic regional deformation patterns in the asynchronous heart and that

  20. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold-standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms, and ECGs at the body surface with high fidelity, while offering vast computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
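
    For orientation, the eikonal component of such models computes the activation time field directly from conduction velocity information, rather than resolving the stiff depolarization front of the full reaction-diffusion system. A standard anisotropic eikonal statement (shown generically here, not as the paper's exact formulation) is:

    ```latex
    \sqrt{\nabla t_a^{\mathsf{T}}\, \mathbf{V}(\mathbf{x})\, \nabla t_a} = 1
    \quad \text{in } \Omega,
    \qquad t_a = t_0 \ \text{on } \Gamma_0,
    ```

    where t_a(x) is the activation time, V(x) a tensor of squared conduction velocities along and across the fibres, and Gamma_0 the stimulus site. The computed activation times then trigger the local ionic (reaction) model, from which extracellular potentials, electrograms, and ECGs are recovered.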

  1. Fluid Structure Interaction simulation of heart prosthesis in patient-specific left-ventricle/aorta anatomies

    Science.gov (United States)

    Le, Trung; Borazjani, Iman; Sotiropoulos, Fotis

    2009-11-01

    In order to test and optimize heart valve prostheses and enable virtual implantation of other biomedical devices, it is essential to develop and validate high-resolution FSI-CFD codes for carrying out simulations in patient-specific geometries. We have developed a powerful numerical methodology for carrying out FSI simulations of cardiovascular flows based on the CURVIB approach (Borazjani, L. Ge, and F. Sotiropoulos, Journal of Computational Physics, vol. 227, pp. 7587-7620, 2008). We have extended our FSI method to overset grids to handle more complicated geometries efficiently, e.g., simulating an MHV implanted in an anatomically realistic aorta and left ventricle. A compliant, anatomic left ventricle is modeled using prescribed motion in one domain. The mechanical heart valve is placed inside the second domain, i.e., the body-fitted curvilinear mesh of the anatomic aorta. The simulations of an MHV with a left-ventricle model underscore the importance of inflow conditions and ventricular compliance for such simulations and demonstrate the potential of our method as a powerful tool for patient-specific simulations.

  2. Exercise physiology with a left ventricular assist device: Analysis of heart-pump interaction with a computational simulator.

    Science.gov (United States)

    Fresiello, Libera; Rademakers, Frank; Claus, Piet; Ferrari, Gianfranco; Di Molfetta, Arianna; Meyns, Bart

    2017-01-01

    Patients with a ventricular assist device (VAD) are hemodynamically stable but show an impaired exercise capacity. The aim of this work is to identify and describe the limiting factors of exercise physiology with a VAD. We searched the literature for data concerning exercise in heart failure and after VAD implantation. Data were analyzed by using a cardiorespiratory simulator that worked as a collector of inputs coming from different papers. As a preliminary step, the simulator was used to reproduce the evolution of hemodynamics from rest to peak exercise (ergometer cycling) in heart failure. Results show an increase of cardiac output of +2.8 l/min and a heart rate increase to 67% of the expected value. Then, we simulated the effect of a continuous-flow VAD at both rest and exercise. Total cardiac output increases by +3.0 l/min (+0.9 l/min due to the VAD and +2.1 l/min due to the native ventricle). Since the left ventricle works in a non-linear portion of the diastolic stiffness line, we observed a considerable increase of pulmonary capillary wedge pressure (from 14 to 20 mmHg) for a relatively small increase of end-diastolic volume (from 182 to 189 cm3). We finally increased VAD speed during exercise to the maximum possible value and observed a reduction of wedge pressure (-4.5 mmHg), a slight improvement of cardiac output (8.0 l/min), and a complete unloading of the native ventricle. The VAD can assure proper hemodynamics at rest, but provides insufficient unloading of the left ventricle and does not prevent wedge pressure from rising during exercise. Nor does the VAD provide major benefits during exercise in terms of total cardiac output, which increases to a similar extent as in an unassisted heart failure condition. VAD speed modulation can contribute to better unloading of the ventricle, but the maximal flow reachable with current devices is below the cardiac output observed in a healthy heart.

  3. Simple method to estimate mean heart dose from Hodgkin lymphoma radiation therapy according to simulation X-rays.

    Science.gov (United States)

    van Nimwegen, Frederika A; Cutter, David J; Schaapveld, Michael; Rutten, Annemarieke; Kooijman, Karen; Krol, Augustinus D G; Janus, Cécile P M; Darby, Sarah C; van Leeuwen, Flora E; Aleman, Berthe M P

    2015-05-01

    To describe a new method to estimate the mean heart dose for Hodgkin lymphoma patients treated several decades ago, using delineation of the heart on radiation therapy simulation X-rays. Mean heart dose is an important predictor for late cardiovascular complications after Hodgkin lymphoma (HL) treatment. For patients treated before the era of computed tomography (CT)-based radiotherapy planning, retrospective estimation of radiation dose to the heart can be labor intensive. Patients for whom cardiac radiation doses had previously been estimated by reconstruction of individual treatments on representative CT data sets were selected at random from a case-control study of 5-year Hodgkin lymphoma survivors (n=289). For 42 patients, cardiac contours were outlined on each patient's simulation X-ray by 4 different raters, and the mean heart dose was estimated as the percentage of the cardiac contour within the radiation field multiplied by the prescribed mediastinal dose and divided by a correction factor obtained by comparison with individual CT-based dosimetry. According to the simulation X-ray method, the medians of the mean heart doses obtained from the cardiac contours outlined by the 4 raters were 30 Gy, 30 Gy, 31 Gy, and 31 Gy, respectively, following prescribed mediastinal doses of 25-42 Gy. The absolute-agreement intraclass correlation coefficient was 0.93 (95% confidence interval 0.85-0.97), indicating excellent agreement. Mean heart dose was 30.4 Gy with the simulation X-ray method, versus 30.2 Gy with the representative CT-based dosimetry, and the between-method absolute-agreement intraclass correlation coefficient was 0.87 (95% confidence interval 0.80-0.95), indicating good agreement between the two methods. Estimating mean heart dose from radiation therapy simulation X-rays is reproducible and fast, takes individual anatomy into account, and yields results comparable to the labor-intensive representative CT-based method. This simpler method may produce a
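
    The estimator reduces to a one-line calculation once the contour overlap has been measured; a sketch of the formula described above (function name and example values are illustrative only):

    ```python
    def mean_heart_dose(fraction_in_field, prescribed_dose_gy, correction_factor):
        """Mean heart dose = (fraction of cardiac contour inside the field)
        x prescribed mediastinal dose / correction factor, per the method
        described in the abstract."""
        return fraction_in_field * prescribed_dose_gy / correction_factor

    # Hypothetical example: 85% of the contour in-field, 35 Gy prescribed,
    # correction factor derived from comparison with CT-based dosimetry.
    print(f"{mean_heart_dose(0.85, 35.0, 1.0):.1f} Gy")
    ```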

  4. Simple Method to Estimate Mean Heart Dose From Hodgkin Lymphoma Radiation Therapy According to Simulation X-Rays

    Energy Technology Data Exchange (ETDEWEB)

    Nimwegen, Frederika A. van [Department of Psychosocial Research, Epidemiology, and Biostatistics, The Netherlands Cancer Institute, Amsterdam (Netherlands); Cutter, David J. [Clinical Trial Service Unit, University of Oxford, Oxford (United Kingdom); Oxford Cancer Centre, Oxford University Hospitals NHS Trust, Oxford (United Kingdom); Schaapveld, Michael [Department of Psychosocial Research, Epidemiology, and Biostatistics, The Netherlands Cancer Institute, Amsterdam (Netherlands); Rutten, Annemarieke [Department of Radiology, The Netherlands Cancer Institute, Amsterdam (Netherlands); Kooijman, Karen [Department of Psychosocial Research, Epidemiology, and Biostatistics, The Netherlands Cancer Institute, Amsterdam (Netherlands); Krol, Augustinus D.G. [Department of Radiation Oncology, Leiden University Medical Center, Leiden (Netherlands); Janus, Cécile P.M. [Department of Radiation Oncology, Erasmus MC Cancer Center, Rotterdam (Netherlands); Darby, Sarah C. [Clinical Trial Service Unit, University of Oxford, Oxford (United Kingdom); Leeuwen, Flora E. van [Department of Psychosocial Research, Epidemiology, and Biostatistics, The Netherlands Cancer Institute, Amsterdam (Netherlands); Aleman, Berthe M.P., E-mail: b.aleman@nki.nl [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam (Netherlands)

    2015-05-01

    Purpose: To describe a new method to estimate the mean heart dose for Hodgkin lymphoma patients treated several decades ago, using delineation of the heart on radiation therapy simulation X-rays. Mean heart dose is an important predictor for late cardiovascular complications after Hodgkin lymphoma (HL) treatment. For patients treated before the era of computed tomography (CT)-based radiotherapy planning, retrospective estimation of radiation dose to the heart can be labor intensive. Methods and Materials: Patients for whom cardiac radiation doses had previously been estimated by reconstruction of individual treatments on representative CT data sets were selected at random from a case–control study of 5-year Hodgkin lymphoma survivors (n=289). For 42 patients, cardiac contours were outlined on each patient's simulation X-ray by 4 different raters, and the mean heart dose was estimated as the percentage of the cardiac contour within the radiation field multiplied by the prescribed mediastinal dose and divided by a correction factor obtained by comparison with individual CT-based dosimetry. Results: According to the simulation X-ray method, the medians of the mean heart doses obtained from the cardiac contours outlined by the 4 raters were 30 Gy, 30 Gy, 31 Gy, and 31 Gy, respectively, following prescribed mediastinal doses of 25-42 Gy. The absolute-agreement intraclass correlation coefficient was 0.93 (95% confidence interval 0.85-0.97), indicating excellent agreement. Mean heart dose was 30.4 Gy with the simulation X-ray method, versus 30.2 Gy with the representative CT-based dosimetry, and the between-method absolute-agreement intraclass correlation coefficient was 0.87 (95% confidence interval 0.80-0.95), indicating good agreement between the two methods. Conclusion: Estimating mean heart dose from radiation therapy simulation X-rays is reproducible and fast, takes individual anatomy into account, and yields results comparable to the labor

  5. Computer simulation comparison of tripolar, bipolar, and spline Laplacian electrocardiogram estimators.

    Science.gov (United States)

    Chen, T; Besio, W; Dai, W

    2009-01-01

    The performance of the tripolar and bipolar concentric as well as spline Laplacian electrocardiograms (LECGs) and body surface Laplacian mappings (BSLMs) for localizing and imaging cardiac electrical activation has been compared based on computer simulation. In the simulation, a simplified eccentric heart-torso sphere-cylinder homogeneous volume conductor model was developed. Multiple dipoles with different orientations were used to simulate the underlying cardiac electrical activity. Results show that the tripolar concentric ring electrodes produce the most accurate LECG and BSLM estimation among the three estimators, with the best performance in spatial resolution.
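
    For reference, concentric ring electrodes estimate the surface Laplacian from potential differences between the central disc and ring averages. Expanding the potential in a Taylor series around the centre gives the common bipolar estimate (a textbook result, not specific to this paper):

    ```latex
    \nabla^{2} v \;\approx\; \frac{4}{r^{2}} \left( \bar{v}_{r} - v_{0} \right),
    ```

    where v_0 is the central-disc potential and \bar{v}_r the average potential on a ring of radius r. Tripolar electrodes combine two rings so that the leading truncation term cancels, which is consistent with the sharper localization reported here for the tripolar estimator.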

  6. Simulation of blood flow through an artificial heart

    Science.gov (United States)

    Kiris, Cetin; Chang, I-Dee; Rogers, Stuart E.; Kwak, Dochan

    1991-01-01

    A numerical simulation of the incompressible viscous flow through a prosthetic tilting disk heart valve is presented in order to demonstrate the current capability to model unsteady flows with moving boundaries. Both steady state and unsteady flow calculations are done by solving the incompressible Navier-Stokes equations in 3-D generalized curvilinear coordinates. In order to handle the moving boundary problems, the chimera grid embedding scheme which decomposes a complex computational domain into several simple subdomains is used. An algebraic turbulence model for internal flows is incorporated to reach the physiological values of Reynolds number. Good agreement is obtained between the numerical results and experimental measurements. It is found that the tilting disk valve causes large regions of separated flow, and regions of high shear.

  7. Evaluation of valvular heart diseases with computed tomography

    International Nuclear Information System (INIS)

    Tomoda, Haruo; Hoshiai, Mitsumoto; Matsuyama, Seiya

    1982-01-01

    Forty-two patients with valvular heart diseases were studied with a third-generation computed tomographic system. The cardiac chambers (the atria and ventricles) were evaluated semiquantitatively, and valvular calcification was easily detected with computed tomography. Computed tomography was most valuable in revealing left atrial thrombi which were not identified by other diagnostic procedures in some cases. (author)

  8. Simulation of the Beating Heart Based on Physically Modeling a Deformable Balloon

    International Nuclear Information System (INIS)

    Rohmer, Damien; Sitek, Arkadiusz; Gullberg, Grant T.

    2006-01-01

    The motion of the beating heart is complex and creates artifacts in SPECT and x-ray CT images. Phantoms such as the Jaszczak Dynamic Cardiac Phantom are used to simulate cardiac motion for evaluation of acquisition and data-processing protocols used for cardiac imaging. Two concentric elastic membranes filled with water are connected to tubing and a pump apparatus that creates fluid flow in and out of the inner volume to simulate the motion of the heart. In the present report, the movement of the two concentric balloons is solved numerically in order to create a computer simulation of the moving membranes in the Jaszczak Dynamic Cardiac Phantom. A system of differential equations, based on the physical properties, determines the motion. Two methods are tested for solving the system of differential equations. The results of both methods are similar, providing a final shape that does not converge to a trivial circular profile. Finally, a tomographic imaging simulation is performed by acquiring static projections of the moving shape and reconstructing the result to observe motion artifacts. Two cases are taken into account: in one case each projection angle is sampled for a short time interval, and in the other it is sampled for a longer time interval. The longer sampling acquisition shows a clear improvement in decreasing the tomographic streaking artifacts.
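
    This record does not reproduce the membrane equations themselves, but the numerical treatment it describes, integrating a system of ODEs with two different schemes and comparing the outcomes, can be illustrated on a stand-in problem (a driven, damped oscillator; entirely hypothetical, not the phantom's actual dynamics):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical stand-in for one membrane coordinate: damped oscillator
    # x'' + c x' + k x = f(t), written as a first-order system y = (x, x').
    c, k = 0.5, 4.0
    f = lambda t: np.sin(2.0 * t)  # periodic driving, like the pump cycle
    rhs = lambda t, y: [y[1], f(t) - c * y[1] - k * y[0]]

    # Method 1: adaptive Runge-Kutta via SciPy.
    sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], method="RK45")

    # Method 2: crude explicit Euler on the same problem, for comparison.
    dt, y = 1e-3, np.array([0.0, 0.0])
    for t in np.arange(0.0, 10.0, dt):
        y = y + dt * np.array(rhs(t, y))

    print(f"RK45 final x = {sol.y[0, -1]:.4f}, Euler final x = {y[0]:.4f}")
    ```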

  9. Emerging Trends in Heart Valve Engineering: Part IV. Computational Modeling and Experimental Studies.

    Science.gov (United States)

    Kheradvar, Arash; Groves, Elliott M; Falahatpisheh, Ahmad; Mofrad, Mohammad K; Hamed Alavi, S; Tranquillo, Robert; Dasi, Lakshmi P; Simmons, Craig A; Jane Grande-Allen, K; Goergen, Craig J; Baaijens, Frank; Little, Stephen H; Canic, Suncica; Griffith, Boyce

    2015-10-01

    In this final portion of an extensive review of heart valve engineering, we focus on the computational methods and experimental studies related to heart valves. The discussion begins with a thorough review of computational modeling and the governing equations of fluid-structure interaction. We then move on to multiscale and disease-specific modeling. Finally, advanced methods related to in vitro testing of heart valves are reviewed. This section of the review series is intended to illustrate the application of computational methods and experimental studies and their interrelation for studying heart valves.

  10. Computed tomography in the diagnosis of pericardial heart disease

    International Nuclear Information System (INIS)

    Isner, J.M.; Carter, B.L.; Bankoff, M.S.; Konstam, M.A.; Salem, D.N.

    1982-01-01

    To evaluate the use of computed tomography (CT) in the diagnosis of pericardial heart disease, 53 patients were prospectively studied by computed tomography of the chest and cardiac ultrasound. A diagnostic-quality CT study was done for all patients; a technically satisfactory ultrasound examination was not possible in six patients. Of 47 patients in whom both chest scans and satisfactory ultrasound studies were obtained, computed tomography showed pericardial thickening not shown by ultrasound in five patients. Estimated size of pericardial effusion was the same for both computed tomography and ultrasound. Computed tomography provided quantifiable evaluation of the composition of pericardial fluid in seven patients with either hemopericardium or purulent pericarditis. Neoplastic pericardial heart disease was detected by CT scan in four of the 53 patients. Computed tomography of the chest provides a sensitive evaluation of the pericardium and quality of pericardial effusion, and is a valuable adjunct in patients in whom cardiac ultrasound is technically unsatisfactory

  11. Scientific computer simulation review

    International Nuclear Information System (INIS)

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation's trustworthiness and takes the next step in establishing scientific computer simulation review as its own field.

  12. Electrophysiological and structural remodeling in heart failure modulate arrhythmogenesis. 2D simulation study.

    Directory of Open Access Journals (Sweden)

    Juan F Gomez

    Heart failure is operationally defined as the inability of the heart to maintain blood flow to meet the needs of the body, and it is the final common pathway of various cardiac pathologies. Electrophysiological remodeling, intercellular uncoupling, and a pro-fibrotic response have been identified as major arrhythmogenic factors in heart failure. In this study we investigate vulnerability to reentry under heart failure conditions by incorporating established electrophysiological and anatomical remodeling using computer simulations. The electrical activity of human transmural ventricular tissue (5 cm × 5 cm) was simulated using the Grandi et al. human ventricular action potential model under control and heart failure conditions. The MacCannell et al. model was used to model fibroblast electrical activity and its electrotonic interactions with myocytes. Selected degrees of diffuse fibrosis and variations in intercellular coupling were considered, and the vulnerable window (VW) for reentry was evaluated following cross-field stimulation. No reentry was observed in normal conditions or in the presence of HF ionic remodeling. However, defined amounts of fibrosis and/or cellular uncoupling were sufficient to elicit reentrant activity. Under conditions where reentry was generated, HF electrophysiological remodeling did not alter the width of the VW. However, intermediate fibrosis and cellular uncoupling significantly widened the VW. In addition, biphasic behavior was observed, as very high fibrotic content or very low tissue conductivity hampered the development of reentry. Detailed phase analysis of reentry dynamics revealed an increase of phase singularities with progressive fibrotic components. Structural remodeling is a key factor in the genesis of vulnerability to reentry. A range of intermediate levels of fibrosis and intercellular uncoupling can combine to favor reentrant activity.
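
    Vulnerable-window measurement of this kind is usually automated as a scan over premature-stimulus timings; a schematic Python sketch (simulate_cross_field is a placeholder for the full 2D tissue simulation, not the Grandi-based model itself):

    ```python
    def vulnerable_window(simulate_cross_field, s2_timings_ms):
        """Return the (earliest, latest) S2 timings that elicit reentry.

        simulate_cross_field(s2_ms) -> bool is a placeholder for a full
        cross-field (S1-S2) tissue simulation returning True if sustained
        reentry is observed for that coupling interval.
        """
        reentrant = [s2 for s2 in s2_timings_ms if simulate_cross_field(s2)]
        if not reentrant:
            return None  # no vulnerable window at this fibrosis/coupling setting
        return min(reentrant), max(reentrant)

    # Usage: vw = vulnerable_window(run_tissue_sim, range(200, 401, 5))
    ```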

  13. Experience with computed transmission tomography of the heart in vivo

    International Nuclear Information System (INIS)

    Carlsson, E.; Lipton, M.J.; Skioeldebrand, C.G.; Berninger, W.H.; Redington, R.W.

    1980-01-01

    Cardiac computed tomography in its present form provides useful information about the heart for clinical use in patients with heart disease and for investigative work in such patients and in living animals. Its great reconstructive power and unmatched density resolution are particularly advantageous in the study of ischemic heart disease. Because of its non-invasive character, cardiac computed tomography has the potential of becoming an effective screening tool for large numbers of patients with suspected or known coronary heart disease. Other cardiac conditions such as valve disease and congenital lesions can also be examined with high diagnostic yield. However, presently available scanners suffer from a low repetition rate, long scan times, and the fact that only one transverse cardiac level at a time can be obtained. The development needed to eliminate these weaknesses is technically feasible. The availability of a dynamic cardiac scanner would greatly benefit the treatment of patients with heart disease and facilitate inquiry into the pathophysiology of such diseases. (orig.)

  14. COMPUTER MODELING IN THE DEVELOPMENT OF ARTIFICIAL VENTRICLES OF HEART

    Directory of Open Access Journals (Sweden)

    L. V. Belyaev

    2011-01-01

    This article describes current research on the development of artificial heart ventricles. The advantages of applying computer (CAD/CAE) technologies in the development of artificial heart ventricles are shown. The systems developed with these technologies are presented.

  15. A randomised, simulated study assessing auscultation of heart rate at birth

    NARCIS (Netherlands)

    Voogdt, Kevin G. J. A.; Morrison, Allison C.; Wood, Fiona E.; van Elburg, Ruurd M.; Wyllie, Jonathan P.

    2010-01-01

    Heart rate is a primary clinical indicator directing newborn resuscitation. The time taken to assess the heart rate by auscultation in relation to accuracy during newborn resuscitation is not known. To assess both the accuracy and time taken to assess heart rate by stethoscope in simulated

  16. Atomic-level computer simulation

    International Nuclear Information System (INIS)

    Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.

    1994-01-01

    This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))
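
    Of the simulation methods listed, Metropolis Monte Carlo is compact enough to sketch; a minimal example for a generic scalar energy function (illustrative only, not tied to any system in the overview):

    ```python
    import math
    import random

    def metropolis(energy, x0, steps=10_000, step_size=0.1, kT=1.0):
        """Sample states with probability ~ exp(-E/kT): always accept
        downhill moves, accept uphill moves with Boltzmann probability."""
        x, e = x0, energy(x0)
        for _ in range(steps):
            x_new = x + random.uniform(-step_size, step_size)
            e_new = energy(x_new)
            if e_new <= e or random.random() < math.exp(-(e_new - e) / kT):
                x, e = x_new, e_new
        return x, e

    # Example: relax into one well of a double-well potential.
    print(metropolis(lambda x: (x * x - 1.0) ** 2, x0=0.0))
    ```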

  17. Three-dimensional simulation of flow-induced platelet activation in artificial heart valves

    Science.gov (United States)

    Hedayat, Mohammadali; Asgharzadeh, Hafez; Borazjani, Iman

    2015-11-01

    Since the advent of the heart valve, several valve types such as mechanical and bio-prosthetic valves have been designed. Mechanical heart valves (MHV) are durable but suffer from thromboembolic complications caused by shear-induced platelet activation near the valve region. Bio-prosthetic heart valves (BHV) are known for better hemodynamics; however, they usually have a short average lifetime. Realistic simulations of heart valves in combination with platelet activation models can lead to a better understanding of the potential risk of thrombus formation in such devices. In this study, an Eulerian approach is developed to calculate platelet activation in three-dimensional simulations of flow through an MHV and a BHV using a parallel overset-curvilinear immersed boundary technique. A curvilinear body-fitted grid is used for the flow simulation through the anatomic aorta, while the sharp-interface immersed boundary method is used for simulation of the left ventricle (LV) with prescribed motion. In addition, the dynamics of the valves were calculated numerically using an under-relaxed strong-coupling algorithm. Finally, the platelet activation results for the BHV and MHV are compared with each other.
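
    Platelet-activation models of this type typically accumulate a scalar "dose" driven by the local shear stress; a common linear stress-accumulation form is PAS = ∫ τ dt. The sketch below integrates a hypothetical stress history along one platelet path (the paper's specific activation model is not given in this record):

    ```python
    import numpy as np

    def platelet_activation_dose(tau, t):
        """Linear stress-accumulation dose along a platelet path:
        PAS = integral of scalar shear stress tau(t) dt (trapezoidal rule)."""
        return np.sum(0.5 * (tau[1:] + tau[:-1]) * np.diff(t))

    # Hypothetical shear-stress history (Pa) sampled along a path through a valve.
    t = np.linspace(0.0, 0.8, 200)                        # one cardiac cycle, seconds
    tau = 5.0 + 15.0 * np.exp(-((t - 0.2) / 0.05) ** 2)   # jet-like stress spike
    print(f"PAS = {platelet_activation_dose(tau, t):.2f} Pa*s")
    ```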

  18. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  19. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software

  20. FPGA-accelerated simulation of computer systems

    CERN Document Server

    Angepat, Hari; Chung, Eric S; Hoe, James C; Chung, Eric S

    2014-01-01

    To date, the most common form of simulator of computer systems is software-based, running on standard computers. One promising approach to improve simulation performance is to apply hardware, specifically reconfigurable hardware in the form of field programmable gate arrays (FPGAs). This manuscript describes various approaches to using FPGAs to accelerate software-implemented simulation of computer systems, and selected simulators that incorporate those techniques. More precisely, we describe a simulation architecture taxonomy that incorporates a simulation architecture specifically designed f

  1. Iris features-based heart disease diagnosis by computer vision

    Science.gov (United States)

    Nguchu, Benedictor A.; Li, Li

    2017-07-01

    The study takes advantage of several new breakthroughs in computer vision technology to develop a new mid-iris biomedical platform that processes iris images for early detection of heart disease. Guaranteeing early detection of heart disease offers the possibility of non-surgical treatment, as suggested by biomedical researchers and associated institutions. However, our observations revealed that a clinically practicable solution that is both sensitive and specific for early detection is still lacking. As a result, mortality remains high; delayed diagnostic procedures and the inefficiency and complications of available methods are further reasons for this situation. Therefore, this research proposes the novel IFB (Iris Features Based) method for diagnosis of premature and early-stage heart disease. The method incorporates computer vision and iridology to obtain a robust, non-contact, non-radioactive, and cost-effective diagnostic tool. The method analyzes abnormal inherent weakness in tissues and changes in color and patterns of the specific region of the iris that responds to impulses of the heart organ, as per the Bernard Jensen iris chart. The changes in the iris infer the presence of degenerative abnormalities in the heart organ. These changes are precisely detected and analyzed by the IFB method, which includes tensor-based gradient (TBG), multi-orientation Gabor filters (GF), textural oriented features (TOF), and speeded-up robust features (SURF). Kernel-based and multi-class support vector machine classifiers are used for classifying normal and pathological iris features. Experimental results demonstrated that the proposed method not only has better diagnostic performance, but also provides insight for early detection of other diseases.
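
    The classification stage described (kernel-based, multi-class support vector machines over iris feature vectors) follows a standard pattern; a sketch using scikit-learn with placeholder feature arrays (the real inputs would be the TBG/Gabor/TOF/SURF descriptors named above):

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical data: rows are iris feature vectors pooled over the
    # heart-reflex region of the iris; labels 0 = normal, 1 = pathological.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 16))
    y = rng.integers(0, 2, size=40)

    # RBF-kernel SVM with feature standardization.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)
    print(clf.predict(X[:5]))
    ```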

  2. Massively parallel quantum computer simulator

    NARCIS (Netherlands)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray

  3. Computer-assisted instruction; MR imaging of congenital heart disease

    International Nuclear Information System (INIS)

    Choi, Young Hi; Yu, Pil Mun; Lee, Sang Hoon; Choe, Yeon Hyeon; Kim, Yang Min

    1996-01-01

    To develop a software program for computer-assisted instruction on MR imaging of congenital heart disease, enabling medical students and residents to achieve repetitive and effective self-learning. We used a film scanner (Scan Maker 35t) and an IBM-PC (486 DX-2, 60 MHz) for acquisition and storage of image data. The accessories attached to the main processor were a CD-ROM drive (Sony), a sound card (Soundblaster-Pro), and speakers. We used Adobe Photoshop (v 3.0) and Paint Shop Pro (v 3.0) for preprocessing image data, and Paintbrush from Microsoft Windows 3.1 for labelling. The language used for programming was Visual Basic (v 3.0) from Microsoft Corporation. We developed a software program for computer-assisted instruction on MR imaging of congenital heart disease that serves as an effective educational tool.

  4. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    Science.gov (United States)

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  5. Mechanical analysis of congestive heart failure caused by bundle branch block based on an electromechanical canine heart model

    Energy Technology Data Exchange (ETDEWEB)

    Dou Jianhong; Xia Ling; Zhang Yu; Shou Guofa [Department of Biomedical Engineering, Zhejiang University, Hangzhou 310027 (China); Wei Qing; Liu Feng; Crozier, Stuart [School of Information Technology and Electrical Engineering, University of Queensland, St Lucia, Brisbane, Queensland 4072 (Australia)], E-mail: xialing@zju.edu.cn

    2009-01-21

    Asynchronous electrical activation, induced by bundle branch block (BBB), can cause reduced ventricular function. However, the effects of BBB on the mechanical function of the heart are difficult to assess experimentally. Many heart models have been developed to investigate cardiac properties during BBB but have mainly focused on electrophysiological properties. To date, the mechanical consequences of BBB have not been well investigated. Based on a three-dimensional electromechanical canine heart model, the mechanical properties of complete left and right bundle branch block (LBBB and RBBB) were simulated. The anatomical model as well as the fiber orientations of a dog heart were reconstructed from magnetic resonance imaging (MRI) and diffusion tensor MRI (DT-MRI). Using the solutions of reaction-diffusion equations and a parallel computation strategy, the asynchronous excitation propagation and intraventricular conduction in BBB were simulated. The mechanics of myocardial tissues were computed with a time- and sarcomere-length-dependent uniaxial active stress initiated at the time of depolarization. The quantification of mechanical intra- and interventricular asynchrony in BBB was then investigated using the finite-element method with an eight-node isoparametric element. The simulation results show that (1) there exists inter- and intraventricular systolic dyssynchrony during BBB; (2) RBBB may have more mechanical synchrony and better systolic function of the left ventricle (LV) than LBBB; (3) the ventricles always move toward the early-activated ventricle; and (4) the septum experiences higher stress than the left and right ventricular free walls in BBB. The simulation results validate clinical and experimental recordings of heart deformation and provide regional quantitative estimates of ventricular wall strain and stress. The present work suggests that an electromechanical heart model, incorporating real geometry and fiber orientations, may be helpful for better
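
    As a loose illustration of the excitation-propagation step, here is a generic 2D reaction-diffusion sketch (a FitzHugh-Nagumo-type excitable medium on a square grid). The canine anatomy, DT-MRI fibre orientations, and finite-element mechanics of the actual model are not reproduced, and all constants are illustrative:

    ```python
    import numpy as np

    def step(v, w, D=0.1, dt=0.05, a=0.13, b=0.013, c1=0.26, c2=0.1, d=1.0):
        """One explicit Euler step of a FitzHugh-Nagumo-type excitable medium."""
        lap = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
               np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4.0 * v)   # 5-point Laplacian
        dv = D * lap + c1 * v * (v - a) * (1.0 - v) - c2 * v * w
        dw = b * (v - d * w)
        return v + dt * dv, w + dt * dw

    v = np.zeros((128, 128))          # excitation variable (membrane potential)
    w = np.zeros_like(v)              # recovery variable
    v[60:68, 60:68] = 1.0             # initial stimulus ("depolarized" patch)
    for _ in range(2000):             # propagate the excitation wavefront
        v, w = step(v, w)
    ```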

  6. Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)

    2011-08-15

    This report is an effort to summarize the existing data on biomass gasifiers, as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data are paramount for any computer simulation, but so far no easily accessible biomass gasifier database has been available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented as a Microsoft Excel sheet, so that users can easily implement the data in their specific models. In addition to providing simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information sought are temperatures, pressures, stream compositions and energy consumption. At present the database contains 17 gasifiers, with one or more within each of the gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers ranging from 100 kW to 120 MW, with several gasifiers in between, and includes both directly and indirectly heated designs. This allows for a better-informed choice of starting data sets for simulations, and with multiple data sets available for several of the operating modes, sensitivity analyses of various inputs will improve the simulations performed. Fewer answers to the survey were received than expected, which could have improved the database further; however, the use of online sources and other public information has to some extent counterbalanced the low response rate. The database is intended to be a living document, continuously updated with new gasifiers and improved information on existing ones.

  7. Significance of computed tomography for diagnosis of heart diseases

    International Nuclear Information System (INIS)

    Senda, Kohei; Sakuma, Sadayuki

    1983-01-01

    Computed tomography (CT) with a 2 s scanner was carried out on 105 cases with various heart diseases in order to identify CT findings in each disease. The significance of CT as an imaging study was evaluated in comparison with scintigraphic, echographic and roentgenographic studies. CT with contrast enhancement in moderate inspiration was able to demonstrate accurately organic changes of intra- and extracardiac structures. Compared with other imaging studies, CT was superior in the detection of calcified or intracardiac mass lesions, despite its limited value in evaluating cardiac function or dynamics. (author)

  8. Distributed simulation of large computer systems

    International Nuclear Information System (INIS)

    Marzolla, M.

    2001-01-01

    Sequential simulation of large complex physical systems is often regarded as a computationally expensive task. In order to speed up complex discrete-event simulations, the paradigm of Parallel and Distributed Discrete Event Simulation (PDES) was introduced in the late 1970s. The authors analyze the applicability of PDES to the modeling and analysis of large computer systems; such systems are increasingly common in the area of High Energy and Nuclear Physics, because many modern experiments make use of large 'compute farms'. Some feasibility tests have been performed on a prototype distributed simulator.

  9. A Computational Study on the Relation between Resting Heart Rate and Atrial Fibrillation Hemodynamics under Exercise.

    Directory of Open Access Journals (Sweden)

    Matteo Anselmino

    Full Text Available Clinical data indicating a heart rate (HR) target during rate control therapy for permanent atrial fibrillation (AF) and assessing its eventual relationship with reduced exercise tolerance are lacking. The present study aims at investigating the impact of resting HR on the hemodynamic response to exercise in permanent AF patients by means of a computational cardiovascular model. The AF lumped-parameter model was run to simulate resting (1 Metabolic Equivalent of Task-MET) and various exercise conditions (4 METs: brisk walking; 6 METs: skiing; 8 METs: running), considering different resting HR (70 bpm for the slower resting HR-SHR-simulations, and 100 bpm for the higher resting HR-HHR-simulations). To compare relative variations of cardiovascular variables upon exertion, the variation comparative index (VCI)-the absolute variation between the exercise and the resting values in SHR simulations referred to the absolute variation in HHR simulations-was calculated at each exercise grade (VCI4, VCI6 and VCI8). Pulmonary venous pressure underwent a greater increase in HHR compared to SHR simulations (VCI4 = 0.71, VCI6 = 0.73 and VCI8 = 0.77), while for systemic arterial pressure the opposite is true (VCI4 = 1.15, VCI6 = 1.36, VCI8 = 1.56). The computational findings suggest that a slower, with respect to a higher, resting HR might be preferable in permanent AF patients, since during exercise pulmonary venous pressure undergoes a slighter increase and systemic blood pressure reveals a more appropriate increase.

  10. A Computational Study on the Relation between Resting Heart Rate and Atrial Fibrillation Hemodynamics under Exercise.

    Science.gov (United States)

    Anselmino, Matteo; Scarsoglio, Stefania; Saglietto, Andrea; Gaita, Fiorenzo; Ridolfi, Luca

    2017-01-01

    Clinical data indicating a heart rate (HR) target during rate control therapy for permanent atrial fibrillation (AF) and assessing its eventual relationship with reduced exercise tolerance are lacking. The present study aims at investigating the impact of resting HR on the hemodynamic response to exercise in permanent AF patients by means of a computational cardiovascular model. The AF lumped-parameter model was run to simulate resting (1 Metabolic Equivalent of Task-MET) and various exercise conditions (4 METs: brisk walking; 6 METs: skiing; 8 METs: running), considering different resting HR (70 bpm for the slower resting HR-SHR-simulations, and 100 bpm for the higher resting HR-HHR-simulations). To compare relative variations of cardiovascular variables upon exertion, the variation comparative index (VCI)-the absolute variation between the exercise and the resting values in SHR simulations referred to the absolute variation in HHR simulations-was calculated at each exercise grade (VCI4, VCI6 and VCI8). Pulmonary venous pressure underwent a greater increase in HHR compared to SHR simulations (VCI4 = 0.71, VCI6 = 0.73 and VCI8 = 0.77), while for systemic arterial pressure the opposite is true (VCI4 = 1.15, VCI6 = 1.36, VCI8 = 1.56). The computational findings suggest that a slower, with respect to a higher resting HR, might be preferable in permanent AF patients, since during exercise pulmonary venous pressure undergoes a slighter increase and systemic blood pressure reveals a more appropriate increase.
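
    Written out, the variation comparative index defined verbally above is, for a generic hemodynamic variable X and exercise grade k (in METs):

    ```latex
    \[
      \mathrm{VCI}_k =
      \frac{\left| X^{\mathrm{SHR}}_{k} - X^{\mathrm{SHR}}_{\mathrm{rest}} \right|}
           {\left| X^{\mathrm{HHR}}_{k} - X^{\mathrm{HHR}}_{\mathrm{rest}} \right|},
      \qquad k \in \{4, 6, 8\}\ \mathrm{METs},
    \]
    % so VCI_k < 1 means the variable rises less upon exertion at the slower
    % resting HR (pulmonary venous pressure), and VCI_k > 1 means it rises
    % more (systemic arterial pressure).
    ```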

  11. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target is to blame when the model fails. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  12. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform-independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied by an on-the-fly generated, cluster-specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
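
    The macro-splitting idea lends itself to a very small sketch: emit one fully resolved macro per CPU, each covering a slice of the simulated acquisition time, plus a submit script. The file layout, the `qsub` submit line, and the command ordering are assumptions; only the time-slicing mirrors the approach described above:

    ```python
    # Write one GATE macro per job, splitting the simulated acquisition time.
    n_jobs, t_total = 8, 60.0                  # jobs, seconds of acquisition (assumed)
    slice_len = t_total / n_jobs
    for i in range(n_jobs):
        with open(f"job_{i}.mac", "w") as f:
            f.write("/control/execute common_setup.mac\n")  # shared, fully resolved setup
            f.write(f"/gate/application/setTimeStart {i * slice_len:.3f} s\n")
            f.write(f"/gate/application/setTimeStop {(i + 1) * slice_len:.3f} s\n")

    # Cluster-specific submit file (SGE-style; scheduler and flags are assumptions).
    with open("submit.sh", "w") as f:
        f.writelines(f"qsub -b y Gate job_{i}.mac\n" for i in range(n_jobs))
    ```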

  13. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode, and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.
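
    Those figures are consistent with Amdahl's law if one reads the 99% vector/parallel fraction as the parallelizable fraction f:

    ```latex
    % Amdahl's law with f = 0.99: speedup S(N) for effective acceleration N of
    % the parallel part, and its ceiling as N grows.
    \[
      S(N) = \frac{1}{(1-f) + f/N},
      \qquad
      \lim_{N \to \infty} S(N) = \frac{1}{1-f} = 100
      \quad \text{for } f = 0.99,
    \]
    % so the reported speedups of 65 and 81 sit plausibly below the ceiling of 100.
    ```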

  14. Computer simulation of ductile fracture

    International Nuclear Information System (INIS)

    Wilkins, M.L.; Streit, R.D.

    1979-01-01

    Finite difference computer simulation programs are capable of very accurate solutions to problems in plasticity with large deformations and rotation. This opens the possibility of developing models of ductile fracture by correlating experiments with equivalent computer simulations. Selected experiments were done to emphasize different aspects of the model. A difficult problem is the establishment of a fracture-size effect. This paper is a study of the strain field around notched tensile specimens of aluminum 6061-T651. A series of geometrically scaled specimens are tested to fracture. The scaled experiments are conducted for different notch radius-to-diameter ratios. The strains at fracture are determined from computer simulations. An estimate is made of the fracture-size effect

  15. Simulating chemistry using quantum computers.

    Science.gov (United States)

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  16. Kymogram detection and kymogram-correlated image reconstruction from subsecond spiral computed tomography scans of the heart

    International Nuclear Information System (INIS)

    Kachelriess, Marc; Sennst, Dirk-Alexander; Maxlmoser, Wolfgang; Kalender, Willi A.

    2002-01-01

    Subsecond single-slice, multi-slice or cone-beam spiral computed tomography (SSCT, MSCT, CBCT) offer great potential for improving heart imaging. Together with the newly developed phase-correlated cardiac reconstruction algorithms 180 deg. MCD and 180 deg. MCI [Med. Phys. 27, 1881-1902 (2000)] or related algorithms provided by the CT manufacturers, high image quality can be achieved. These algorithms require information about the cardiac motion, i.e., typically the simultaneously recorded electrocardiogram (ECG), to synchronize the reconstruction with the cardiac motion. Neither data acquired without ECG information (standard patients) nor acquisitions with corrupted ECG information can be handled adequately. We developed a method to extract the appropriate information about cardiac motion directly from the measured raw data (projection data). The so-called kymogram function is a measure of the cardiac motion as a function of time t or of the projection angle α. In contrast to the ECG, which is a global measure of the heart's electric excitation, the kymogram is a local measure of the heart motion at the z-position z(α) at projection angle α. The patient's local heart rate as well as the necessary synchronization information to be used with phase-correlated algorithms can be extracted from the kymogram by a series of signal processing steps. The kymogram information is shown to be adequate to substitute for the ECG information. Computer simulations with simulated ECG and patient measurements with simultaneously acquired ECG were carried out for a multislice scanner providing M=4 slices to evaluate these new approaches. Both the ECG function and the kymogram function were used for reconstruction. Both were highly correlated regarding the periodicity information used for reconstruction. In 21 out of 25 consecutive cases the kymogram approach was equivalent to the ECG-correlated reconstruction; only minor differences in image quality between both
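
    One plausible raw-data motion surrogate in the spirit of the kymogram is the centre of mass of each projection as a function of projection angle; its periodicity can stand in for the ECG. The actual kymogram derivation is more involved (local and z-dependent), so the following NumPy sketch is only illustrative:

    ```python
    import numpy as np

    def kymogram_surrogate(sinogram):
        """Centre of mass of each projection; sinogram has shape
        (n_projections, n_detector_channels) of attenuation line integrals."""
        channels = np.arange(sinogram.shape[1])
        mass = sinogram.sum(axis=1)
        return (sinogram * channels).sum(axis=1) / mass

    # Fake sinogram whose centre of mass oscillates like a beating heart.
    views = np.abs(np.sin(np.linspace(0, 8 * np.pi, 400)))[:, None]
    sino = 1.0 + views * np.exp(-0.01 * (np.arange(128) - 64) ** 2)
    signal = kymogram_surrogate(sino)
    # Peaks of the detrended signal then provide the synchronization points
    # that phase-correlated reconstruction would otherwise take from the ECG.
    ```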

  17. HTTR plant dynamic simulation using a hybrid computer

    International Nuclear Information System (INIS)

    Shimazaki, Junya; Suzuki, Katsuo; Nabeshima, Kunihiko; Watanabe, Koichi; Shinohara, Yoshikuni; Nakagawa, Shigeaki.

    1990-01-01

    A plant dynamic simulation of the High-Temperature Engineering Test Reactor has been made using a new-type hybrid computer. This report describes a dynamic simulation model of the HTTR, a hybrid simulation method for SIMSTAR, and some results obtained from the dynamics analysis of the HTTR simulation. It concludes that hybrid plant simulation is useful for on-line simulation on account of its high computation speed compared with all-digital computer simulation. With sufficient accuracy, computation 40 times faster than real time was achieved for the HTTR simulation simply by changing the analog time scale. (author)

  18. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  19. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  20. Computer Simulation Western

    International Nuclear Information System (INIS)

    Rasmussen, H.

    1992-01-01

    Computer Simulation Western is a unit within the Department of Applied Mathematics at the University of Western Ontario. Its purpose is the development of computational and mathematical methods for practical problems in industry and engineering and the application and marketing of such methods. We describe the unit and our efforts at obtaining research and development grants. Some representative projects will be presented and future plans discussed. (author)

  1. General-purpose parallel simulator for quantum computing

    International Nuclear Information System (INIS)

    Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi

    2002-01-01

    With current technologies, it seems to be very difficult to implement quantum computers with many qubits. It is therefore important to simulate quantum algorithms and circuits on existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on a parallel computer (Sun Enterprise 4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test the efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed the robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper.

  2. Advanced computers and simulation

    International Nuclear Information System (INIS)

    Ryne, R.D.

    1993-01-01

    Accelerator physicists today have access to computers that are far more powerful than those available just 10 years ago. In the early 1980s, desktop workstations performed fewer than one million floating-point operations per second (Mflops), and the realized performance of vector supercomputers was at best a few hundred Mflops. Today vector processing is available on the desktop, providing researchers with performance approaching 100 Mflops at a price measured in thousands of dollars. Furthermore, advances in Massively Parallel Processors (MPPs) have made performance of over 10 gigaflops a reality, and around mid-decade MPPs are expected to be capable of teraflops performance. Along with advances in MPP hardware, researchers have also made significant progress in developing algorithms and software for MPPs. These changes have had, and will continue to have, a significant impact on the work of computational accelerator physicists. Now, instead of running particle simulations with just a few thousand particles, we can perform desktop simulations with tens of thousands of simulation particles, and calculations with well over 1 million particles are being performed on MPPs. In the area of computational electromagnetics, simulations that used to be performed only on vector supercomputers now run in several hours on desktop workstations, and researchers are hoping to perform simulations with over one billion mesh points on future MPPs. In this paper we discuss the latest advances, and what can be expected in the near future, in hardware, software and applications codes for advanced simulation of particle accelerators.

  3. Computer simulations of collisionless shock waves

    International Nuclear Information System (INIS)

    Leroy, M.M.

    1984-01-01

    A review of the contributions of particle computer simulations to the understanding of the physics of magnetic shock waves in collisionless plasmas is presented. The emphasis is on the relation between the computer simulation results, spacecraft observations of shocks in space, and related theories, rather than on technical aspects of the numerics. It is shown that much has been learned from the comparison of ISEE spacecraft observations of the terrestrial bow shock and particle computer simulations concerning the quasi-perpendicular, supercritical shock (ion scale structure, ion reflection mechanism and ultimate dissipation processes). Particle computer simulations have also had an appreciable prospective role in the investigation of the physics of quasi-parallel shocks, about which still little is known observationally. Moreover, these numerical techniques have helped to clarify the process of suprathermal ion rejection by the shock into the foreshock, and the subsequent evolution of the ions in the foreshock. 95 references

  4. Validation of a model to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD: the rotterdam ischemic heart disease and stroke computer simulation (RISC) model.

    Science.gov (United States)

    van Kempen, Bob J H; Ferket, Bart S; Hofman, Albert; Steyerberg, Ewout W; Colkesen, Ersen B; Boekholdt, S Matthijs; Wareham, Nicholas J; Khaw, Kay-Tee; Hunink, M G Myriam

    2012-12-06

    We developed a Monte Carlo Markov model designed to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD. Internal, predictive, and external validity of the model have not yet been established. The Rotterdam Ischemic Heart Disease and Stroke Computer Simulation (RISC) model was developed using data covering 5 years of follow-up from the Rotterdam Study. To prove 1) internal and 2) predictive validity, the incidences of coronary heart disease (CHD), stroke, CVD death, and non-CVD death simulated by the model over a 13-year period were compared with those recorded for 3,478 participants in the Rotterdam Study with at least 13 years of follow-up. 3) External validity was verified using 10 years of follow-up data from the European Prospective Investigation of Cancer (EPIC)-Norfolk study of 25,492 participants, for whom CVD and non-CVD mortality was compared. At year 5, the observed incidences (with simulated incidences in brackets) of CHD, stroke, and CVD and non-CVD mortality for the 3,478 Rotterdam Study participants were 5.30% (4.68%), 3.60% (3.23%), 4.70% (4.80%), and 7.50% (7.96%), respectively. At year 13, these percentages were 10.60% (10.91%), 9.90% (9.13%), 14.20% (15.12%), and 24.30% (23.42%). After recalibrating the model for the EPIC-Norfolk population, the 10-year observed (simulated) incidences of CVD and non-CVD mortality were 3.70% (4.95%) and 6.50% (6.29%). All observed incidences fell well within the 95% credibility intervals of the simulated incidences. We have confirmed the internal, predictive, and external validity of the RISC model. These findings provide a basis for analyzing the effects of modifying cardiovascular disease risk factors on the burden of CVD with the RISC model.
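
    The underlying machinery is a Monte Carlo Markov cohort model; a toy version with four states makes the design concrete. The yearly transition probabilities below are invented placeholders, not the Rotterdam Study estimates:

    ```python
    import numpy as np

    states = ["well", "CHD", "stroke", "dead"]
    P = np.array([[0.96, 0.015, 0.010, 0.015],   # from well
                  [0.00, 0.93,  0.02,  0.05 ],   # from CHD
                  [0.00, 0.02,  0.92,  0.06 ],   # from stroke
                  [0.00, 0.00,  0.00,  1.00 ]])  # dead is absorbing

    rng = np.random.default_rng(1)
    cohort = np.zeros(3478, dtype=int)           # everyone starts in 'well'
    for year in range(13):                       # 13 years, as in the validation
        for i, s in enumerate(cohort):
            cohort[i] = rng.choice(4, p=P[s])    # one Monte Carlo transition

    print({s: round(float(np.mean(cohort == k)), 3) for k, s in enumerate(states)})
    ```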

  5. Validation of a model to investigate the effects of modifying cardiovascular disease (CVD risk factors on the burden of CVD: the rotterdam ischemic heart disease and stroke computer simulation (RISC model

    Directory of Open Access Journals (Sweden)

    van Kempen Bob JH

    2012-12-01

    Full Text Available Abstract Background We developed a Monte Carlo Markov model designed to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD. Internal, predictive, and external validity of the model have not yet been established. Methods The Rotterdam Ischemic Heart Disease and Stroke Computer Simulation (RISC) model was developed using data covering 5 years of follow-up from the Rotterdam Study. To prove 1) internal and 2) predictive validity, the incidences of coronary heart disease (CHD), stroke, CVD death, and non-CVD death simulated by the model over a 13-year period were compared with those recorded for 3,478 participants in the Rotterdam Study with at least 13 years of follow-up. 3) External validity was verified using 10 years of follow-up data from the European Prospective Investigation of Cancer (EPIC)-Norfolk study of 25,492 participants, for whom CVD and non-CVD mortality was compared. Results At year 5, the observed incidences (with simulated incidences in brackets) of CHD, stroke, and CVD and non-CVD mortality for the 3,478 Rotterdam Study participants were 5.30% (4.68%), 3.60% (3.23%), 4.70% (4.80%), and 7.50% (7.96%), respectively. At year 13, these percentages were 10.60% (10.91%), 9.90% (9.13%), 14.20% (15.12%), and 24.30% (23.42%). After recalibrating the model for the EPIC-Norfolk population, the 10-year observed (simulated) incidences of CVD and non-CVD mortality were 3.70% (4.95%) and 6.50% (6.29%). All observed incidences fell well within the 95% credibility intervals of the simulated incidences. Conclusions We have confirmed the internal, predictive, and external validity of the RISC model. These findings provide a basis for analyzing the effects of modifying cardiovascular disease risk factors on the burden of CVD with the RISC model.

  6. Computer algebra simulation - what can it do?; Was leistet Computer-Algebra-Simulation?

    Energy Technology Data Exchange (ETDEWEB)

    Braun, S. [Visual Analysis AG, Muenchen (Germany)

    2001-07-01

    Shortened development cycles require new and improved calculation methods. Conventional methods based on a purely numerical approach have long become the standard in many fields of application. However, the ever shorter development cycles and the steadily growing complexity of technical systems make a better understanding of the process parameters involved necessary. Numerical simulation excels at detailed solutions, even for complex structures and processes, but it does not give a fast overview of the interdependences between individual parameters. Moreover, numerical simulation is effective only if all physical parameters are sufficiently well known; otherwise its efficiency drops sharply because of the large number of variant calculations required. Computer algebra simulation closes this gap by providing a deeper insight into the physical functioning of technical processes.

  7. Framework for utilizing computational devices within simulation

    Directory of Open Access Journals (Sweden)

    Miroslav Mintál

    2013-12-01

    Full Text Available Nowadays several frameworks exist to utilize the computational power of graphics cards and other computational devices such as FPGAs, ARM and multi-core processors. The best known are either low-level, requiring a lot of controlling code, or are bound to specific graphics cards. Furthermore, more specialized frameworks exist, mainly aimed at the mathematical field. The framework described here is tailored for use in multi-agent simulations: it provides an option to accelerate computations when preparing a simulation and, above all, to accelerate the computation of the simulation itself.

  8. Simulation of dilated heart failure with continuous flow circulatory support.

    Directory of Open Access Journals (Sweden)

    Yajuan Wang

    Full Text Available Lumped parameter models have been employed for decades to simulate important hemodynamic couplings between a left ventricular assist device (LVAD) and the native circulation. However, these studies seldom consider the pathological descending limb of the Frank-Starling response of the overloaded ventricle. This study introduces a dilated heart failure model featuring a unimodal end systolic pressure-volume relationship (ESPVR) to address this critical shortcoming. The resulting hemodynamic responses to mechanical circulatory support are illustrated through numerical simulations of a rotodynamic, continuous flow ventricular assist device (cfVAD) coupled to systemic and pulmonary circulations with baroreflex control. The model further incorporated septal interaction to capture the influence of left ventricular (LV) unloading on right ventricular function. Four heart failure conditions were simulated (LV and bi-ventricular failure, with and without pulmonary hypertension) in addition to normal baseline. Several metrics of LV function, including cardiac output and stroke work, exhibited a unimodal response whereby initial unloading improved function, and further unloading depleted preload reserve, thereby reducing ventricular output. The concept of extremal loading was introduced to reflect the loading condition in which the intrinsic LV stroke work is maximized. Simulation of bi-ventricular failure with pulmonary hypertension revealed the inadequacy of LV support alone. These simulations motivate the implementation of an extremum-tracking feedback controller to potentially optimize ventricular recovery.

  9. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    Science.gov (United States)

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
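
    Of the four approaches named above, discrete-event simulation is the most common for ED patient flow; a minimal SimPy sketch of an ED as a multi-server queue illustrates the idea. Arrival and treatment rates and the bed count are illustrative assumptions:

    ```python
    import random
    import simpy

    def patient(env, beds, los_mean, waits):
        """A patient waits for a bed, then occupies it for the length of stay."""
        arrival = env.now
        with beds.request() as req:
            yield req
            waits.append(env.now - arrival)
            yield env.timeout(random.expovariate(1.0 / los_mean))

    def source(env, beds, waits, iat_mean=10.0, los_mean=45.0):
        """Poisson arrivals: exponential inter-arrival times (minutes)."""
        while True:
            yield env.timeout(random.expovariate(1.0 / iat_mean))
            env.process(patient(env, beds, los_mean, waits))

    random.seed(1)
    env = simpy.Environment()
    beds = simpy.Resource(env, capacity=5)   # 5 treatment bays (assumed)
    waits = []
    env.process(source(env, beds, waits))
    env.run(until=480)                       # one 8-hour shift, in minutes
    print(f"patients seen: {len(waits)}, mean wait: {sum(waits)/len(waits):.1f} min")
    ```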

  10. Impact of heart rate and rhythm on radiation exposure in prospectively ECG triggered computed tomography

    International Nuclear Information System (INIS)

    Luecke, Christian; Andres, Claudia; Foldyna, Borek; Nagel, Hans Dieter; Hoffmann, Janine; Grothoff, Matthias; Nitzsche, Stefan; Gutberlet, Matthias; Lehmkuhl, Lukas

    2012-01-01

    Purpose: To evaluate the influence of different heart rates and arrhythmias on scanner performance, image acquisition and applied radiation exposure in prospectively ECG triggered computed tomography (pCT). Materials and methods: An ECG simulator (EKG Phantom 320, Müller and Sebastiani Elektronik GmbH, Munich, Germany) was used to generate different heart rhythms and arrhythmias: sinus rhythm (SR) at 45, 60, 75, 90 and 120/min, supraventricular arrhythmias (e.g. sinus arrhythmia, atrial fibrillation) and ventricular arrhythmias (e.g. ventricular extrasystoles), pacemaker ECGs, ST changes and technical artifacts. The analysis of the image acquisition process was performed on a 64-row multidetector CT (Brilliance, Philips Medical Systems, Cleveland, USA). A prospectively triggered scan protocol as used in routine practice was applied (120 kV; 150 mA s; 0.4 s rotation and exposure time per scan; image acquisition predominantly in end-diastole at 75% of the R-R interval, in arrhythmias with a mean heart rate above 80/min in systole at 45% of the R-R interval; FOV 25 cm). The mean dose length product (DLP) and its percentage increase from baseline (SR at 60/min) were determined. Result: Radiation exposure can increase significantly when the heart rhythm deviates from sinus rhythm. ECG changes leading to a significant DLP increase (p < 0.05) were bifocal pacemaker (61%), pacemaker dysfunction (22%), SVES (20%), ventricular salvo (20%), and atrial fibrillation (14%). Significantly (p < 0.05) prolonged scan times (>8 s) were observed for bifocal pacemaker (12.8 s), pacemaker dysfunction (10.7 s), atrial fibrillation (10.3 s) and sinus arrhythmia (9.3 s). Conclusion: In prospectively ECG triggered CT, heart rate and rhythm can provoke different types of scanner performance, which can significantly alter radiation exposure and scan time. These results might have an important implication for indication, informed consent and contrast agent injection protocols.

  11. Numerical characteristics of quantum computer simulation

    Science.gov (United States)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the usage of modern high-performance parallel computation is relevant. As is well known, arbitrary quantum computation in the circuit model can be done using only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel while, on the other hand, quantum entanglement leads to a problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used to improve algorithms in quantum information science.

  12. Analyzing Robotic Kinematics Via Computed Simulations

    Science.gov (United States)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  13. Computer Simulations, Disclosure and Duty of Care

    Directory of Open Access Journals (Sweden)

    John Barlow

    2006-05-01

    Full Text Available Computer simulations provide cost-effective methods for manipulating and modeling 'reality'. However, they are not real. They are imitations of a system or event, real or fabricated, and as such mimic, duplicate or represent that system or event. The degree to which a computer simulation aligns with and reproduces the 'reality' of the system or event it attempts to mimic or duplicate depends upon many factors, including the efficiency of the simulation algorithm, the processing power of the computer hardware used to run the simulation model, and the expertise, assumptions and prejudices of those concerned with designing, implementing and interpreting the simulation output. Computer simulations in particular are increasingly replacing physical experimentation in many disciplines, and as a consequence are used to underpin quite significant decision-making which may impact on 'innocent' third parties. In this context, this paper examines two interrelated issues: firstly, how much and what kind of information should a simulation builder be required to disclose to potential users of the simulation? Secondly, what are the implications for a decision-maker who acts on the basis of their interpretation of a simulation output without any reference to its veracity, which may in turn compromise the safety of other parties?

  14. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real-world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high-level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  15. Simulation of a small computer of the TRA-1001 type on the BESM computer

    International Nuclear Information System (INIS)

    Galaktionov, V.V.

    1975-01-01

    The purpose of, and possible approaches to, simulating one computer on another are considered. An emulator (simulation program) for a small computer of the TRA-1001 type is implemented on the BESM-6 computer. The basic elements of the simulated computer are the following: memory (8 K words), central processor, program-controlled input-output channel, interrupt circuit, and computer panel. Operation of the input-output devices (the ASP-33 and FS-1500 teletypes) is also simulated. In actual operation, the emulator has been used to translate programs prepared on punched cards with the aid of the SLANG-1 translator on the BESM-6 computer. Adaptation of the translator from the COPLAN language has also been carried out with the aid of the emulator.
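
    What any such emulator does at its core is a fetch-decode-execute loop over the simulated machine's memory. A toy illustration with an invented accumulator instruction set (not the TRA-1001's):

    ```python
    def run(memory, acc=0, pc=0):
        """Fetch-decode-execute loop over (opcode, operand) memory cells."""
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":
                acc = memory[arg][1]
            elif op == "ADD":
                acc += memory[arg][1]
            elif op == "STORE":
                memory[arg] = ("DATA", acc)
            elif op == "HALT":
                return acc

    program = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
               ("DATA", 2), ("DATA", 3), ("DATA", 0)]
    print(run(program))   # prints 5 (= 2 + 3)
    ```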

  16. A computer-simulated liver phantom (virtual liver phantom) for multidetector computed tomography evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Funama, Yoshinori [Kumamoto University, Department of Radiological Sciences, School of Health Sciences, Kumamoto (Japan); Awai, Kazuo; Nakayama, Yoshiharu; Liu, Da; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Miyazaki, Osamu; Goto, Taiga [Hitachi Medical Corporation, Tokyo (Japan); Hori, Shinichi [Gate Tower Institute of Image Guided Therapy, Osaka (Japan)

    2006-04-15

    The purpose of this study was to develop a computer-simulated liver phantom for hepatic CT studies. A computer-simulated liver phantom was mathematically constructed on a computer workstation. The computer-simulated phantom was calibrated using real CT images acquired by an actual four-detector CT. We added an inhomogeneous texture to the simulated liver by referring to CT images of chronically damaged human livers. The mean CT number of the simulated liver was 60 HU, and we added numerous 5- to 10-mm structures with 60±10 HU/mm. To mimic liver tumors we added nodules measuring 8, 10, and 12 mm in diameter with CT numbers of 60±10, 60±15, and 60±20 HU. Five radiologists visually evaluated the similarity of the texture of the computer-simulated liver phantom and a real human liver, using a five-point scale, to confirm the appropriateness of the virtual liver images. The total score was 44 for two of the radiologists, and 42, 41, and 39 for one radiologist each. They judged the textures of the virtual liver to be comparable to those of a human liver. Our computer-simulated liver phantom is a promising tool for the evaluation of the image quality and diagnostic performance of hepatic CT imaging. (orig.)
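
    The construction logic transfers directly to a few lines of NumPy: a 60 HU background, small ±10 HU texture structures, and low-contrast nodules. Grid size, pixel spacing and the circular shapes are simplified assumptions, not the paper's mathematical phantom:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    liver = np.full((256, 256), 60.0)    # background at the mean liver CT number, HU

    # Numerous small structures, roughly 5-10 mm at an assumed ~1 mm pixel spacing,
    # each shifting the local CT number by a draw from N(0, 10 HU).
    for _ in range(400):
        r = int(rng.integers(3, 6))
        cy, cx = rng.integers(r, 256 - r, size=2)
        y, x = np.ogrid[-r:r + 1, -r:r + 1]
        disk = x ** 2 + y ** 2 <= r ** 2
        liver[cy - r:cy + r + 1, cx - r:cx + r + 1][disk] += rng.normal(0.0, 10.0)

    def add_nodule(img, center, radius, contrast):
        """Low-contrast circular nodule: CT number 60 + contrast inside the disk."""
        y, x = np.ogrid[:img.shape[0], :img.shape[1]]
        img[(y - center[0]) ** 2 + (x - center[1]) ** 2 <= radius ** 2] = 60.0 + contrast

    add_nodule(liver, (80, 80), 5, -15.0)   # e.g. a 10 mm nodule at 60-15 HU
    ```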

  17. Computer Simulations of Lipid Bilayers and Proteins

    DEFF Research Database (Denmark)

    Sonne, Jacob

    2006-01-01

    The importance of computer simulations in lipid bilayer research has become more prominent for the last couple of decades and as computers get even faster, simulations will play an increasingly important part of understanding the processes that take place in and across cell membranes. This thesis...... entitled Computer simulations of lipid bilayers and proteins describes two molecular dynamics (MD) simulation studies of pure lipid bilayers as well as a study of a transmembrane protein embedded in a lipid bilayer matrix. Below follows a brief overview of the thesis. Chapter 1. This chapter is a short...... in the succeeding chapters is presented. Details on system setups, simulation parameters and other technicalities can be found in the relevant chapters. Chapter 3, DPPC lipid parameters: The quality of MD simulations is intimately dependent on the empirical potential energy function and its parameters, i...

  18. Radiotherapy Monte Carlo simulation using cloud computing technology.

    Science.gov (United States)

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
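
    The cost claim is simple arithmetic under hourly billing: n instances running for ceil(T/n) hours bill n*ceil(T/n) instance-hours, which is minimal exactly when n divides T. A quick check with placeholder numbers:

    ```python
    import math

    T = 12        # total simulation time on one machine, hours (placeholder)
    rate = 0.10   # assumed price per instance-hour, $
    for n in (3, 4, 5, 6, 8, 12):
        hours_billed = n * math.ceil(T / n)
        print(f"n={n:2d}  wall time={T / n:5.2f} h  cost=${hours_billed * rate:.2f}")
    # Factors of 12 (3, 4, 6, 12) bill exactly 12 instance-hours; n = 5 and n = 8
    # bill 15 and 16, so cost is optimal when n divides the total simulation time.
    ```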

  19. Radiotherapy Monte Carlo simulation using cloud computing technology

    International Nuclear Information System (INIS)

    Poole, C.M.; Cornelius, I.; Trapp, J.V.; Langton, C.M.

    2012-01-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.

  20. Impact of heart rate and rhythm on radiation exposure in prospectively ECG triggered computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Luecke, Christian, E-mail: neep@gmx.de [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany); Andres, Claudia; Foldyna, Borek [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany); Nagel, Hans Dieter [Wissenschaft and Technik für die Radiologie, Buchhholz i.d.N (Germany); Hoffmann, Janine; Grothoff, Matthias; Nitzsche, Stefan; Gutberlet, Matthias; Lehmkuhl, Lukas [University of Leipzig – Heart Center, Department of Diagnostic and Interventional Radiology, Strümpellstrasse 39, D-04289, Leipzig (Germany)

    2012-09-15

    Purpose: To evaluate the influence of different heart rates and arrhythmias on scanner performance, image acquisition and applied radiation exposure in prospectively ECG triggered computed tomography (pCT). Materials and methods: An ECG simulator (EKG Phantom 320, Müller and Sebastiani Elektronik GmbH, Munich, Germany) was used to generate different heart rhythms and arrhythmias: sinus rhythm (SR) at 45, 60, 75, 90 and 120/min, supraventricular arrhythmias (e.g. sinus arrhythmia, atrial fibrillation) and ventricular arrhythmias (e.g. ventricular extrasystoles), pacemaker ECGs, ST changes and technical artifacts. The analysis of the image acquisition process was performed on a 64-row multidetector CT (Brilliance, Philips Medical Systems, Cleveland, USA). A prospectively triggered scan protocol as used in routine practice was applied (120 kV; 150 mA s; 0.4 s rotation and exposure time per scan; image acquisition predominantly in end-diastole at 75% of the R-R interval, in arrhythmias with a mean heart rate above 80/min in systole at 45% of the R-R interval; FOV 25 cm). The mean dose length product (DLP) and its percentage increase from baseline (SR at 60/min) were determined. Result: Radiation exposure can increase significantly when the heart rhythm deviates from sinus rhythm. ECG changes leading to a significant DLP increase (p < 0.05) were bifocal pacemaker (61%), pacemaker dysfunction (22%), SVES (20%), ventricular salvo (20%), and atrial fibrillation (14%). Significantly (p < 0.05) prolonged scan times (>8 s) were observed for bifocal pacemaker (12.8 s), pacemaker dysfunction (10.7 s), atrial fibrillation (10.3 s) and sinus arrhythmia (9.3 s). Conclusion: In prospectively ECG triggered CT, heart rate and rhythm can provoke different types of scanner performance, which can significantly alter radiation exposure and scan time. These results might have an important implication for indication, informed consent and contrast agent injection protocols.

  1. Virtual Surgery in Congenital Heart Disease

    DEFF Research Database (Denmark)

    Sørensen, Thomas Sangild; Mosegaard, Jesper; Kislinskiy, Stefan

    2014-01-01

    Teaching, diagnosing, and planning of therapy in patients with complex structural cardiovascular heart disease require profound understanding of the three-dimensional (3D) nature of cardiovascular structures in these patients. To obtain such understanding, modern imaging modalities provide high...... In combination with the availability of virtual models of congenital heart disease (CHD), techniques for computer-based simulation of cardiac interventions have enabled early clinical exploration of the emerging concept of virtual surgery (Sorensen et al., Cardiol Young 13:451–460, 2003).

  2. Atomistic computer simulations a practical guide

    CERN Document Server

    Brazdova, Veronika

    2013-01-01

    Many books explain the theory of atomistic computer simulations; this book teaches you how to run them. This introductory "how to" title enables readers to understand, plan, run, and analyze their own independent atomistic simulations, and decide which method to use and which questions to ask in their research project. It is written in a clear and precise language, focusing on a thorough understanding of the concepts behind the equations and how these are used in the simulations. As a result, readers will learn how to design the computational model and which parameters o

  3. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.
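
    The DVFS strategy rests on the standard dynamic-power model P ≈ C·V²·f: running slower at a lower voltage can cut energy even though the job takes longer. A back-of-the-envelope sketch with invented operating points (not WorkflowSim's governors or power model):

    ```python
    # Back-of-the-envelope DVFS energy comparison. Dynamic power follows
    # P = C * V^2 * f; a fixed amount of work takes longer at lower frequency.
    C = 1e-9                                        # switched capacitance, F (invented)
    points = [(2.4, 1.2), (1.8, 1.0), (1.2, 0.9)]   # (GHz, V) operating points (invented)
    for f_ghz, volts in points:
        power = C * volts ** 2 * f_ghz * 1e9        # dynamic power, W
        runtime = 100.0 * (2.4 / f_ghz)             # work fixed at 100 s @ 2.4 GHz
        print(f"{f_ghz} GHz @ {volts} V: {power:.2f} W over {runtime:.0f} s "
              f"= {power * runtime:.0f} J")
    ```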

  4. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Iván Tomás Cotes-Ruiz

    Full Text Available Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its beginnings. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, a power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. In addition, results achieved by the intra-host DVFS strategy with different governors are compared to those of a data center using a recent and successful DVFS-based inter-host scheduling strategy as a mechanism overlapping the intra-host DVFS technique.

  5. FEL simulations using distributed computing

    NARCIS (Netherlands)

    Einstein, J.; Biedron, S.G.; Freund, H.P.; Milton, S.V.; Van Der Slot, P. J M; Bernabeu, G.

    2016-01-01

    While simulation tools are available and have been used regularly for simulating light sources, including Free-Electron Lasers, the increasing availability and lower cost of accelerated computing opens up new opportunities. This paper highlights a method for accelerating and parallelizing code...

  6. CUBESIM, Hypercube and Denelcor Hep Parallel Computer Simulation

    International Nuclear Information System (INIS)

    Dunigan, T.H.

    1988-01-01

    1 - Description of program or function: CUBESIM is a set of subroutine libraries and programs for the simulation of message-passing parallel computers and shared-memory parallel computers. Subroutines are supplied to simulate the Intel hypercube and the Denelcor HEP parallel computers. The system permits a user to develop and test parallel programs written in C or FORTRAN on a single processor. The user may alter such hypercube parameters as message startup times, packet size, and the computation-to-communication ratio. The simulation generates a trace file that can be used for debugging, performance analysis, or graphical display. 2 - Method of solution: The CUBESIM simulator is linked with the user's parallel application routines to run as a single UNIX process. The simulator library provides a small operating system to perform process and message management. 3 - Restrictions on the complexity of the problem: Up to 128 processors can be simulated with a virtual memory limit of 6 million bytes. Up to 1000 processes can be simulated.
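
    The kind of single-process simulation described can be illustrated with a toy model in which the cost of a message is startup latency plus size over bandwidth, and every send is logged to a trace. The parameter names and values below are illustrative assumptions, not CUBESIM's actual interface.

```python
STARTUP_US = 350.0     # message startup latency, microseconds (user-tunable)
BANDWIDTH_MBPS = 10.0  # link bandwidth, megabytes per second (user-tunable)

trace = []             # (arrival_time_us, src, dst, nbytes) records

def send(clock_us, src, dst, nbytes):
    """Advance the simulated clock past one message and log it to the trace."""
    transfer_us = nbytes / BANDWIDTH_MBPS  # 1 MB/s moves 1 byte per microsecond
    arrival = clock_us + STARTUP_US + transfer_us
    trace.append((arrival, src, dst, nbytes))
    return arrival

# Route a 4 KB packet along a 3-hop hypercube path 0 -> 1 -> 3 -> 7.
t = 0.0
for src, dst in [(0, 1), (1, 3), (3, 7)]:
    t = send(t, src, dst, 4096)
print(f"3-hop delivery time: {t:.1f} us")  # trace now holds one record per hop
```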

  7. Accelerator simulation using computers

    International Nuclear Information System (INIS)

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  8. Computer Simulation in Information and Communication Engineering

    CERN Multimedia

    Anton Topurov

    2005-01-01

    CSICE'05 Sofia, Bulgaria 20th - 22nd October, 2005 On behalf of the International Scientific Committee, we would like to invite you all to Sofia, the capital city of Bulgaria, to the International Conference in Computer Simulation in Information and Communication Engineering CSICE'05. The Conference is aimed at facilitating the exchange of experience in the field of computer simulation gained not only in traditional fields (Communications, Electronics, Physics...) but also in the areas of biomedical engineering, environment, industrial design, etc. The objective of the Conference is to bring together lecturers, researchers and practitioners from different countries, working in the fields of computer simulation in information engineering, in order to exchange information and bring new contributions to this important field of engineering design and education. The Conference will bring you the latest ideas and development of the tools for computer simulation directly from their inventors. Contribution describ...

  9. Computational simulation of concurrent engineering for aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    The results of an investigation to assess the available infrastructure and the technology readiness needed to develop computational simulation methods/software for concurrent engineering are summarized. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods - is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  10. Computational simulation for concurrent engineering of aerospace propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    The results of an investigation to assess the available infrastructure and the technology readiness needed to develop computational simulation methods/software for concurrent engineering are summarized. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental to developing such methods - is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  11. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  12. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques are illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling, including the optimization of models to achieve a desired design goal.
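
    As a toy illustration of the adjoint idea described above (not the authors' optical-tomography code), the sketch below differentiates a small explicit diffusion simulation with respect to its single diffusion coefficient; one backward sweep yields the gradient at roughly the cost of a second simulation, however many parameters there are. All sizes and values are assumptions.

```python
import numpy as np

# Forward model: u <- u + dt*D*(L @ u); misfit J = 0.5*||u_N - data||^2.
n, steps, dt, D = 20, 50, 0.1, 0.8
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))               # 1D Laplacian stencil

u = np.exp(-((np.arange(n) - n / 2) ** 2) / 4.0)  # initial profile
data = np.zeros(n)                                # synthetic target

states = [u.copy()]
for _ in range(steps):               # forward sweep, storing the trajectory
    u = u + dt * D * (L @ u)
    states.append(u.copy())

lam = states[-1] - data              # adjoint "final condition" dJ/du_N
grad = 0.0
for k in range(steps - 1, -1, -1):   # backward (adjoint) sweep
    grad += dt * lam @ (L @ states[k])   # explicit dependence of step k on D
    lam = lam + dt * D * (L.T @ lam)     # propagate adjoint one step back
print("dJ/dD via adjoint:", grad)
```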

  13. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a posttest examination to evaluate the effect of simulation games on students' knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games apart from traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students' outcomes by 38%. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5% better...

  14. The Australian Computational Earth Systems Simulator

    Science.gov (United States)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology, but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of universities and research institutions has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the research infrastructure required by the Australian earth systems science community for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete element/lattice solid models, particle-in-cell large-deformation finite-element methods, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global-scale dynamics and mineralisation processes, crustal-scale processes including plate tectonics, mountain building and interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic...

  15. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  16. Simulation games

    OpenAIRE

    Giddings, S.

    2013-01-01

    This chapter outlines the conventions and pleasures of simulation games as a category, and explores the complicated and contested term simulation. This concept goes to the heart of what computer games and video games are, and the ways in which they articulate ideas, processes, and phenomena between their virtual worlds and the actual world. It has been argued that simulations generate and communicate knowledge and events quite differently from the long-dominant cultural mode of narrative. Th...

  17. Automatic temperature computation for realistic IR simulation

    Science.gov (United States)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package, which accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes - such as conductivity, absorption, spectral emissivity, density, specific heat, thickness and convection coefficients - are associated with several layers and taken into account. In the future, MURET will be able to simulate permeable natural materials (water influence) and natural vegetation materials (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The great originality concerns the way the heating fluxes are computed. Using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. This library also supplies other thermal modules, such as a thermal shadow computation tool.
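
    A hypothetical sketch of the kind of one-dimensional transient conduction solve such a tool performs per polygon: explicit finite differences on a single-layer slab, driven by an absorbed incident flux on the front face and convection to air. Material values and the day/night flux profile are placeholders, not MURET data.

```python
import math

k, rho, cp = 1.0, 2000.0, 900.0      # conductivity, density, specific heat
thickness, nodes = 0.10, 11          # 10 cm slab, 11 nodes
dx = thickness / (nodes - 1)
alpha = k / (rho * cp)               # thermal diffusivity
dt = 0.4 * dx * dx / alpha           # explicit-stability time step
h, t_air = 10.0, 293.0               # convection coefficient, air temperature

T = [293.0] * nodes
steps = 5000
for step in range(steps):
    hour = (step * dt / 3600.0) % 24.0
    q_abs = max(0.0, 600.0 * math.sin(math.pi * (hour - 6.0) / 12.0))  # W/m^2
    Tn = T[:]
    for i in range(1, nodes - 1):    # interior conduction
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
    # Front face: absorbed flux + convection + conduction over a half-cell.
    Tn[0] = T[0] + dt / (rho * cp * dx / 2) * (
        q_abs + h * (t_air - T[0]) + k * (T[1] - T[0]) / dx)
    Tn[-1] = Tn[-2]                  # insulated back face
    T = Tn
print(f"front-face temperature after {steps * dt / 3600:.0f} h: {T[0]:.1f} K")
```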

  18. A Multiscale Closed-Loop Cardiovascular Model, with Applications to Heart Pacing and Hemorrhage

    Science.gov (United States)

    Canuto, Daniel; Eldredge, Jeff; Chong, Kwitae; Benharash, Peyman; Dutson, Erik

    2017-11-01

    A computational tool is developed for simulating the dynamic response of the human cardiovascular system to various stressors and injuries. The tool couples zero-dimensional models of the heart, pulmonary vasculature, and peripheral vasculature to one-dimensional models of the major systemic arteries. To simulate autonomic response, this multiscale circulatory model is integrated with a feedback model of the baroreflex, allowing control of heart rate, cardiac contractility, and peripheral impedance. The performance of the tool is demonstrated in two scenarios: increasing heart rate by stimulating the sympathetic nervous system, and an acute 10 percent hemorrhage from the left femoral artery.
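
    A hedged sketch of one zero-dimensional building block of the kind such tools couple together: a two-element Windkessel, dP/dt = (Q_in - P/R)/C, driven here by a half-sine systolic inflow. R, C and the inflow shape are textbook-style assumptions, not the authors' fitted values.

```python
import math

R = 1.0                      # peripheral resistance, mmHg*s/mL
C = 1.5                      # arterial compliance, mL/mmHg
T, T_sys = 0.8, 0.3          # cardiac period and systolic duration, s
stroke_volume = 70.0         # mL ejected per beat

def q_in(t):
    """Half-sine aortic inflow delivering one stroke volume per beat."""
    tau = t % T
    if tau >= T_sys:
        return 0.0
    return stroke_volume * math.pi / (2 * T_sys) * math.sin(math.pi * tau / T_sys)

P, dt = 80.0, 1e-4
p_min, p_max = float("inf"), float("-inf")
for step in range(int(10 * T / dt)):      # run 10 beats toward steady state
    P += dt * (q_in(step * dt) - P / R) / C
    if step * dt > 8 * T:                 # record only the settled beats
        p_min, p_max = min(p_min, P), max(p_max, P)
print(f"diastolic ~{p_min:.0f} mmHg, systolic ~{p_max:.0f} mmHg")
```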

  19. Discrete Event Simulation: Computers can be used to simulate the ...

    Indian Academy of Sciences (India)

    IAS Admin

    people who use computers every moment of their waking lives, others even ... How is discrete event simulation different from other kinds of simulation? ... time, energy consumption .... Schedule the CustomerDeparture event for this customer.
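
    The fragments above mention scheduling a CustomerDeparture event, which is the heart of the event-scheduling world view: a future-event list ordered by time, from which events are popped and processed. A bare-bones single-server queue in that style (rates and seed are arbitrary assumptions) looks like this:

```python
import heapq, random

random.seed(1)
ARRIVAL_RATE, SERVICE_RATE = 1.0, 1.25   # customers per unit time
fel = [(random.expovariate(ARRIVAL_RATE), "arrival")]  # future-event list
queue_len, busy, served, clock = 0, False, 0, 0.0

while served < 1000:
    clock, kind = heapq.heappop(fel)     # next event in time order
    if kind == "arrival":
        heapq.heappush(fel, (clock + random.expovariate(ARRIVAL_RATE), "arrival"))
        if busy:
            queue_len += 1
        else:  # server free: schedule the CustomerDeparture event for this customer
            busy = True
            heapq.heappush(fel, (clock + random.expovariate(SERVICE_RATE), "departure"))
    else:      # CustomerDeparture
        served += 1
        if queue_len:                    # next customer enters service at once
            queue_len -= 1
            heapq.heappush(fel, (clock + random.expovariate(SERVICE_RATE), "departure"))
        else:
            busy = False
print(f"served {served} customers by simulated time {clock:.1f}")
```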

  20. Utility of screening computed tomography of chest, abdomen and pelvis in patients after heart transplantation

    International Nuclear Information System (INIS)

    Dasari, Tarun W.; Pavlovic-Surjancev, Biljana; Dusek, Linda; Patel, Nilamkumar; Heroux, Alain L.

    2011-01-01

    Introduction: Malignancy is a late cause of mortality in heart transplant recipients. It is unknown whether screening computed tomography scans would lead to early detection of such malignancies or serious vascular anomalies after heart transplantation. Methods: This is a single-center observational study of patients undergoing surveillance computed tomography of the chest, abdomen and pelvis at least 5 years after transplantation. Abnormal findings included pulmonary nodules, lymphadenopathy, intra-thoracic and intra-abdominal masses, and vascular anomalies such as abdominal aortic aneurysm. The clinical follow-up of each of these major abnormal findings is summarized. Results: A total of 63 patients underwent computed tomography scans of the chest, abdomen and pelvis at least 5 years after transplantation. Of these, 54 (86%) were male and 9 (14%) were female. Mean age was 52 ± 9.2 years. Computed tomography revealed only 1 lung cancer (squamous cell). Nonspecific pulmonary nodules were seen in 6 patients (9.5%). The most common incidental finding was abdominal aortic aneurysm (N = 6 (9.5%)), which necessitated follow-up computed tomography (N = 5) or surgery (N = 1). Mean time from transplantation to detection of abdominal aortic aneurysm was 14.6 ± 4.2 years. Mean age at the time of detection of abdominal aortic aneurysm was 74.5 ± 3.2 years. Conclusion: Screening computed tomography in patients 5 years after transplantation revealed only one malignancy but led to increased detection of abdominal aortic aneurysms. Thus its utility is low in terms of detection of malignancy. Based on this study we do not recommend routine computed tomography after heart transplantation.

  1. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation changes can be made and processes perfected before they are implemented.

  2. Quantum chemistry simulation on quantum computers: theories and experiments.

    Science.gov (United States)

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with only polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations on a small quantum computer, which include the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although the experimental development still lags behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry beyond classical computation.

  3. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In the report the role and purposes of computer simulation in the development of nuclear technologies are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of simulators for operating personnel training. For these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  4. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas-based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. Such long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This can result in inefficient resource utilization and an increase in the turnaround time of the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of live data as it is produced by the simulation.

  5. Highway traffic simulation on multi-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e. a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and roadside infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety, the so-called Time To Collision (TTC) parameter is recorded.
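
    A minimal sketch of the TTC safety measure named above: the bumper-to-bumper gap divided by the closing speed, defined only while the follower is faster than the leader. The function name and values are illustrative assumptions.

```python
def time_to_collision(gap_m, v_follower_mps, v_leader_mps):
    """Return TTC in seconds, or None if the vehicles are not closing."""
    closing = v_follower_mps - v_leader_mps
    if closing <= 0:
        return None        # gap is constant or opening: no collision predicted
    return gap_m / closing

print(time_to_collision(30.0, 30.0, 25.0))   # 6.0 s
print(time_to_collision(30.0, 25.0, 30.0))   # None: gap is opening
```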

  6. Cardiac re-entry dynamics and self-termination in DT-MRI based model of Human Foetal Heart

    Science.gov (United States)

    Biktasheva, Irina V.; Anderson, Richard A.; Holden, Arun V.; Pervolaraki, Eleftheria; Wen, Fen Cai

    2018-02-01

    The effect of human foetal heart geometry and anisotropy on anatomy-induced drift and self-termination of cardiac re-entry is studied here in MRI-based 2D slice and 3D whole-heart computer simulations. Isotropic and anisotropic models of a human foetal heart at 20 weeks of gestational age, obtained from 100 μm voxel diffusion tensor MRI (DT-MRI) data sets, were used in the computer simulations. The fiber orientation angles of the heart were obtained from the orientation of the DT-MRI primary eigenvectors. In a spatially homogeneous electrophysiological monodomain model with the DT-MRI-based heart geometries, cardiac re-entry was initiated at a prescribed location in a 2D slice and in the 3D whole-heart anatomy models. Excitation was described by simplified FitzHugh-Nagumo kinetics. In a slice of the heart, with propagation velocity twice as fast along the fibers as across them, DT-MRI-based fiber anisotropy changes the re-entry dynamics from pinned to anatomical re-entry. In the 3D whole-heart models, the fiber anisotropy changes cardiac re-entry dynamics from persistent re-entry to re-entry self-termination. The self-termination time depends on the re-entry's initial position. In all the simulations with the DT-MRI-based cardiac geometry, the anisotropy of the myocardial tissue shortens the time to re-entry self-termination several fold. The numerical simulations depend on the validity of the DT-MRI data set used. The ventricular wall showed the characteristic transmural rotation of the helix angle of the developed mammalian heart, while the fiber orientation in the atria was irregular.
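
    A much-simplified 2D analogue of the ingredients named above (an assumption-laden sketch, not the authors' model): FitzHugh-Nagumo kinetics with diffusion faster along x ("fibers") than along y. Because front speed scales with the square root of diffusivity, the 2:1 velocity ratio corresponds to a 4:1 diffusivity ratio. All parameter values are generic.

```python
import numpy as np

n, steps = 100, 2000
dt, dx = 0.02, 1.0
Dx, Dy = 1.0, 0.25                    # along-fiber vs cross-fiber diffusivity
a, b, eps = 0.1, 0.5, 0.01            # simplified FitzHugh-Nagumo parameters

u = np.zeros((n, n))                  # excitation variable
v = np.zeros((n, n))                  # recovery variable
u[:, :10] = 1.0                       # plane wave initiated at the left edge

for _ in range(steps):
    # Anisotropic Laplacian with periodic boundaries (np.roll).
    lap = (Dx * (np.roll(u, 1, 1) - 2*u + np.roll(u, -1, 1))
         + Dy * (np.roll(u, 1, 0) - 2*u + np.roll(u, -1, 0))) / dx**2
    du = u * (1 - u) * (u - a) - v + lap
    dv = eps * (b * u - v)
    u, v = u + dt * du, v + dt * dv

print("activated fraction:", float((u > 0.5).mean()))
```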

  7. Alternative energy technologies an introduction with computer simulations

    CERN Document Server

    Buxton, Gavin

    2014-01-01

    Contents include: Introduction to Alternative Energy Sources; Global Warming; Pollution; Solar Cells; Wind Power; Biofuels; Hydrogen Production and Fuel Cells; Introduction to Computer Modeling; Brief History of Computer Simulations; Motivation and Applications of Computer Models; Using Spreadsheets for Simulations; Typing Equations into Spreadsheets; Functions Available in Spreadsheets; Random Numbers; Plotting Data; Macros and Scripts; Interpolation and Extrapolation; Numerical Integration and Diffe...

  8. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  9. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, VV.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In the report, the role and purpose of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as: (a) nuclear safety research; (b) optimization of technical and economic parameters of operating nuclear plants; (c) planning and support of reactor experiments; (d) research and design of new devices and technologies; (e) design and development of simulators for operating personnel training. For these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage dose accumulation for materials under irradiation; and simulation of reactor control structures. (authors)

  10. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D of computational science in JAEA (Japan Atomic Energy Agency) is described, covering the computing environment, the R and D system in CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, and the development of computer technologies. Several examples of simulation research are presented: the 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for the development of a steam generator, simulation research on fusion energy techniques, development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  11. Polymer Composites Corrosive Degradation: A Computational Simulation

    Science.gov (United States)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of the corrosive durability of polymer composites is presented. The corrosive environment is assumed to govern the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture, which vary parabolically (voids) and linearly (temperature and moisture) through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined-stress-failure and laminate theories. This allows the simulation to start from constitutive material properties and proceed up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  12. Composite self-expanding bioresorbable prototype stents with reinforced compression performance for congenital heart disease application: Computational and experimental investigation.

    Science.gov (United States)

    Zhao, Fan; Xue, Wen; Wang, Fujun; Liu, Laijun; Shi, Haoqin; Wang, Lu

    2018-08-01

    Stents are vital devices for treating vascular stenosis in pediatric patients with congenital heart disease. Bioresorbable stents (BRSs) have been applied to reduce the challenging complications caused by permanent metal stents. However, there remains an almost total lack of BRSs with satisfactory compression performance specifically for children with congenital heart disease, leading to markedly suboptimal outcomes. In this work, composite bioresorbable prototype stents with superior compression resistance were designed by braiding and annealing technology, incorporating poly(p-dioxanone) (PPDO) monofilaments and polycaprolactone (PCL) multifilament. The compression properties of the stent prototypes were investigated. The results revealed that the novel composite prototype stents showed superior compression force compared to the control ones, as well as recovery ability. Furthermore, the deformation mechanisms were analyzed by computational simulation, which revealed that bonded interlacing points among the yarns play an important role. This research has important clinical implications for bioresorbable stent manufacture and provides further studies with an innovative stent design. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. A kinematic approach for efficient and robust simulation of the cardiac beating motion.

    Directory of Open Access Journals (Sweden)

    Takashi Ijiri

    Full Text Available Computer simulation techniques for cardiac beating motions potentially have many applications and a broad audience. However, most existing methods require enormous computational costs and often show unstable behavior for extreme parameter sets, which interrupts smooth simulation study and makes it difficult to apply them to interactive applications. To address this issue, we present an efficient and robust framework for simulating the cardiac beating motion. The global cardiac motion is generated by the accumulation of local myocardial fiber contractions. We compute such local-to-global deformations using a kinematic approach; we divide a heart mesh model into overlapping local regions, contract them independently according to fiber orientation, and compute a global shape that satisfies the contracted shapes of all local regions as much as possible. A comparison between our method and a physics-based method showed that our method can generate motion very close to that of a physics-based simulation. Our kinematic method is highly controllable; the simulated ventricular-wall-contraction speed can easily be adjusted to that of a real heart by controlling the local contraction timing. We demonstrate that our method achieves a highly realistic beating motion of a whole heart in real time on a consumer-level computer. Our method provides an important step toward bridging the gap between cardiac simulations and interactive applications.
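
    A toy 1D illustration of the local-to-global idea (an assumption, not the paper's implementation): overlapping regions of a fiber are contracted independently about their centroids, and each vertex takes the average of the positions proposed by every region containing it. A single averaging pass under-contracts; the paper instead solves a global best fit.

```python
import numpy as np

x = np.linspace(0.0, 10.0, 11)                      # rest positions, 11 vertices
regions = [np.arange(i, i + 4) for i in range(8)]   # overlapping 4-vertex windows
contraction = 0.8                                   # local fiber shortening factor

proposals = np.zeros_like(x)
counts = np.zeros_like(x)
for idx in regions:
    centroid = x[idx].mean()
    # Each region proposes positions contracted about its own centroid.
    proposals[idx] += centroid + contraction * (x[idx] - centroid)
    counts[idx] += 1
x_def = proposals / counts                          # blended global shape

ratio = (x_def[-1] - x_def[0]) / (x[-1] - x[0])
print(f"overall shortening factor: {ratio:.2f}")    # ~0.94 after one pass
```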

  14. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer-aided materials design is of increasing interest because the conventional approach, which relies solely on experimentation, is no longer viable within the constraints of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus in accelerating their development. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering, to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain the accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  15. Fluid simulation for computer graphics

    CERN Document Server

    Bridson, Robert

    2008-01-01

    Animating fluids like water, smoke, and fire using physics-based simulation is increasingly important in visual effects, in particular in movies, like The Day After Tomorrow, and in computer games. This book provides a practical introduction to fluid simulation for graphics. The focus is on animating fully three-dimensional incompressible flow, from understanding the math and the algorithms to the actual implementation.

  16. Construction of 3D MR image-based computer models of pathologic hearts, augmented with histology and optical fluorescence imaging to characterize action potential propagation.

    Science.gov (United States)

    Pop, Mihaela; Sermesant, Maxime; Liu, Garry; Relan, Jatin; Mansi, Tommaso; Soong, Alan; Peyrat, Jean-Marc; Truong, Michael V; Fefer, Paul; McVeigh, Elliot R; Delingette, Herve; Dick, Alexander J; Ayache, Nicholas; Wright, Graham A

    2012-02-01

    Cardiac computer models can help us understand and predict the propagation of excitation waves (i.e., action potentials, APs) in healthy and pathologic hearts. Our broad aim is to develop accurate 3D MR image-based computer models of electrophysiology in large hearts (translatable to clinical applications) and to validate them experimentally. The specific goals of this paper were to match models with maps of the propagation of the optical AP on the epicardial surface using large porcine hearts with scars, estimating several parameters relevant to macroscopic reaction-diffusion electrophysiological models. We used voltage-sensitive dyes to image APs in large porcine hearts with scars (three specimens had chronic myocardial infarcts, and three had radiofrequency (RF) acute scars). We first analyzed the main AP wave characteristics: duration (APD) and propagation under controlled pacing locations and frequencies, as recorded from 2D optical images. We further built 3D MR image-based computer models that incorporate information derived from the optical measurements as well as morphologic MRI data (i.e., myocardial anatomy, fiber directions and scar definition). The scar morphology from MR images was validated against corresponding whole-mount histology. We also compared the measured 3D isochronal maps of depolarization to simulated isochrones (the latter replicating precisely the experimental conditions), performing model customization and 3D volumetric adjustments of the local conductivity. Our results demonstrated that mean APD in the border zone (BZ) of the infarct scars was reduced by ~13% (compared to ~318 ms measured in the normal zone, NZ), but APD did not change significantly in the thin BZ of the ablation scars. A generic value for the velocity ratio (1:2.7) in healthy myocardial tissue was derived from measured values of transverse and longitudinal conduction velocities relative to the fiber direction (22 cm/s and 60 cm/s, respectively). The model customization and 3D volumetric...

  17. Large-scale simulations of error-prone quantum computation devices

    International Nuclear Information System (INIS)

    Trieu, Doan Binh

    2009-01-01

    The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. However, in practice, all quantum computation devices do suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), which simulates a generic quantum computer on the gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than corrected. Fault-tolerant methods can overcome this problem, provided that the single-qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2±0.2) x 10^-6. For Gaussian-distributed operational over-rotations the threshold lies at a standard deviation of 0.0431±0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced technology, i...

  18. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for the computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core 2 Quad Q6600 CPU and a GeForce 8800 GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
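
    A schematic of load-prediction dynamic scheduling in the spirit described above (illustrative assumptions only, not the authors' code): each worker's measured throughput on past chunks predicts how the next chunk is split between the CPU cores and the GPU.

```python
throughput = {"cpu": 1.0, "gpu": 1.0}   # work units per second, initial guess

def run_chunk(worker, amount):
    """Stand-in for a real kernel launch; returns elapsed seconds."""
    speed = 4.0 if worker == "cpu" else 16.0   # pretend hardware speeds
    return amount / speed

remaining = 1600
while remaining > 0:
    chunk = min(remaining, 200)
    total = throughput["cpu"] + throughput["gpu"]
    # Split the chunk in proportion to each worker's predicted throughput.
    shares = {w: chunk * throughput[w] / total for w in throughput}
    for w, amount in shares.items():
        elapsed = run_chunk(w, amount)
        if elapsed > 0:   # exponential smoothing of the throughput estimate
            throughput[w] = 0.5 * throughput[w] + 0.5 * (amount / elapsed)
    remaining -= chunk
print(f"final split estimate: {throughput}")   # converges toward 4:16 = 1:4
```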

  19. Sophistication of computational science and fundamental physics simulations

    International Nuclear Information System (INIS)

    Ishiguro, Seiji; Ito, Atsushi; Usami, Shunsuke; Ohtani, Hiroaki; Sakagami, Hitoshi; Toida, Mieko; Hasegawa, Hiroki; Horiuchi, Ritoku; Miura, Hideaki

    2016-01-01

    The numerical experimental reactor research project comprises the following studies: (1) nuclear fusion simulation research focusing on specific physical phenomena in specific devices; (2) research on advanced simulation methods to increase predictability or expand the application range of simulation; (3) visualization as the foundation of simulation research; (4) research on advanced computational science, such as parallel computing technology; and (5) research aimed at elucidating fundamental physical phenomena not limited to specific devices. Specifically, a wide range of research with medium- to long-term perspectives is being developed: (1) virtual reality visualization; (2) upgrading of computational science, such as multilayer simulation methods; (3) kinetic behavior of plasma blobs; (4) extended MHD theory and simulation; (5) basic plasma processes, such as particle acceleration due to wave-particle interaction; and (6) research related to laser plasma fusion. This paper reviews the following items: (1) simultaneous visualization in virtual reality space; (2) multilayer simulation of collisionless magnetic reconnection; (3) simulation of the microscopic dynamics of plasma coherent structures; (4) Hall MHD simulation of the LHD; (5) numerical analysis for the extension of MHD equilibrium and stability theory; (6) extended MHD simulation of 2D RT instability; (7) simulation of laser plasma; (8) simulation of shock waves and particle acceleration; and (9) a study on the simulation of homogeneous isotropic MHD turbulent flow. (A.O.)

  20. Computer Simulation Performed for Columbia Project Cooling System

    Science.gov (United States)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium processors). The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general air flow analysis capability for any modern computer room.

  1. A Review of Freely Available Quantum Computer Simulation Software

    OpenAIRE

    Brandhorst-Satzkorn, Johan

    2012-01-01

    A study has been made of a few different freely available Quantum Computer simulators. All the simulators tested are available online on their respective websites. A number of tests have been performed to compare the different simulators against each other. Some untested simulators of various programming languages are included to show the diversity of the quantum computer simulator applications. The conclusion of the review is that LibQuantum is the best of the simulators tested because of ea...

  2. Computer simulation of liquid crystals

    International Nuclear Information System (INIS)

    McBride, C.

    1999-01-01

    Molecular dynamics simulation performed on modern computer workstations provides a powerful tool for the investigation of the static and dynamic characteristics of liquid crystal phases. In this thesis molecular dynamics computer simulations have been performed for two model systems. Simulations of 4,4'-di-n-pentyl-bibicyclo[2.2.2]octane demonstrate the growth of a structurally ordered phase directly from an isotropic fluid. This is the first time that this has been achieved for an atomistic model. The results demonstrate a strong coupling between orientational ordering and molecular shape, but indicate that the coupling between molecular conformational changes and molecular reorientation is relatively weak. Simulations have also been performed for a hybrid Gay-Berne/Lennard-Jones model resulting in thermodynamically stable nematic and smectic phases. Frank elastic constants have been calculated for the nematic phase formed by the hybrid model through analysis of the fluctuations of the nematic director, giving results comparable with those found experimentally. Work presented in this thesis also describes the parameterization of the torsional potential of a fragment of a dimethyl siloxane polymer chain, disiloxane diol (HOMe2Si)2O, using ab initio quantum mechanical calculations. (author)

  3. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, M.; Esch, M. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing the effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data for the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to discrepancies in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are found for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation is to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

  4. Computer security simulation

    International Nuclear Information System (INIS)

    Schelonka, E.P.

    1979-01-01

    Development and application of a series of simulation codes used for computer security analysis and design are described. Boolean relationships for arrays of barriers within functional modules are used to generate composite effectiveness indices. The general case of multiple layers of protection with any specified barrier survival criteria is given. Generalized reduction algorithms provide numerical security indices in selected subcategories and for the system as a whole. 9 figures, 11 tables

  5. Epicardial adipose tissue volume estimation by postmortem computed tomography of eviscerated hearts

    DEFF Research Database (Denmark)

    Hindsø, Louise; Jakobsen, Lykke S; Jacobsen, Christina

    2017-01-01

    Epicardial adipose tissue (EAT) may play a role in the development of coronary artery disease. The purpose of this study was to evaluate a method based on postmortem computed tomography (PMCT) for the estimation of EAT volume. We PMCT-scanned the eviscerated hearts of 144 deceased individuals, wh...

  6. Understanding Islamist political violence through computational social simulation

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, Jennifer H [Los Alamos National Laboratory; Mackerrow, Edward P [Los Alamos National Laboratory; Patelli, Paolo G [Los Alamos National Laboratory; Eberhardt, Ariane [Los Alamos National Laboratory; Stradling, Seth G [Los Alamos National Laboratory

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  7. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  8. REACTOR: a computer simulation for schools

    International Nuclear Information System (INIS)

    Squires, D.

    1985-01-01

    The paper concerns computer simulation of the operation of a nuclear reactor, for use in schools. The project was commissioned by UKAEA, and carried out by the Computers in the Curriculum Project, Chelsea College. The program, for an advanced gas cooled reactor, is briefly described. (U.K.)

  9. Learning and instruction with computer simulations

    NARCIS (Netherlands)

    de Jong, Anthonius J.M.

    1991-01-01

    The present volume presents the results of an inventory of elements of such a computer learning environment. This inventory was conducted within a DELTA project called SIMULATE. In the project a learning environment that provides intelligent support to learners and that has a simulation as its

  10. Computer simulation on molten ionic salts

    International Nuclear Information System (INIS)

    Kawamura, K.; Okada, I.

    1978-01-01

    The extensive advances in computer technology have made it possible to apply computer simulation to the evaluation of the macroscopic and microscopic properties of molten salts. The evaluation of the potential energy in molten salt systems is complicated by the presence of long-range energy, i.e. Coulomb energy, in contrast to simple liquids where the potential energy is easily evaluated. It has been shown, however, that no difficulties are encountered when the Ewald method is applied to the evaluation of the Coulomb energy. After a number of attempts to approximate the pair potential, the Huggins-Mayer potential, based on ionic crystals, became the most often employed. Since the only appreciable contribution to the many-body potential not included in the Huggins-Mayer potential is thought to arise from the internal electrostatic polarization of ions in molten ionic salts, computer simulation with a provision for ion polarization has been tried recently. The computations, which are employed mainly for molten alkali halides, can provide: (1) thermodynamic data such as internal energy, internal pressure and isothermal compressibility; (2) microscopic configurational data such as radial distribution functions; (3) transport data such as the diffusion coefficient and electrical conductivity; and (4) spectroscopic data such as the intensity of inelastic scattering and the stretching frequency of simple molecules. The computed results seem to agree well with the measured results. Computer simulation can also be used to test the effectiveness of a proposed pair potential and the adequacy of postulated models of molten salts, and to obtain experimentally inaccessible data. A further application of MD computation, employing the pair potential based on an ionic model, to BeF2, ZnCl2 and SiO2 shows the possibility of quantitative interpretation of structures and glass transformation phenomena.
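
    A sketch of the Huggins-Mayer (Born-Mayer-Huggins) pair energy mentioned above, evaluated with the minimum-image convention. For brevity the long-range Coulomb term is summed naively here; as the abstract notes, a real molten-salt simulation would treat it with the Ewald method. All parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pairs, box = 32, 12.0                      # 32 anion-cation pairs in a 12 A box
pos = rng.uniform(0, box, size=(2 * n_pairs, 3))
q = np.array([1.0, -1.0] * n_pairs)          # alternating unit charges
A, rho, C = 1500.0, 0.32, 60.0               # repulsion / dispersion constants

energy = 0.0
for i in range(len(pos)):
    for j in range(i + 1, len(pos)):
        d = pos[i] - pos[j]
        d -= box * np.round(d / box)         # minimum-image convention
        r = np.linalg.norm(d)
        energy += (q[i] * q[j] / r           # Coulomb (Ewald in real codes)
                   + A * np.exp(-r / rho)    # Born repulsion
                   - C / r**6)               # van der Waals attraction
print(f"total potential energy: {energy:.1f} (arbitrary units)")
```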

  11. Contribution of the Arterial System and the Heart to Blood Pressure during Normal Aging - A Simulation Study.

    Science.gov (United States)

    Maksuti, Elira; Westerhof, Nico; Westerhof, Berend E; Broomé, Michael; Stergiopulos, Nikos

    2016-01-01

    During aging, systolic blood pressure continuously increases over time, whereas diastolic pressure first increases and then slightly decreases after middle age. These pressure changes are usually explained by changes of the arterial system alone (increase in arterial stiffness and vascular resistance). However, we hypothesise that the heart contributes to the age-related blood pressure progression as well. In the present study we quantified the blood pressure changes in normal aging by using a Windkessel model for the arterial system and the time-varying elastance model for the heart, and compared the simulation results with data from the Framingham Heart Study. Parameters representing arterial changes (resistance and stiffness) during aging were based on literature values, whereas parameters representing cardiac changes were computed through physiological rules (compensated hypertrophy and preservation of end-diastolic volume). When taking into account arterial changes only, the systolic and diastolic pressure did not agree well with the population data. Between 20 and 80 years, systolic pressure increased from 100 to 122 mmHg, and diastolic pressure decreased from 76 to 55 mmHg. When taking cardiac adaptations into account as well, systolic and diastolic pressure increased from 100 to 151 mmHg and decreased from 76 to 69 mmHg, respectively. Our results show that not only the arterial system, but also the heart, contributes to the changes in blood pressure during aging. The changes in arterial properties initiate a systolic pressure increase, which in turn initiates a cardiac remodelling process that further augments systolic pressure and mitigates the decrease in diastolic pressure.
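    The arterial side of such a model can be illustrated with a minimal two-element Windkessel. The sketch below integrates C dp/dt = Q_in(t) - p/R with a half-sine systolic inflow; it omits the time-varying elastance heart used in the study, and the parameter values are typical textbook numbers assumed for illustration.

```python
import numpy as np

# Hedged sketch: two-element Windkessel (resistance R, compliance C)
# driven by a half-sine systolic inflow, integrated with forward Euler.
# Parameter values are typical textbook numbers, assumed for illustration.

R = 1.0    # peripheral resistance (mmHg*s/mL)
C = 1.5    # arterial compliance (mL/mmHg)
T = 0.8    # cardiac period (s)
TS = 0.3   # systolic ejection time (s)
SV = 70.0  # stroke volume (mL)

def inflow(t):
    """Half-sine aortic inflow whose area per beat equals the stroke volume."""
    tc = t % T
    return (np.pi * SV / (2 * TS)) * np.sin(np.pi * tc / TS) if tc < TS else 0.0

dt = 1e-4
steps = int(10 * T / dt)  # simulate 10 beats to reach a steady state
p = 80.0                  # initial pressure (mmHg)
p_trace = np.empty(steps)
for i in range(steps):
    # Windkessel ODE: C dp/dt = Q_in(t) - p/R
    p += dt * (inflow(i * dt) - p / R) / C
    p_trace[i] = p

last_beat = p_trace[-int(T / dt):]
print(f"systolic ~ {last_beat.max():.0f} mmHg, diastolic ~ {last_beat.min():.0f} mmHg")
```

    Stiffening the artery (lowering C) raises the simulated pulse pressure, which is the qualitative arterial effect the study builds on before adding cardiac remodelling.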

  12. New Pedagogies on Teaching Science with Computer Simulations

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  13. Effectiveness of simulator-based echocardiography training of noncardiologists in congenital heart diseases.

    Science.gov (United States)

    Wagner, Robert; Razek, Vit; Gräfe, Florentine; Berlage, Thomas; Janoušek, Jan; Daehnert, Ingo; Weidenbach, Michael

    2013-07-01

    Congenital heart diseases (CHD) are responsible for substantial morbidity and mortality in neonates, and the preliminary diagnosis is often made by noncardiologists. There is therefore a large demand for training in echocardiography of CHD, which is difficult to meet given the limited resources of specialized centers. The goal of this study was to investigate the effect of training on the echocardiography simulator EchoCom on trainees' ability to diagnose CHD. We enrolled 10 residents for simulator-based training in echocardiography of CHD. All participants were instructed in the simulator's basic handling and had one hour to scan the first 9 datasets (ventricular septal defect, atrial septal defect, atrioventricular septal defect, tetralogy of Fallot, transposition of the great arteries, congenitally corrected transposition of the great arteries, common arterial trunk, hypoplastic left heart syndrome, normal anatomy) and establish a diagnosis. No help was given except for support with simulator-related issues. Afterward, two rounds of structured simulator-based echocardiography training followed, focusing on echocardiographic anatomy, spatial orientation, standard views, and the echocardiographic anatomy of different CHD. All participants completed a standardized questionnaire containing 10 multiple-choice (MC) questions on basic theoretical knowledge of echocardiographic anatomy and common CHD. Almost all of the residents, invited from the affiliated children's hospital, had little (20%) or no (80%) experience in echocardiography of CHD. Their pretest and posttest scores showed significant improvement for both the MC test and the performance test. Our study showed that simulator-based training in echocardiography of CHD can be very effective and may assist with training outside specialized centers. © 2013, Wiley Periodicals, Inc.

  14. Large-scale simulations of error-prone quantum computation devices

    Energy Technology Data Exchange (ETDEWEB)

    Trieu, Doan Binh

    2009-07-01

    The theoretical concepts of quantum computation in the idealized and undisturbed case are well understood. In practice, however, all quantum computation devices suffer from decoherence effects as well as from operational imprecisions. This work assesses the power of error-prone quantum computation devices using large-scale numerical simulations on parallel supercomputers. We present the Juelich Massively Parallel Ideal Quantum Computer Simulator (JUMPIQCS), which simulates a generic quantum computer at gate level. It comprises an error model for decoherence and operational errors. The robustness of various algorithms in the presence of noise has been analyzed. The simulation results show that for large system sizes and long computations it is imperative to actively correct errors by means of quantum error correction. We implemented the 5-, 7-, and 9-qubit quantum error correction codes. Our simulations confirm that using error-prone correction circuits with non-fault-tolerant quantum error correction will always fail, because more errors are introduced than corrected. Fault-tolerant methods can overcome this problem, provided that the single-qubit error rate is below a certain threshold. We incorporated fault-tolerant quantum error correction techniques into JUMPIQCS using Steane's 7-qubit code and determined this threshold numerically. Using the depolarizing channel as the source of decoherence, we find a threshold error rate of (5.2 ± 0.2) × 10^-6. For Gaussian distributed operational over-rotations the threshold lies at a standard deviation of 0.0431 ± 0.0002. We can conclude that quantum error correction is especially well suited for the correction of operational imprecisions and systematic over-rotations. For realistic simulations of specific quantum computation devices we need to extend the generic model to dynamic simulations, i.e. time-dependent Hamiltonian simulations of realistic hardware models. We focus on today's most advanced...
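    The depolarizing channel named above is straightforward to sample. The following toy Monte Carlo sketch draws Pauli errors at an assumed per-gate rate; it is not the JUMPIQCS simulator, only an illustration of the error model.

```python
import random

# Hedged sketch: Monte Carlo sampling of a single-qubit depolarizing
# channel. The error rate is illustrative, not the study's threshold.

def sample_depolarizing(p):
    """Return 'I' with probability 1-p, else X/Y/Z each with probability p/3."""
    if random.random() < 1.0 - p:
        return 'I'
    return random.choice(['X', 'Y', 'Z'])

p = 1e-3
n_gates = 100_000
errors = sum(sample_depolarizing(p) != 'I' for _ in range(n_gates))
print(f"observed error fraction: {errors / n_gates:.4f} (expected {p})")
```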

  15. Interoceanic canal excavation scheduling via computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Baldonado, Orlino C [Holmes and Narver, Inc., Los Angeles, CA (United States)

    1970-05-15

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)

  16. Interoceanic canal excavation scheduling via computer simulation

    International Nuclear Information System (INIS)

    Baldonado, Orlino C.

    1970-01-01

    The computer simulation language GPSS/360 was used to simulate the schedule of several nuclear detonation programs for the interoceanic canal project. The effects of using different weather restriction categories due to air blast and fallout were investigated. The effect of increasing the number of emplacement and stemming crews and the effect of varying the reentry period after detonating a row charge or salvo were also studied. Detonation programs were simulated for the proposed Routes 17A and 25E. The study demonstrates the method of using computer simulation so that a schedule and its associated constraints can be assessed for feasibility. Since many simulation runs can be made for a given set of detonation program constraints, one readily obtains an average schedule for a range of conditions. This provides a method for analyzing time-sensitive operations so that time and cost-effective operational schedules can be established. A comparison of the simulated schedules with those that were published shows them to be similar. (author)
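    The flavor of such a GPSS-style discrete-event study can be conveyed with a toy schedule simulator: each shot waits for a permissible weather day and is then followed by a fixed reentry hold. The probability and durations below are invented for illustration and are not the study's parameters.

```python
import random

# Hedged sketch: a GPSS-like discrete-event view of a detonation schedule.
# All probabilities and durations are invented placeholders.

random.seed(1)
P_GOOD_WEATHER = 0.4  # chance a day satisfies blast/fallout limits (assumed)
REENTRY_DAYS = 3      # hold after each detonation before crews reenter (assumed)
N_SHOTS = 20

day = 0
for shot in range(N_SHOTS):
    while random.random() > P_GOOD_WEATHER:  # wait for a permissible day
        day += 1
    day += 1 + REENTRY_DAYS                  # fire, then hold for reentry
print(f"{N_SHOTS} shots completed in {day} days")
```

    Repeating the run over many random seeds yields an average schedule length, mirroring how many simulation runs were made per set of constraints to assess feasibility.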

  17. Biomes computed from simulated climatologies

    Energy Technology Data Exchange (ETDEWEB)

    Claussen, W.; Esch, M.

    1992-09-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in comprehensively diagnosing the performance of a climate model and assessing the effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data for the present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as in summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential north-east shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favorable for the existence of certain biomes, not as a prediction of a future distribution of biomes. (orig.)

  18. Computer graphics in heat-transfer simulations

    International Nuclear Information System (INIS)

    Hamlin, G.A. Jr.

    1980-01-01

    Computer graphics can be very useful in the setup of heat transfer simulations and in the display of the results of such simulations. The potential use of recently available low-cost graphics devices in the setup of such simulations has not been fully exploited. Several types of graphics devices and their potential usefulness are discussed, and some configurations of graphics equipment are presented in the low-, medium-, and high-price ranges

  19. An original piecewise model for computing energy expenditure from accelerometer and heart rate signals.

    Science.gov (United States)

    Romero-Ugalde, Hector M; Garnotel, M; Doron, M; Jallon, P; Charpentier, G; Franc, S; Huneker, E; Simon, C; Bonnet, S

    2017-07-28

    Activity energy expenditure (EE) plays an important role in healthcare, and accurate EE measures are therefore required. Currently available reference EE acquisition methods, such as doubly labeled water and indirect calorimetry, are complex, expensive, uncomfortable, and/or difficult to apply in real time. To overcome these drawbacks, this paper proposes a model for computing EE in real time (minute-by-minute) from heart rate (HR) and accelerometer signals. The proposed model, an original branched model, uses heart rate for computing EE during moderate to vigorous physical activity, and a linear combination of heart rate and counts per minute for light to moderate physical activity. Model parameters were estimated from a data set of 53 subjects performing 25 different physical activities (light-, moderate- and vigorous-intensity) and validated using leave-one-subject-out cross-validation. A different database (a semi-controlled in-city circuit) was used to validate the versatility of the proposed model. Comparisons are made against linear and nonlinear models, which are also used for computing EE from accelerometer and/or HR signals. The proposed piecewise model leads to more accurate EE estimations ([Formula: see text], [Formula: see text] and [Formula: see text] J kg^-1 min^-1 and [Formula: see text], [Formula: see text], and [Formula: see text] J kg^-1 min^-1 on the two validation databases). This approach, which is more comfortable and less expensive than the reference methods, allows accurate EE estimation, in real time (minute-by-minute), during a large variety of physical activities. The model may therefore be used in applications such as computing the time a given subject spends on light-intensity physical activities versus moderate to vigorous physical activities (binary classification accuracy of 0.8155).
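    A branched model of this kind reduces to a simple piecewise function at prediction time. The sketch below illustrates the structure only; the branching threshold and all regression coefficients are invented placeholders, not the paper's fitted parameters.

```python
import numpy as np

# Hedged sketch of a branched energy-expenditure (EE) model: heart rate (HR)
# alone drives EE at higher intensities, while a linear HR + counts-per-minute
# (CPM) combination is used at lower intensities. All coefficients and the
# branching threshold are invented placeholders.

HR_BRANCH = 110.0  # bpm threshold separating the two branches (assumed)

def energy_expenditure(hr, cpm):
    """Minute-by-minute EE estimate (arbitrary J/kg/min-like units)."""
    hr = np.asarray(hr, dtype=float)
    cpm = np.asarray(cpm, dtype=float)
    ee_vigorous = 0.9 * hr - 60.0            # HR-only branch (assumed fit)
    ee_light = 0.2 * hr + 0.003 * cpm + 5.0  # HR+CPM branch (assumed fit)
    return np.where(hr >= HR_BRANCH, ee_vigorous, ee_light)

hr = np.array([70, 95, 120, 150])      # bpm, hypothetical minute samples
cpm = np.array([200, 1500, 4000, 8000])  # accelerometer counts, hypothetical
print(energy_expenditure(hr, cpm))
```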

  20. Parallel Computing for Brain Simulation.

    Science.gov (United States)

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities, yet how and why most of these abilities are produced is still not understood. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have allowed the creation of the first simulations with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital, analog, and hybrid models. The review includes the current applications of these works as well as future trends. It focuses on works seeking advanced progress in neuroscience and on others seeking new discoveries in computer science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers.

  1. Computational Dehydration of Crystalline Hydrates Using Molecular Dynamics Simulations

    DEFF Research Database (Denmark)

    Larsen, Anders Støttrup; Rantanen, Jukka; Johansson, Kristoffer E

    2017-01-01

    Molecular dynamics (MD) simulations have evolved into an increasingly reliable and accessible technique and are today implemented in many areas of the biomedical sciences. We present a generally applicable method to study the dehydration of hydrates based on MD simulations and apply this approach to the dehydration of ampicillin trihydrate. The crystallographic unit cell of the trihydrate is used to construct the simulation cell containing 216 ampicillin and 648 water molecules. This system is dehydrated by removing water molecules during a 2200 ps simulation, and depending on the computational dehydration... The structural changes could be followed in real time, and in addition, an intermediate amorphous phase was identified. The computationally identified dehydrated structure (anhydrate) was slightly different from the experimentally known anhydrate structure, suggesting that the simulated computational structure...

  2. The effect of metoprolol alone and metoprolol plus bromazepam on heart rate and heart rate variability during multislice computed tomography angiography

    International Nuclear Information System (INIS)

    Tuyyab, F.; Naeem, M.Y.; Maken, G.R.; Najfi, M.H.; Hassan, F.

    2012-01-01

    Objective: The purpose of this study was to determine the effect of metoprolol alone and metoprolol plus bromazepam on heart rate and heart rate variability during multislice computed tomography (MSCT) angiography. Methodology: This double-blind randomized controlled trial was conducted at AFIC/NIHD, Rawalpindi, from May 2011 to November 2011. Patients undergoing their first MSCT angiography who met the inclusion criteria and had heart rates (HR) above 80 beats/min were included. Patients were randomized into two groups using a random numbers table. Group 1 was administered metoprolol plus placebo, while group 2 was administered metoprolol plus bromazepam one hour before the scan. Both groups were scanned under strictly similar conditions. HR before and during the scan, along with heart rate variability (HRV), were recorded. Results: A total of 80 patients were included. Mean patient age was 49 ± 13 years; 57% were male and 43% female. The risk factor profile was similar in both groups. HR reduction was 15 ± 6.0 in group 1 and 21 ± 9.0 in group 2 (p = 0.002). HRV was 3.9 ± 1.32 in group 1 and 2.3 ± 1.0 in group 2 (p = 0.003). Group 2 had a significantly lower HR and significantly less HRV than group 1. Conclusion: The combination of bromazepam and metoprolol results in a significantly greater reduction in heart rate and heart rate variability than metoprolol alone. The two drugs can be used together for better control of heart rate and heart rate variability during MSCT angiography, improving image quality. (author)

  3. Lung and heart dose volume analyses with CT simulator in radiation treatment of breast cancer

    International Nuclear Information System (INIS)

    Das, Indra J.; Cheng, Elizabeth C.; Freedman, Gary; Fowble, Barbara

    1998-01-01

    Purpose: Radiation pneumonitis and cardiac effects are directly related to the irradiated lung and heart volumes within the treatment fields. The central lung distance (CLD) on a tangential breast radiograph has been shown to be a significant indicator of the ipsilateral irradiated lung volume. A retrospective analysis of lung and heart dose volumes, using actual volume data from a CT simulator in the treatment of breast cancer, is presented with respect to CLD. Methods and Materials: The heart and lung volumes in the tangential treatment fields were analyzed in 108 consecutive cases (52 left and 56 right breast) referred for CT simulation. All patients in this study were immobilized on an inclined breast board in the actual treatment setup. Both arms were stretched overhead to avoid collision with the scanner aperture. Radiopaque marks were placed on the medial and lateral borders of the tangential fields. All patients were scanned in spiral mode with a slice width and thickness of 3 mm each. The lung and heart structures as well as the irradiated areas were delineated on each slice and the respective volumes accurately measured. The treatment beam parameters were recorded and digitally reconstructed radiographs (DRRs) were generated for measurement of the CLD and analysis. Results: From the CT data, the mean volumes of the left and right lungs were 1307.7 ± 297.7 cm³ and 1529.6 ± 298.5 cm³, respectively. The magnitude of irradiated volume in the left and right lung is nearly equal for the same CLD, which produces different percent irradiated volumes (PIV): 8.3 ± 4.7% for the left lung and 6.6 ± 3.7% for the right. The PIV data were shown to correlate with CLD by second- and third-degree polynomials; however, in this study a simple straight-line regression is used, as it provides better confidence than the higher-order polynomials. The regression lines for the left and right breasts are very different based on...
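    The straight-line regression of PIV on CLD described above is a one-line fit in practice. The sketch below uses fabricated, clearly hypothetical data points purely to show the computation; the study's measurements are not reproduced.

```python
import numpy as np

# Hedged sketch: fitting percent irradiated lung volume (PIV, %) against
# central lung distance (CLD, cm) with a straight line. Data are hypothetical.

cld = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])    # cm, hypothetical
piv = np.array([3.1, 5.0, 6.8, 9.2, 11.1, 13.4])  # %,  hypothetical

slope, intercept = np.polyfit(cld, piv, 1)  # first-degree polynomial fit
print(f"PIV ~= {slope:.2f} * CLD + {intercept:.2f}")
```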

  4. Computer simulation of thermal plant operations

    CERN Document Server

    O'Kelly, Peter

    2012-01-01

    This book describes thermal plant simulation, that is, dynamic simulation of plants which produce, exchange and otherwise utilize heat as their working medium. Directed at chemical, mechanical and control engineers involved with operations, control and optimization and operator training, the book gives the mathematical formulation and use of simulation models of the equipment and systems typically found in these industries. The author has adopted a fundamental approach to the subject. The initial chapters provide an overview of simulation concepts and describe a suitable computer environment.

  5. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  6. Quantum simulations with noisy quantum computers

    Science.gov (United States)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  7. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  8. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the utilization of any man-made devices (such as random number generators). The self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  9. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experimental group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to...

  10. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  11. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    Science.gov (United States)

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  12. Evaluation of a focussed protocol for hand-held echocardiography and computer-assisted auscultation in detecting latent rheumatic heart disease in scholars.

    Science.gov (United States)

    Zühlke, Liesl J; Engel, Mark E; Nkepu, Simpiwe; Mayosi, Bongani M

    2016-08-01

    Echocardiography is the diagnostic test of choice for latent rheumatic heart disease, but its utility for large-scale screening is limited by high cost, complex diagnostic protocols, and the time needed to acquire multiple images. We evaluated the performance of a brief hand-held echocardiography protocol and computer-assisted auscultation in detecting latent rheumatic heart disease with or without pathological murmur. A total of 27 asymptomatic patients with latent rheumatic heart disease based on the World Heart Federation criteria and 66 healthy controls were examined by standard cardiac auscultation to detect pathological murmur. Hand-held echocardiography using a focussed protocol that utilises one view - the parasternal long-axis view - and one measurement - the mitral regurgitant jet - and computer-assisted auscultation utilising an automated decision tool were performed on all patients. The sensitivity and specificity of computer-assisted auscultation in latent rheumatic heart disease were 4% (95% CI 1.0-20.4%) and 93.7% (95% CI 84.5-98.3%), respectively. The sensitivity and specificity of the focussed hand-held echocardiography protocol for definite rheumatic heart disease were 92.3% (95% CI 63.9-99.8%) and 100%, respectively. The test reliability of hand-held echocardiography was 98.7% for definite and 94.7% for borderline disease, and the adjusted diagnostic odds ratios were 1041 and 263.9, respectively. Computer-assisted auscultation has extremely low sensitivity but high specificity for pathological murmur in latent rheumatic heart disease. Focussed hand-held echocardiography has fair sensitivity but high specificity and diagnostic utility for definite or borderline rheumatic heart disease in asymptomatic patients.

  13. Investigation of attenuation correction in SPECT using textural features, Monte Carlo simulations, and computational anthropomorphic models.

    Science.gov (United States)

    Spirou, Spiridon V; Papadimitroulas, Panagiotis; Liakou, Paraskevi; Georgoulias, Panagiotis; Loudos, George

    2015-09-01

    To present and evaluate a new methodology to investigate the effect of attenuation correction (AC) in single-photon emission computed tomography (SPECT) using textural feature analysis, Monte Carlo techniques, and a computational anthropomorphic model. The GATE Monte Carlo toolkit was used to simulate SPECT experiments using the XCAT computational anthropomorphic model, filled with a realistic biodistribution of (99m)Tc-N-DBODC. The simulated gamma camera was the Siemens ECAM Dual-Head, equipped with a parallel hole lead collimator, with an image resolution of 3.54 × 3.54 mm². Thirty-six equispaced camera positions, spanning a full 360° arc, were simulated. Projections were calculated after applying a ± 20% energy window or after eliminating all scattered photons. The activity of the radioisotope was reconstructed using the MLEM algorithm. Photon attenuation was accounted for by calculating the radiological pathlength in a perpendicular line from the center of each voxel to the gamma camera. Twenty-two textural features were calculated on each slice, with and without AC, using 16 and 64 gray levels. A mask was used to identify only those pixels that belonged to each organ. Twelve of the 22 features showed almost no dependence on AC, irrespective of the organ involved. In both the heart and the liver, the mean and SD were the features most affected by AC. In the liver, six features were affected by AC only on some slices. Depending on the slice, skewness decreased by 22-34% with AC, kurtosis by 35-50%, long-run emphasis mean by 71-91%, and long-run emphasis range by 62-95%. In contrast, gray-level non-uniformity mean increased by 78-218% compared with the value without AC and run percentage mean by 51-159%. These results were not affected by the number of gray levels (16 vs. 64) or the data used for reconstruction: with the energy window or without scattered photons. The mean and SD were the main features affected by AC. In the heart, no other feature was...

  14. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements, the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems of an implicitly parallel nature.

  15. Computer Simulation of a Hardwood Processing Plant

    Science.gov (United States)

    D. Earl Kline; Philip A. Araman

    1990-01-01

    The overall purpose of this paper is to introduce computer simulation as a decision support tool that can provide managers with timely information. A simulation/animation modeling procedure is demonstrated for wood products manufacturing systems. Simulation modeling techniques are used to assist in identifying and solving problems. Animation is used for...

  16. Simulation of a pulsatile total artificial heart: Development of a partitioned Fluid Structure Interaction model

    Science.gov (United States)

    Sonntag, Simon J.; Kaufmann, Tim A. S.; Büsen, Martin R.; Laumen, Marco; Linde, Torsten; Schmitz-Rode, Thomas; Steinseifer, Ulrich

    2013-04-01

    Heart disease is one of the leading causes of death in the world. Due to a shortage of donor organs, artificial hearts can be a bridge to transplantation or even serve as a destination therapy for patients with terminal heart insufficiency. A pusher plate driven pulsatile membrane pump, the Total Artificial Heart (TAH) ReinHeart, is currently under development at the Institute of Applied Medical Engineering of RWTH Aachen University. This paper presents the methodology of a fully coupled three-dimensional time-dependent Fluid Structure Interaction (FSI) simulation of the TAH using a commercial partitioned block-Gauss-Seidel coupling package. Partitioned coupling of the incompressible fluid with the slender flexible membrane, together with a fluid/structure density ratio of about unity, inherently degrades stability (the 'artificial added mass instability'). The objective was to conduct a stable simulation with high accuracy of the pumping process. In order to achieve stability, a combined resistance and pressure outlet boundary condition as well as the interface artificial compressibility method was applied. An analysis of the contact algorithm and turbulence condition is presented. Independence tests were performed for the structural and the fluid mesh, the time step size and the number of pulse cycles. Because of the large deformation of the fluid domain, a variable mesh stiffness depending on certain mesh properties was specified for the fluid elements, and adaptive remeshing was avoided. Different approaches for the mesh stiffness function are compared with respect to convergence, preservation of mesh topology and mesh quality. The resulting mesh aspect ratios, mesh expansion factors and mesh orthogonalities are evaluated in detail. The membrane motion and flow distribution of the coupled simulations are compared with a top-view recording and stereo Particle Image Velocimetry (PIV) measurements, respectively, of the actual pump.

  17. Computed Tomography of Prosthetic Heart Valves

    NARCIS (Netherlands)

    Habets, J.

    2012-01-01

    Prosthetic heart valve (PHV) dysfunction is an infrequent but potentially life-threatening disease with a heterogeneous clinical presentation. Patients with PHV dysfunction can clinically present with symptoms of congestive heart failure (dyspnea, fatigue, edema), fever, angina pectoris, dizziness...

  18. Radiographic Evaluation of Valvular Heart Disease With Computed Tomography and Magnetic Resonance Correlation.

    Science.gov (United States)

    Lempel, Jason K; Bolen, Michael A; Renapurkar, Rahul D; Azok, Joseph T; White, Charles S

    2016-09-01

    Valvular heart disease is a group of complex entities with varying etiologies and clinical presentations. There are a number of imaging tools available to supplement clinical evaluation of suspected valvular heart disease, with echocardiography being the most common and clinically established, and more recent emergence of computed tomography and magnetic resonance imaging as additional supportive techniques. Yet even with these newer and more sophisticated modalities, chest radiography remains one of the earliest and most common diagnostic examinations performed during the triage of patients with suspected cardiac dysfunction. Recognizing the anatomic and pathologic features of cardiac radiography including the heart's adaptation to varying hemodynamic changes can provide clues to the radiologist regarding the underlying etiology. In this article, we will elucidate several principles relating to chamber modifications in response to pressure and volume overload as well as radiographic appearances associated with pulmonary fluid status and cardiac dysfunction. We will also present a pattern approach to optimize analysis of the chest radiograph for valvular heart disease, which will help guide the radiologist down a differential diagnostic pathway and create a more meaningful clinical report.

  19. Interferences and events: on epistemic shifts in physics through computer simulations

    CERN Document Server

    Warnke, Martin

    2017-01-01

    Computer simulations are omnipresent media in today's knowledge production. For scientific endeavors such as the detection of gravitational waves and the exploration of subatomic worlds, simulations are essential; however, the epistemic status of computer simulations is rather controversial as they are neither just theory nor just experiment. Therefore, computer simulations have challenged well-established insights and common scientific practices as well as our very understanding of knowledge. This volume contributes to the ongoing discussion on the epistemic position of computer simulations in a variety of physical disciplines, such as quantum optics, quantum mechanics, and computational physics. Originating from an interdisciplinary event, it shows that accounts of contemporary physics can constructively interfere with media theory, philosophy, and the history of science.

  20. Computed radiography simulation using the Monte Carlo code MCNPX

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.

    2009-01-01

    Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)

  1. Computed radiography simulation using the Monte Carlo code MCNPX

    Energy Technology Data Exchange (ETDEWEB)

    Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)

    2010-09-15

    Simulating X-ray images has been of great interest in recent years as it makes possible an analysis of how X-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data.

  2. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the FY2003 collaboration activities in Large Scale Computer Simulation Research was held on January 15-16, 2004, at the Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  3. Computational simulation in architectural and environmental acoustics methods and applications of wave-based computation

    CERN Document Server

    Sakamoto, Shinichi; Otsuru, Toru

    2014-01-01

    This book reviews a variety of methods for wave-based acoustic simulation and recent applications to architectural and environmental acoustic problems. Following an introduction providing an overview of computational simulation of sound environment, the book is in two parts: four chapters on methods and four chapters on applications. The first part explains the fundamentals and advanced techniques for three popular methods, namely, the finite-difference time-domain method, the finite element method, and the boundary element method, as well as alternative time-domain methods. The second part demonstrates various applications to room acoustics simulation, noise propagation simulation, acoustic property simulation for building components, and auralization. This book is a valuable reference that covers the state of the art in computational simulation for architectural and environmental acoustics.  

  4. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests

  5. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
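    The core of such a deterministic simulator is the X-ray attenuation (Beer-Lambert) law applied along each traced ray. The sketch below shows that single step for one ray through a stack of materials, with illustrative attenuation coefficients; the source spectra, CAD geometry and detector response of the code described above are omitted.

```python
import numpy as np

# Hedged sketch: applying the Beer-Lambert attenuation law along one ray.
# The attenuation coefficients and thicknesses below are illustrative.

def transmitted_intensity(i0, mus, thicknesses):
    """I = I0 * exp(-sum(mu_i * d_i)) over the segments crossed by the ray."""
    mus = np.asarray(mus, dtype=float)        # linear atten. coeff. (1/cm)
    d = np.asarray(thicknesses, dtype=float)  # path length per segment (cm)
    return i0 * np.exp(-np.sum(mus * d))

# A ray crossing 2 cm of aluminium-like and 0.5 cm of steel-like material:
print(transmitted_intensity(1.0, mus=[0.6, 3.0], thicknesses=[2.0, 0.5]))
```

    Repeating this computation per detector pixel, with path lengths obtained by ray-tracing through the CAD model and a sum over spectral energy bins, gives the noise-free images the abstracts describe.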

  6. Innovation of the computer system for the WWER-440 simulator

    International Nuclear Information System (INIS)

    Schrumpf, L.

    1988-01-01

    The configuration of the WWER-440 simulator computer system consists of four SMEP computers. The basic data processing unit consists of two interlinked SM 52/11.M1 computers with 1 MB of main memory. This part of the computer system controls the operation of the entire simulator, processes the programs simulating the behavior of the plant technology, the unit information system and other special systems, and provides program support and the operation of the instructor's console. An SM 52/11 computer with 256 kB of main memory is connected to each unit and used as a communication unit for data transmission over the DASIO 600 interface. Semigraphic color displays are based on the microprocessor modules of the SM 50/40 and SM 53/10 kit supplemented with a modified TESLA COLOR 110 ST TV receiver. (J.B.). 1 fig

  7. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance - Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  8. Assessment of adult congenital heart disease with multi-detector computed tomography - beyond coronary lumenography

    International Nuclear Information System (INIS)

    Nicol, E.D.; Gatzoulis, M.; Padley, S.P.G.; Rubens, M.

    2007-01-01

    Adult congenital heart disease is an increasingly prevalent condition, with more than 135,000 patients affected in England alone. With this increased patient population and an increase in interventional procedures being performed on them, traditional imaging techniques such as cardiac magnetic resonance (CMR) may be unavailable locally or contraindicated. Cardiac multidetector computed tomography (MDCT) is rapidly emerging as an alternative imaging method for the investigation of these patients. This review highlights the broad application of cardiac MDCT to this population and makes recommendations on the standardized reporting of complex congenital heart disease.

  9. Assessment of adult congenital heart disease with multi-detector computed tomography - beyond coronary lumenography

    Energy Technology Data Exchange (ETDEWEB)

    Nicol, E.D. [Department of Radiology, Royal Brompton Hospital, London (United Kingdom) and Department of Cardiology, Royal Brompton Hospital, London (United Kingdom)]. E-mail: e.nicol@rbht.nhs.uk; Gatzoulis, M. [Adult Congenital Heart Centre and Centre for Pulmonary Hypertension, Royal Brompton Hospital and National Heart and Lung Institute, London (United Kingdom); Padley, S.P.G. [Department of Radiology, Royal Brompton Hospital, London (United Kingdom); Rubens, M. [Department of Radiology, Royal Brompton Hospital, London (United Kingdom)

    2007-06-15

    Adult congenital heart disease is an increasingly prevalent condition, with more than 135,000 patients affected in England alone. With this increased patient population and an increase in interventional procedures being performed on them, traditional imaging techniques such as cardiac magnetic resonance (CMR) may be unavailable locally or contraindicated. Cardiac multidetector computed tomography (MDCT) is rapidly emerging as an alternative imaging method for the investigation of these patients. This review highlights the broad application of cardiac MDCT to this population and makes recommendations on the standardized reporting of complex congenital heart disease.

  10. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    Science.gov (United States)

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  11. Computer simulation in cell radiobiology

    International Nuclear Information System (INIS)

    Yakovlev, A.Y.; Zorin, A.V.

    1988-01-01

    This research monograph demonstrates the possible ways of using stochastic simulation for exploring cell kinetics, emphasizing the effects studied in cell radiobiology. The in vitro kinetics of normal and irradiated cells is the main subject, but some approaches to the simulation of controlled cell systems are considered as well, with the epithelium of the small intestine in mice taken as a case in point. Of particular interest is the evaluation of simulation modelling as a tool for gaining insight into biological processes and hence drawing new inferences from concrete experimental data concerning regularities in cell population response to irradiation. The book is intended to stimulate interest among computer science specialists in developing new, more efficient means for the simulation of cell systems and to help radiobiologists interpret the experimental data.

  12. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
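    The embarrassingly parallel pattern the tutorial describes looks as follows. The article's own examples are in MATLAB and R; this Python version using the standard multiprocessing module is an illustrative stand-in, not the authors' code.

```python
import multiprocessing as mp
import random

# Hedged sketch of embarrassingly parallel simulation: independent
# replications fanned out across CPU cores, results gathered at the end.

def one_replication(seed):
    """One independent simulation run; here, a toy Monte Carlo pi estimate."""
    rng = random.Random(seed)  # per-task RNG so replications are independent
    n = 100_000
    hits = sum(rng.random()**2 + rng.random()**2 <= 1.0 for _ in range(n))
    return 4.0 * hits / n

if __name__ == "__main__":
    seeds = range(32)  # 32 independent replications, one task each
    with mp.Pool() as pool:
        estimates = pool.map(one_replication, seeds)  # fan out across cores
    print(sum(estimates) / len(estimates))
```

    Because each replication shares no state with the others, the speed-up scales with the number of cores, which is exactly the property that makes simulation-based risk analysis a good fit for this pattern.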

  13. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities were reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  14. Simulation of biological ion channels with technology computer-aided design.

    Science.gov (United States)

    Pandey, Santosh; Bortei-Doku, Akwete; White, Marvin H

    2007-01-01

    Computer simulations of realistic ion channel structures have always been challenging and a subject of rigorous study. Simulations based on continuum electrostatics have proven to be computationally cheap and reasonably accurate in predicting a channel's behavior. In this paper we discuss the use of a device simulator, SILVACO, to build a solid-state model for the KcsA channel and study its steady-state response. SILVACO is a well-established program, typically used by electrical engineers to simulate the process flow and electrical characteristics of solid-state devices. By employing this simulation program, we present an alternative computing platform for performing ion channel simulations, besides the usual approach of writing custom code in programming languages. With the ease of varying the different parameters in the channel's vestibule and the ability to incorporate surface charges, we show the wide-ranging possibilities of using a device simulator for ion channel simulations. Our simulated results closely agree with the experimental data, validating our model.

  15. Computational algorithms for simulations in atmospheric optics.

    Science.gov (United States)

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
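
    For orientation, a rough sketch of the classic spectral-phase (FFT) recipe for generating a 2D random phase screen, the kind of field generation the paper refines. The grid size, spacing, and Fried parameter are illustrative assumptions, and the spectrum used is the textbook Kolmogorov form rather than the authors' modified method.

        import numpy as np

        N = 256              # grid points per side (assumed)
        delta = 0.01         # grid spacing [m] (assumed)
        r0 = 0.1             # Fried parameter [m] (assumed)

        fx = np.fft.fftfreq(N, d=delta)      # spatial frequencies [1/m]
        FX, FY = np.meshgrid(fx, fx)
        f = np.hypot(FX, FY)
        f[0, 0] = np.inf                     # suppress the singular DC term

        # Kolmogorov phase power spectral density: 0.023 * r0^(-5/3) * f^(-11/3)
        psd = 0.023 * r0**(-5.0/3.0) * f**(-11.0/3.0)

        # Shape complex white noise by sqrt(PSD) and inverse-FFT to get the screen
        df = 1.0 / (N * delta)               # frequency-grid spacing
        noise = np.random.randn(N, N) + 1j * np.random.randn(N, N)
        screen = np.real(np.fft.ifft2(noise * np.sqrt(psd) * df)) * N**2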

  16. SiMon: Simulation Monitor for Computational Astrophysics

    Science.gov (United States)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight; it fully automates the entire workflow management, operates concurrently across multiple platforms and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running simulations becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
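
    A minimal sketch of the restart-on-interrupt idea behind such simulation farming. The job commands, restart limit, and polling interval are hypothetical; this is not SiMon's actual interface.

        import subprocess
        import time

        # Hypothetical long-running simulation jobs ("crops" in the field)
        jobs = {"run_A": ["./simulate", "--config", "A.cfg"],
                "run_B": ["./simulate", "--config", "B.cfg"]}
        procs = {name: subprocess.Popen(cmd) for name, cmd in jobs.items()}
        restarts = {name: 0 for name in jobs}

        while procs:
            for name, p in list(procs.items()):
                code = p.poll()
                if code is None:
                    continue                      # still running
                if code == 0 or restarts[name] >= 3:
                    del procs[name]               # finished normally, or gave up
                else:
                    restarts[name] += 1           # interrupted: replant the crop
                    procs[name] = subprocess.Popen(jobs[name])
            time.sleep(10)                        # polling interval (assumed)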

  17. Measurement of lung density in congestive heart failure by computed tomography

    International Nuclear Information System (INIS)

    Nomura, Masanori; Miyagi, Yutaka; Tachi, Keiji; Sakabe, Yoshiyuki; Sakai, Yasuhiko; Hishida, Hitoshi; Mizuno, Yasushi; Sasaki, Fumio; Koga, Sukehiko

    1984-01-01

    The computed tomography (CT) number within the region of interest (ROI) was used as a parameter to assess lung density in patients with congestive heart failure. Thirty-eight patients with valvular heart disease (VHD) and 34 patients with ischemic heart disease (IHD) were studied. Based on the New York Heart Association (NYHA) classification, 24 VHD patients were in class I or II (VHD I-II) and the other 14 were in NYHA class III or IV (VHD III-IV). Eighteen patients with IHD were in NYHA class I or II (IHD I-II) and 16 were in class III or IV (IHD III-IV). The CT number was measured bilaterally at the upper, middle and lower levels of the chest and compared with the corresponding value in 21 normal subjects (Group N). In a preliminary study on Group N, the CT numbers were insensitive to the size of the ROI, but were closely related to its location. In clinical applications, the mean values of the CT numbers in all six lung fields increased in the order IHD I-II, VHD I-II, IHD III-IV and VHD III-IV. Except for patients in IHD I-II, they were significantly larger than in Group N. The relationships between the CT number and the systolic and mean pulmonary arterial pressures and the pulmonary capillary wedge pressure were evaluated in 36 patients. Significant correlations were obtained in all six lung fields (r=0.65-0.78, p<0.001). The results suggest that measurement of lung density by CT is useful for the quantitative evaluation of the severity of disease in patients with congestive heart failure. (author)

  18. Measurement of lung density in congestive heart failure by computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nomura, Masanori; Miyagi, Yutaka; Tachi, Keiji; Sakabe, Yoshiyuki; Sakai, Yasuhiko; Hishida, Hitoshi; Mizuno, Yasushi; Sasaki, Fumio; Koga, Sukehiko [Fujita-Gakuen Health Univ., Toyoake, Aichi (Japan)

    1984-11-01

    The computed tomography (CT) number within the region of interest (ROI) was used as a parameter to assess lung density in patients with congestive heart failure. Thirty-eight patients with valvular heart disease (VHD) and 34 patients with ischemic heart disease (IHD) were studied. Based on the New York Heart Association (NYHA) classification, 24 VHD patients were in class I or II (VHD I-II) and the other 14 were in NYHA class III or IV (VHD III-IV). Eighteen patients with IHD were in NYHA class I or II (IHD I-II) and 16 were in class III or IV (IHD III-IV). The CT number was measured bilaterally at the upper, middle and lower levels of the chest and compared with the corresponding value in 21 normal subjects (Group N). In a preliminary study on Group N, the CT numbers were insensitive to the size of the ROI, but were closely related to its location. In clinical applications, the mean values of the CT numbers in all six lung fields increased in the order IHD I-II, VHD I-II, IHD III-IV and VHD III-IV. Except for patients in IHD I-II, they were significantly larger than in Group N. The relationships between the CT number and the systolic and mean pulmonary arterial pressures and the pulmonary capillary wedge pressure were evaluated in 36 patients. Significant correlations were obtained in all six lung fields (r=0.65-0.78, p < 0.001). The results suggest that measurement of lung density by CT is useful for the quantitative evaluation of the severity of disease in patients with congestive heart failure.

  19. Computer Simulation of Diffraction Patterns.

    Science.gov (United States)

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from the author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with differently shaped multiple apertures. Graphics outputs include vector resultants, phase differences, diffraction patterns, and the Cornu spiral…
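
    A small sketch of the vector-chaining idea for the Fraunhofer case: the field at each observation angle is the resultant of many equal-amplitude phasors whose phases advance linearly across a single slit. The slit width, wavelength, and phasor count are illustrative assumptions.

        import numpy as np

        wavelength = 633e-9      # m (assumed)
        slit_width = 50e-6       # m (assumed)
        n_phasors = 200          # phasors chained across the slit

        theta = np.linspace(-0.05, 0.05, 1001)                  # observation angles [rad]
        y = np.linspace(-slit_width/2, slit_width/2, n_phasors) # source points in the slit
        k = 2 * np.pi / wavelength

        # Chain (sum) the phasors for each angle; intensity is |resultant|^2
        field = np.exp(1j * k * np.outer(np.sin(theta), y)).sum(axis=1) / n_phasors
        intensity = np.abs(field)**2                            # ~ sinc^2 single-slit pattern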

  20. Simulation-based Mastery Learning Improves Cardiac Auscultation Skills in Medical Students

    Science.gov (United States)

    McGaghie, William C.; Cohen, Elaine R.; Kaye, Marsha; Wayne, Diane B.

    2010-01-01

    Background: Cardiac auscultation is a core clinical skill. However, prior studies show that trainee skills are often deficient and that clinical experience is not a proxy for competence. Objective: To describe a mastery model of cardiac auscultation education and evaluate its effectiveness in improving bedside cardiac auscultation skills. Design: Untreated control group design with pretest and posttest. Participants: Third-year students who received a cardiac auscultation curriculum and fourth-year students who did not. Intervention: A cardiac auscultation curriculum consisting of a computer tutorial and a cardiac patient simulator. All third-year students were required to meet or exceed a minimum passing score (MPS) set by an expert panel at posttest. Measurements: Diagnostic accuracy with simulated heart sounds and actual patients. Results: Trained third-year students (n = 77) demonstrated significantly higher cardiac auscultation accuracy compared to untrained fourth-year students (n = 31) in assessment of simulated heart sounds (93.8% vs. 73.9%). Conclusions: A cardiac auscultation curriculum consisting of deliberate practice with a computer-based tutorial and a cardiac patient simulator resulted in improved assessment of simulated heart sounds and more accurate examination of actual patients. PMID:20339952

  1. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  2. Measuring irradiated lung and heart area in breast tangential fields using a simulator-based computerized tomography device

    International Nuclear Information System (INIS)

    Mallik, Raj; Fowler, Allan; Hunt, Peter

    1995-01-01

    Purpose: To illustrate the use of a simulator-based computerized tomography system (SIMCT) in the simulation and planning of tangential breast fields. Methods and Materials: Forty-five consecutive patients underwent treatment planning using a radiotherapy simulator with a computerized tomography attachment. One to three scans were obtained for each patient; calculations were made on the central axis scan. Due to the wide aperture of this system, all patients could be scanned in the desired treatment position with the arm abducted 90°. Using available software tools, the area of lung and/or heart included within the tangential fields was calculated. The greatest perpendicular distance (GPD) from the chest wall to the posterior field edge was also measured. Results: The mean GPD for the group was 25.40 mm, with 71% of patients having GPDs of ≤ 30 mm. The mean area of irradiated lung was 1780 sq mm, which represented 18.0% of the total ipsilateral lung area seen in the central axis. Seven of the patients with left-sided tumors had an average of 1314 sq mm of heart irradiated in the central axis. This represented 11.9% of the total heart area in these patients. Conclusion: Measurements of irradiated lung and heart area can be easily and accurately made using a SIMCT device. Such measurements may help identify those patients potentially at risk for lung or heart toxicity as a consequence of their treatment. A major advantage of this device is the ability to scan patients in the actual treatment position.

  3. Measuring irradiated lung and heart area in breast tangential fields using a simulator-based computerized tomography device

    Energy Technology Data Exchange (ETDEWEB)

    Mallik, Raj; Fowler, Allan; Hunt, Peter

    1995-01-15

    Purpose: To illustrate the use of a simulator-based computerized tomography system (SIMCT) in the simulation and planning of tangential breast fields. Methods and Materials: Forty-five consecutive patients underwent treatment planning using a radiotherapy simulator with a computerized tomography attachment. One to three scans were obtained for each patient; calculations were made on the central axis scan. Due to the wide aperture of this system, all patients could be scanned in the desired treatment position with the arm abducted 90°. Using available software tools, the area of lung and/or heart included within the tangential fields was calculated. The greatest perpendicular distance (GPD) from the chest wall to the posterior field edge was also measured. Results: The mean GPD for the group was 25.40 mm, with 71% of patients having GPDs of ≤ 30 mm. The mean area of irradiated lung was 1780 sq mm, which represented 18.0% of the total ipsilateral lung area seen in the central axis. Seven of the patients with left-sided tumors had an average of 1314 sq mm of heart irradiated in the central axis. This represented 11.9% of the total heart area in these patients. Conclusion: Measurements of irradiated lung and heart area can be easily and accurately made using a SIMCT device. Such measurements may help identify those patients potentially at risk for lung or heart toxicity as a consequence of their treatment. A major advantage of this device is the ability to scan patients in the actual treatment position.

  4. A Computational Framework for Efficient Low Temperature Plasma Simulations

    Science.gov (United States)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with benchmark problems in microdischarge devices, assessing the accuracy and efficiency of the numerical results. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  5. Computational fluid dynamics modelling of left valvular heart diseases during atrial fibrillation

    Directory of Open Access Journals (Sweden)

    Stefania Scarsoglio

    2016-07-01

    Background: Although atrial fibrillation (AF), a common arrhythmia, frequently presents in patients with underlying valvular disease, its hemodynamic contributions are not fully understood. The present work aimed to computationally study how physical conditions imposed by pathologic valvular anatomy act on AF hemodynamics. Methods: We simulated AF with different severity grades of left-sided valvular diseases and compared the cardiovascular effects that they exert during AF, relative to lone AF. The fluid dynamics model used here has been recently validated for lone AF and relies on a lumped parameterization of the four heart chambers, together with the systemic and pulmonary circulation. The AF modelling involves: (i) an irregular, uncorrelated and faster heart rate; (ii) atrial contractility dysfunction. Three different grades of severity (mild, moderate, severe) were analyzed for each of the four valvulopathies (AS, aortic stenosis; MS, mitral stenosis; AR, aortic regurgitation; MR, mitral regurgitation), by varying the valve area through the valve opening angle. Results: Regurgitation was hemodynamically more relevant than stenosis, as the latter led to inefficient cardiac flow, while the former introduced more drastic fluid dynamics variation. Moreover, mitral valvulopathies were more significant than aortic ones. In case of aortic valve diseases, proper mitral functioning damps out changes at atrial and pulmonary levels. In the case of mitral valvulopathy, the mitral valve lost its regulating capability, thus hemodynamic variations almost equally affected regions upstream and downstream of the valve. In particular, the present study revealed that both mitral and aortic regurgitation strongly affect hemodynamics, followed by mitral stenosis, while aortic stenosis has the least impact among the analyzed valvular diseases. Discussion: The proposed approach can provide new mechanistic insights as to which valvular pathologies merit more aggressive
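
    A toy sketch of the lumped-parameter idea with AF-like pacing: uncorrelated RR intervals drive a pulsatile inflow into a two-element windkessel (resistance-compliance) compartment. All parameter values are illustrative assumptions, not the paper's calibrated four-chamber model.

        import numpy as np

        R, C = 1.0, 1.5                  # peripheral resistance [mmHg*s/ml], compliance [ml/mmHg]
        dt, p = 1e-3, 80.0               # time step [s], initial pressure [mmHg]
        rng = np.random.default_rng(0)

        pressures = []
        for rr in rng.uniform(0.4, 1.0, size=50):     # irregular, uncorrelated RR intervals [s]
            t = np.arange(0.0, rr, dt)
            # half-sine inflow during the first 30% of each beat, zero otherwise
            q = np.where(t < 0.3*rr, 300.0*np.sin(np.pi*t/(0.3*rr)), 0.0)
            for qi in q:                              # dp/dt = (q - p/R) / C
                p += dt * (qi - p/R) / C
                pressures.append(p)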

  6. Use of computer graphics simulation for teaching of flexible sigmoidoscopy.

    Science.gov (United States)

    Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P

    1991-05-01

    The concept of simulation training in endoscopy is now well-established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphics routines--such as object-oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desk-top microcomputers. By boosting existing computer 'horsepower' with next-generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.
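
    For reference, a minimal sketch of the intensity interpolation (Gouraud shading) mentioned above: pixel intensity inside a triangle is the barycentric blend of the three vertex intensities. The vertex coordinates and intensities in the example call are arbitrary.

        def gouraud_intensity(p, v0, v1, v2, i0, i1, i2):
            """Interpolate vertex intensities i0..i2 at pixel p inside triangle v0-v2."""
            d = (v1[1]-v2[1])*(v0[0]-v2[0]) + (v2[0]-v1[0])*(v0[1]-v2[1])
            w0 = ((v1[1]-v2[1])*(p[0]-v2[0]) + (v2[0]-v1[0])*(p[1]-v2[1])) / d
            w1 = ((v2[1]-v0[1])*(p[0]-v2[0]) + (v0[0]-v2[0])*(p[1]-v2[1])) / d
            return w0*i0 + w1*i1 + (1.0 - w0 - w1)*i2

        # Arbitrary example: blend intensities 0.2, 0.9, 0.5 at an interior point
        print(gouraud_intensity((0.3, 0.3), (0, 0), (1, 0), (0, 1), 0.2, 0.9, 0.5))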

  7. Effect of computer game playing on baseline laparoscopic simulator skills.

    Science.gov (United States)

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and for prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  8. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation, and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A cluster of 24 dual-Xeon PCs was used to compute the projections and reconstructed images in parallel, with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation.
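
    Two of the ingredients above admit very small sketches: a Weyl-sequence uniform generator (x_n = frac(n·α) for irrational α) and Poisson quantum noise added to an analytical projection. The photon fluence and line-integral values are illustrative assumptions, and numpy's Poisson sampler stands in for the paper's parallel generator.

        import numpy as np

        def weyl_sequence(n, alpha=np.sqrt(2.0), start=1):
            """Low-discrepancy uniform samples in [0, 1): frac(i * alpha)."""
            i = np.arange(start, start + n, dtype=np.float64)
            return np.mod(i * alpha, 1.0)

        N0 = 1.0e5                                   # unattenuated photons per detector element (assumed)
        line_integrals = np.array([0.5, 1.0, 2.0])   # hypothetical mu*t values along three rays
        expected = N0 * np.exp(-line_integrals)      # Beer-Lambert mean counts
        noisy = np.random.poisson(expected)          # one quantum-noise realization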

  9. The Simulation and Analysis of the Closed Die Hot Forging Process by A Computer Simulation Method

    Directory of Open Access Journals (Sweden)

    Dipakkumar Gohil

    2012-06-01

    The objective of this research work is to study the variation of various parameters such as stress, strain, temperature, force, etc. during the closed die hot forging process. A computer simulation modeling approach has been adopted to transform the theoretical aspects into a computer algorithm which is used to simulate and analyze the closed die hot forging process. For the purpose of process study, the entire deformation process has been divided appropriately into a finite number of steps, and the output values have been computed at each deformation step. The results of the simulation have been graphically represented, and suitable corrective measures are recommended if the simulation results do not agree with the theoretical values. This computer simulation approach would significantly improve the productivity and reduce the energy consumption of the overall process for components manufactured by the closed die forging process, and contribute towards the efforts in reducing global warming.

  10. Prototyping and Simulating Parallel, Distributed Computations with VISA

    National Research Council Canada - National Science Library

    Demeure, Isabelle M; Nutt, Gary J

    1989-01-01

    [...] to support the design, prototyping, and simulation of parallel, distributed computations. In particular, VISA is meant to guide the choice of partitioning and communication strategies for such computations, based on their performance...

  11. Slab cooling system design using computer simulation

    NARCIS (Netherlands)

    Lain, M.; Zmrhal, V.; Drkal, F.; Hensen, J.L.M.

    2007-01-01

    For a new technical library building in Prague, computer simulations were carried out to help design the slab cooling system and optimize the capacity of the chillers. The paper presents the concept of the new technical library's HVAC system, the model of the building, and results of the energy simulations for

  12. Biocellion: accelerating computer simulation of multicellular biological system models.

    Science.gov (United States)

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in a soil aggregate as case studies. Biocellion runs on x86-compatible systems with the 64-bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information.
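
    To illustrate the fill-in-the-routine pattern (with hypothetical Python stand-ins, not Biocellion's actual C++ API): the framework owns the loops and neighbor queries, and the modeler supplies only the per-agent update rule.

        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class Cell:
            kind: int
            position: np.ndarray

        def update_agent(cell, neighbors, dt=0.1):
            """User-supplied rule: drift toward like-typed neighbors (cell sorting)."""
            like = [n for n in neighbors if n.kind == cell.kind]
            if like:
                drift = sum((n.position - cell.position for n in like), np.zeros(2)) / len(like)
                cell.position = cell.position + dt * drift

        # The "framework" part: iterate agents and hand each one its neighborhood
        cells = [Cell(kind=i % 2, position=np.random.rand(2)) for i in range(100)]
        for c in cells:
            nbrs = [n for n in cells
                    if n is not c and np.linalg.norm(n.position - c.position) < 0.2]
            update_agent(c, nbrs)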

  13. How Many Times Should One Run a Computational Simulation?

    DEFF Research Database (Denmark)

    Seri, Raffaello; Secchi, Davide

    2017-01-01

    This chapter is an attempt to answer the question “how many runs of a computational simulation should one do,” and it gives an answer by means of statistical analysis. After defining the nature of the problem and which types of simulation are mostly affected by it, the article introduces statistical...

  14. Computer simulation of gear tooth manufacturing processes

    Science.gov (United States)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  15. Multidetector-row computed tomography coronary angiography. Optimization of image reconstruction phase according to the heart rate

    International Nuclear Information System (INIS)

    Nagatani, Yukihiro; Takahashi, Masashi; Takazakura, Ryutaro; Nitta, Norihisa; Murata, Kiyoshi; Ushio, Noritoshi; Matsuo, Shinro; Yamamoto, Takashi; Horie, Minoru

    2007-01-01

    The purpose of this study was to optimize the image reconstruction phase of multidetector-row computed tomography (MDCT) coronary angiography according to the heart rate. Scan data were reconstructed for 10 different phases in 58 sequential patients who underwent 8-row cardiac MDCT. The obtained images were scored and compared in terms of motion artifacts and visibility of the vessels; moreover, electrocardiogram (ECG) record-based evaluations were added to clarify the temporal relationships among these 10 phases. In the cases with lower heart rates (<65 beats/min), the optimal images were obtained in mid diastole; in the cases with higher heart rates (>65 beats/min), they were obtained in the late systolic period. As the heart rate increased, the optimal image reconstruction phase changed from mid diastole to late systole. However, it is recommended to try to decrease the heart rate of patients before data acquisition. (author)

  16. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

    Science.gov (United States)

    Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

    2017-01-08

    Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered a comprehensive data-intensive and computing-intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which otherwise greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves a 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy improves performance by about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing-intensive and data-intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
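
    The map/shuffle/reduce shape of the accumulation can be sketched in a few lines; the range-bin indices and complex contributions below are placeholder values for illustration, not real SAR data.

        from itertools import groupby

        # Map phase: each (target, pulse) pair emits a (range-bin, complex contribution) record
        mapped = [(3, 0.2 + 0.1j), (7, 1.0 + 0.0j), (3, 0.0 + 0.5j)]

        # Shuffle: group emitted records by range bin
        mapped.sort(key=lambda kv: kv[0])

        # Reduce phase: sum the contributions per bin to form the raw-data samples
        raw = {bin_: sum(v for _, v in group)
               for bin_, group in groupby(mapped, key=lambda kv: kv[0])}
        print(raw)   # {3: (0.2+0.6j), 7: (1+0j)}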

  17. A novel left heart simulator for the multi-modality characterization of native mitral valve geometry and fluid mechanics.

    Science.gov (United States)

    Rabbah, Jean-Pierre; Saikrishnan, Neelakantan; Yoganathan, Ajit P

    2013-02-01

    Numerical models of the mitral valve have been used to elucidate mitral valve function and mechanics. These models have evolved from simple two-dimensional approximations to complex three-dimensional fully coupled fluid structure interaction models. However, to date these models lack direct one-to-one experimental validation. As computational solvers vary considerably, experimental benchmark data are critically important to ensure model accuracy. In this study, a novel left heart simulator was designed specifically for the validation of numerical mitral valve models. Several distinct experimental techniques were collectively performed to resolve mitral valve geometry and hemodynamics. In particular, micro-computed tomography was used to obtain accurate and high-resolution (39 μm voxel) native valvular anatomy, which included the mitral leaflets, chordae tendineae, and papillary muscles. Three-dimensional echocardiography was used to obtain systolic leaflet geometry. Stereoscopic digital particle image velocimetry provided all three components of fluid velocity through the mitral valve, resolved every 25 ms in the cardiac cycle. A strong central filling jet (V ~ 0.6 m/s) was observed during peak systole with minimal out-of-plane velocities. In addition, physiologic hemodynamic boundary conditions were defined and all data were synchronously acquired through a central trigger. Finally, the simulator is a precisely controlled environment, in which flow conditions and geometry can be systematically prescribed and resultant valvular function and hemodynamics assessed. Thus, this work represents the first comprehensive database of high fidelity experimental data, critical for extensive validation of mitral valve fluid structure interaction simulations.

  18. Lung and heart dose volume analyses with CT simulator in tangential field irradiation of breast cancer

    International Nuclear Information System (INIS)

    Das, Indra J.; Cheng, Elizabeth C.; Fowble, Barbara

    1997-01-01

    Objective: Radiation pneumonitis and cardiac effects are directly related to the irradiated lung and heart volumes in the treatment fields. The central lung distance (CLD) from a tangential breast radiograph is shown to be a significant indicator of ipsilateral irradiated lung volume, based on empirically derived functions whose accuracy depends on the actual measured volume in the treatment position. A simple and accurate linear relationship with CLD, together with a retrospective analysis of the pattern of lung and heart dose volume, is presented with actual volume data from a CT simulator in the treatment of breast cancer. Materials and Methods: The heart and lung volumes in the tangential treatment fields were analyzed in 45 consecutive (22 left and 23 right breast) patients referred for CT simulation of the cone-down treatment. All patients in this study were immobilized and placed on an inclined breast board in the actual treatment setup. Both arms were stretched overhead uniformly to avoid collision with the scanner aperture. Radiopaque marks were placed on the medial and lateral borders of the tangential fields. All patients were scanned in spiral mode with a slice width and thickness of 3 mm each. The lung and heart structures as well as the irradiated areas were delineated on each slice and the respective volumes were accurately measured. The treatment beam parameters were recorded and the digitally reconstructed radiographs (DRRs) were generated for the CLD measurement and analysis. Results: Table 1 shows the volume statistics of the patients in this study. There is a large variation in the lung and heart volumes among patients. Due to differences in the shape of the right and left lungs, the percent irradiated volumes (PIV) are different. The PIV data have been shown to correlate with CLD with 2nd and 3rd degree polynomials; however, in this study a simple straight-line regression is used to provide better confidence than the higher order polynomials. The regression lines for the left and right

  19. SCI-Clone/32 - a distributed real time simulation system

    International Nuclear Information System (INIS)

    Wilks, C.F.

    1986-01-01

    Advances in engineering, and in particular in digital computers, have enabled the simulation manufacturers to deliver a realism of a kind undreamt of a decade ago. 32-bit computers ranging in processor power from several hundred thousand instructions per second to many millions are at the heart of each simulator. Gould has pioneered digital computers in simulation with real time systems using shared memory, parallel processors, 64KByte cache, and shadow memory. The market is planning for higher iteration rates, lower life cycle costs, and the development of part-task products. These demands can be met by distributing the tasks amongst nodal computers having a unique architecture for sharing data variables with minimal contention. (Auth.)

  20. Initial quality performance results using a phantom to simulate chest computed radiography

    Directory of Open Access Journals (Sweden)

    Muhogora Wilbroad

    2011-01-01

    The aim of this study was to develop a homemade phantom for quantitative quality control in chest computed radiography (CR). The phantom was constructed from copper, aluminium, and polymethylmethacrylate (PMMA) plates as well as Styrofoam materials. Depending on the combinations, the literature suggests that these materials can simulate the attenuation and scattering characteristics of lung, heart, and mediastinum. The lung, heart, and mediastinum regions were simulated by 10 mm x 10 mm x 0.5 mm, 10 mm x 10 mm x 0.5 mm and 10 mm x 10 mm x 1 mm copper plates, respectively. A test object of 100 mm x 100 mm and 0.2 mm thick copper was positioned in each region for CNR measurements. The phantom was exposed to x-rays generated by different tube potentials that covered settings in clinical use: 110-120 kVp (HVL = 4.26-4.66 mm Al) at a source image distance (SID) of 180 cm. An approach similar to the recommended method in digital mammography was applied to determine the CNR values of phantom images produced by a Kodak CR 850A system with post-processing turned off. Subjective contrast-detail studies were also carried out by using images of a Leeds TOR CDR test object acquired under similar exposure conditions as during the CNR measurements. For the clinical kVp conditions relevant to chest radiography, the CNR was highest over the 90-100 kVp range. The CNR data correlated with the results of the contrast-detail observations. The values of clinical tube potentials at which the CNR is highest are regarded as optimal kVp settings. The simplicity of the phantom construction can allow easy implementation of a related quality control program.
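
    A minimal sketch of a contrast-to-noise computation in the spirit of the digital mammography method cited above; the mean-difference-over-background-noise definition shown here is the common form, an assumption rather than the paper's exact expression, and the ROI pixel arrays would be extracted from the acquired image.

        import numpy as np

        def cnr(roi_signal, roi_background):
            """CNR = |mean(signal ROI) - mean(background ROI)| / std(background ROI)."""
            return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std()

        # e.g., with hypothetical pixel windows inside and beside the copper test object:
        # value = cnr(img[y0:y1, x0:x1], img[y0:y1, x2:x3])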

  1. The visual simulators for architecture and computer organization learning

    OpenAIRE

    Nikolić Boško; Grbanović Nenad; Đorđević Jovan

    2009-01-01

    The paper proposes a method for effective distance learning of architecture and computer organization. The proposed method is based on a software system that can be applied in any course in this field. Within this system, students can observe simulations of already created computer systems. The system also provides for the creation and simulation of switch systems.

  2. Programme for the simulation of the TPA-i 1001 computer on the CDC-1604-A computer

    International Nuclear Information System (INIS)

    Belyaev, A.V.

    1976-01-01

    The basic features and capacities of the program simulating the TPA-i 1001 computer on the CDC-1604-A are described. The program is essentially aimed at the translation of programs in the SLAHG language for TPA-type computers. The basic part of the program simulates the work of the central TPA processor. This subprogram performs, one after another, the actions that change the registers and memory states of the TPA computer in the necessary manner. The simulated TPA computer has subprograms analogous to external devices, i.e. the ASR-33 teletype, the FS 1501 tape reader, and the FACIT perforator. Work under the simulation program takes 1.65-2 times less time than work with a TPA with the minimum set of external equipment. [ru]

  3. Large scale particle simulations in a virtual memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Million, R.; Wagner, J.S.; Tajima, T.

    1983-01-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time. (orig.)

  4. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time

  5. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    Science.gov (United States)

    When the news media talk about models, they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  6. The effect of heart rate and contractility on the measurement of left ventricular mass by 201Tl SPECT

    International Nuclear Information System (INIS)

    Machac, J.; Vaquer, R.; Levin, H.; Horowitz, S.F.; Mount Sinai Medical Center, New York

    1987-01-01

    Left ventricular myocardial mass can be measured by 201Tl SPECT, but the effects of changes in heart rate and contractility have not been determined. We constructed a dynamic computer model simulating the contracting left ventricle. Thirty-two summed static views at each of 3 heart rates and 3 ejection fractions were manufactured to simulate a 180° acquisition. Each image set underwent tomographic reconstruction. Left ventricular mass was measured at a fixed percent threshold in each slice. The results show that left ventricular mass varied little with heart rate (4%) and only slightly more (8%) with ejection fraction. Thus, in the normal clinical setting, left ventricular mass measurements by SPECT are minimally affected by the dynamic state of the heart. (orig.)
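
    A small sketch of the fixed-percent-threshold measurement described above: voxels above a fixed fraction of the maximum count are summed across slices and converted to mass. The threshold, voxel volume, and myocardial density values are illustrative assumptions.

        import numpy as np

        def lv_mass_grams(slices, threshold=0.5, voxel_ml=0.064, density_g_per_ml=1.05):
            """slices: list of 2D count arrays from the reconstruction.
            Count voxels exceeding threshold * slice maximum, then convert
            the thresholded volume to mass."""
            voxels = sum(int((s >= threshold * s.max()).sum()) for s in slices)
            return voxels * voxel_ml * density_g_per_ml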

  7. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  8. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    Science.gov (United States)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  9. A Computer Simulation of Community Pharmacy Practice for Educational Use.

    Science.gov (United States)

    Bindoff, Ivan; Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-11-15

    To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. We developed a flexible and customizable computer simulation of community pharmacy. Using it, students are able to work through scenarios that encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor.

  10. Seventeenth Workshop on Computer Simulation Studies in Condensed-Matter Physics

    CERN Document Server

    Landau, David P; Schütler, Heinz-Bernd

    2006-01-01

    This status report features the most recent developments in the field, spanning a wide range of topical areas in the computer simulation of condensed matter/materials physics. Both established and new topics are included, ranging from the statistical mechanics of classical magnetic spin models to electronic structure calculations, quantum simulations, and simulations of soft condensed matter. The book presents new physical results as well as novel methods of simulation and data analysis. Highlights of this volume include various aspects of non-equilibrium statistical mechanics, studies of properties of real materials using both classical model simulations and electronic structure calculations, and the use of computer simulations in teaching.

  11. Parallel Monte Carlo simulations on an ARC-enabled computing grid

    International Nuclear Information System (INIS)

    Nilsen, Jon K; Samset, Bjørn H

    2011-01-01

    Grid computing opens new possibilities for running heavy Monte Carlo simulations of physical systems in parallel. The presentation gives an overview of GaMPI, a system for running an MPI-based random walker simulation on grid resources. Integrating the ARC middleware and the new storage system Chelonia with the Ganga grid job submission and control system, we show that MPI jobs can be run on a world-wide computing grid with good performance and promising scaling properties. Results for relatively communication-heavy Monte Carlo simulations run on multiple heterogeneous, ARC-enabled computing clusters in several countries are presented.
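
    A minimal sketch of the MPI random-walker pattern such a system runs (walker counts, step numbers, and the reduced statistic are illustrative assumptions; this is not GaMPI's actual code):

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rng = np.random.default_rng(seed=comm.Get_rank())

        # Every rank advances its own batch of 1D walkers independently
        walkers = np.zeros(10_000)
        for _ in range(1000):
            walkers += rng.choice((-1.0, 1.0), size=walkers.size)

        # Reduce a summary statistic (mean squared displacement) to rank 0
        local_msd = np.mean(walkers**2)
        global_msd = comm.reduce(local_msd, op=MPI.SUM, root=0)
        if comm.Get_rank() == 0:
            print("mean squared displacement:", global_msd / comm.Get_size())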

  12. Computer simulation in nuclear science and engineering

    International Nuclear Information System (INIS)

    Akiyama, Mamoru; Miya, Kenzo; Iwata, Shuichi; Yagawa, Genki; Kondo, Shusuke; Hoshino, Tsutomu; Shimizu, Akinao; Takahashi, Hiroshi; Nakagawa, Masatoshi.

    1992-01-01

    The numerical simulation technology used for the design of nuclear reactors spans a wide range of scientific fields, and it is a cultivated technology which grew through steady efforts toward high calculation accuracy in safety examinations, reliability verification tests, the assessment of operation results and so on. On the occasion of numerical simulation being put to practical use in wide fields, the numerical simulation of five basic equations which describe the natural world, and the progress of the related technologies, are reviewed. It is expected that numerical simulation technology contributes not only as a means of design study but also to the progress of science and technology, such as the construction of new innovative concepts and the exploration of new mechanisms and substances, of which the models do not exist in the natural world. The development of atomic energy and the progress of computers, Boltzmann's transport equation and its periphery, Navier-Stokes' equation and its periphery, Maxwell's electromagnetic field equation and its periphery, Schroedinger wave equation and its periphery, computational solid mechanics and its periphery, and probabilistic risk assessment and its periphery are described. (K.I.)

  13. Computational fluid dynamics simulations of light water reactor flows

    International Nuclear Information System (INIS)

    Tzanos, C.P.; Weber, D.P.

    1999-01-01

    Advances in computational fluid dynamics (CFD), turbulence simulation, and parallel computing have made feasible the development of three-dimensional (3-D) single-phase and two-phase flow CFD codes that can simulate fluid flow and heat transfer in realistic reactor geometries with significantly reduced reliance, especially in single phase, on empirical correlations. The objective of this work was to assess the predictive power and computational efficiency of a CFD code in the analysis of a challenging single-phase light water reactor problem, as well as to identify areas where further improvements are needed

  14. COMPUTER LEARNING SIMULATOR WITH VIRTUAL REALITY FOR OPHTHALMOLOGY

    Directory of Open Access Journals (Sweden)

    Valeria V. Gribova

    2013-01-01

    A toolset for a medical computer learning simulator for ophthalmology with virtual reality, and its implementation, are considered in the paper. The simulator is aimed at professional skills training for students of medical universities.

  15. Simulation in computer forensics teaching: the student experience

    OpenAIRE

    Crellin, Jonathan; Adda, Mo; Duke-Williams, Emma; Chandler, Jane

    2011-01-01

    The use of simulation in teaching computing is well established, with digital forensic investigation being a subject area where the range of simulation required is both wide and varied, demanding a corresponding breadth of fidelity. Each type of simulation can be complex and expensive to set up, resulting in students having only limited opportunities to participate and learn from the simulation. For example, students' participation in mock trials in the University mock courtroom or in simulation...

  16. Molecular dynamics simulations and applications in computational toxicology and nanotoxicology.

    Science.gov (United States)

    Selvaraj, Chandrabose; Sakkiah, Sugunadevi; Tong, Weida; Hong, Huixiao

    2018-02-01

    Nanotoxicology studies the toxicity of nanomaterials and has been widely applied in biomedical research to explore the toxicity of various biological systems. Investigating biological systems through in vivo and in vitro methods is expensive and time-consuming. Therefore, computational toxicology, a multi-discipline field that utilizes computational power and algorithms to examine the toxicology of biological systems, has gained attraction among scientists. Molecular dynamics (MD) simulations of biomolecules such as proteins and DNA are popular for understanding the interactions between biological systems and chemicals in computational toxicology. In this paper, we review MD simulation methods, the protocol for running MD simulations and their applications in studies of toxicity and nanotechnology. We also briefly summarize some popular software tools for the execution of MD simulations.

  17. Computer simulation as representation of knowledge in education

    International Nuclear Information System (INIS)

    Krekic, Valerija Pinter; Namestovski, Zolt

    2009-01-01

    According to Aebli's operative method (1963) and Bruner's (1974) theory of representation, the development of the process of thinking in teaching has the following phases - levels of abstraction: manipulation of specific things (specific phase), iconic representation (figural phase), and symbolic representation (symbolic phase). Modern information technology has contributed to the enrichment of teaching and learning processes, especially in the fields of natural sciences and mathematics and those of production and technology. Simulation appears as a new possibility in the representation of knowledge. According to Guetzkow (1972), simulation is an operative representation of reality from a relevant aspect. It concerns a model of an objective system, which is dynamic in itself. If that model is material, it is a simple simulation; if it is abstract, it is a reflective experiment, that is, a computer simulation. The present work deals with the systematization and classification of simulation methods in the teaching of natural sciences and mathematics and of production and technology, with a special retrospective view of computer simulations and an exemplary presentation of the place and the role of this modern method of cognition. Key words: representation of knowledge, modeling, simulation, education

  18. Computer simulations of shear thickening of concentrated dispersions

    NARCIS (Netherlands)

    Boersma, W.H.; Laven, J.; Stein, H.N.

    1995-01-01

    Stokesian dynamics computer simulations were performed on monolayers of equally sized spheres. The influence of repulsive and attractive forces on the rheological behavior and on the microstructure was studied. Under specific conditions shear thickening could be observed in the simulations, usually

  19. Computational fluid dynamics simulations and validations of results

    CSIR Research Space (South Africa)

    Sitek, MA

    2013-09-01

    The influence of wind flow on a high-rise building is analyzed. The research covers full-scale tests, wind-tunnel experiments and numerical simulations. In the present paper, the computational model used in the simulations is described and the results, which were...

  20. Augmented Reality Simulations on Handheld Computers

    Science.gov (United States)

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  1. Computer Simulation of the Circulation Subsystem of a Library

    Science.gov (United States)

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
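
    The kind of model the abstract describes can be reduced to a small discrete-event loop. The following sketch, with invented arrival-rate and loan-period parameters, estimates availability and mean delay for a single-copy title; it illustrates the approach rather than reproducing the author's model.

      import random

      # Minimal discrete-event sketch of a single-copy loan process; the
      # request rate and loan period are invented illustrative parameters.
      random.seed(1)
      REQUEST_RATE = 1 / 30.0    # one request every 30 days on average
      LOAN_PERIOD = 21.0         # days a borrowed copy stays out

      t, due_back = 0.0, 0.0
      served = delayed = 0
      total_wait = 0.0
      for _ in range(10_000):                        # simulated requests
          t += random.expovariate(REQUEST_RATE)      # next request arrives
          if t >= due_back:                          # copy on shelf: lend at once
              served += 1
              due_back = t + LOAN_PERIOD
          else:                                      # copy out: patron queues
              delayed += 1
              total_wait += due_back - t
              due_back += LOAN_PERIOD                # FIFO reservation queue
      print(f"availability: {served / (served + delayed):.1%}")
      print(f"mean delay when unavailable: {total_wait / max(delayed, 1):.1f} days")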

  2. Using EDUCache Simulator for the Computer Architecture and Organization Course

    Directory of Open Access Journals (Sweden)

    Sasko Ristov

    2013-07-01

    Full Text Available The computer architecture and organization course is essential in all computer science and engineering programs, and the most selected and liked elective course for related engineering disciplines. However, this attractiveness brings a new challenge: it requires a lot of effort by the instructor to explain rather complicated concepts to beginners or to those who study related disciplines. The usage of visual simulators can improve both the teaching and learning processes. The overall goal is twofold: (1) to enable a visual environment to explain the basic concepts and (2) to increase the students' willingness and ability to learn the material. A lot of visual simulators have been used for the computer architecture and organization course. However, due to the lack of visual simulators for simulation of cache memory concepts, we have developed a new visual simulator, the EDUCache simulator. In this paper we present that it can be effectively and efficiently used as a supporting tool in the learning process of modern multi-layer, multi-cache and multi-core multi-processors. EDUCache's features enable an environment for performance evaluation and engineering of software systems, i.e. the students will also understand the importance of computer architecture building parts and, hopefully, will increase their curiosity for hardware courses in general.
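
    As a flavor of the concepts such a simulator visualizes, the sketch below models a small direct-mapped cache and replays a short address trace; the cache geometry and the trace are invented for illustration, and EDUCache itself is not being reproduced here.

      # Direct-mapped cache: 16 lines of 16 bytes (assumed sizes).
      LINES, LINE_SIZE = 16, 16
      tags = [None] * LINES

      def access(addr):
          """Return 'hit' or 'miss' for a byte address."""
          block = addr // LINE_SIZE
          index = block % LINES          # which cache line the block maps to
          tag = block // LINES           # identifies the block within that line
          if tags[index] == tag:
              return "hit"
          tags[index] = tag              # miss: fill (or evict and refill) the line
          return "miss"

      trace = [0, 4, 8, 256, 0, 260, 512, 4]   # sample byte-address trace
      for a in trace:
          print(f"addr {a:4d}: {access(a)}")   # shows conflict misses at index 0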

  3. Correlation of radiation dose and heart rate in dual-source computed tomography coronary angiography.

    Science.gov (United States)

    Laspas, Fotios; Tsantioti, Dimitra; Roussakis, Arkadios; Kritikos, Nikolaos; Efthimiadou, Roxani; Kehagias, Dimitrios; Andreou, John

    2011-04-01

    Computed tomography coronary angiography (CTCA) has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but the relatively high radiation dose remains a major concern. Our aim was to evaluate the relationship between radiation exposure and heart rate (HR) in dual-source CTCA. Data from 218 CTCA examinations, performed with a dual-source 64-slice scanner, were statistically evaluated. Effective radiation dose, expressed in mSv, was calculated as the product of the dose-length product (DLP) and a conversion coefficient for the chest (mSv = DLP × 0.017). The heart rate range and mean heart rate, expressed in beats per minute (bpm), of each individual during CTCA were also provided by the system. Statistical analysis of effective dose and heart rate data was performed using the Pearson correlation coefficient and the two-sample t-test. Mean HR and effective dose were found to have a borderline positive relationship. Individuals with a mean HR >65 bpm were observed to receive a statistically significantly higher effective dose than those with a mean HR ≤65 bpm. Moreover, a strong correlation between effective dose and HR variability of more than 20 bpm was observed. Dual-source CT scanners are considered capable of providing diagnostic examinations even with high HR and arrhythmias. However, it is desirable to keep the mean heart rate below 65 bpm and heart rate fluctuation less than 20 bpm in order to reduce radiation exposure.
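
    The dose calculation used in the study is simple enough to state directly. A minimal sketch, with invented DLP values for the two heart-rate groups:

      # Effective dose = DLP times the chest conversion coefficient
      # k = 0.017 mSv/(mGy*cm), as stated in the abstract.
      K_CHEST = 0.017

      def effective_dose(dlp_mgy_cm: float) -> float:
          """Effective dose in mSv from a dose-length product in mGy*cm."""
          return dlp_mgy_cm * K_CHEST

      # Illustrative (invented) DLP values for the two heart-rate groups:
      for label, dlp in [("mean HR <= 65 bpm", 600.0), ("mean HR > 65 bpm", 900.0)]:
          print(f"{label}: DLP {dlp:.0f} mGy*cm -> {effective_dose(dlp):.1f} mSv")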

  4. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  5. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Carolyn L., E-mail: wangcl@uw.edu [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Schopp, Jennifer G.; Kani, Kimia [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States); Petscavage-Thomas, Jonelle M. [Penn State Hershey Medical Center, Department of Radiology, 500 University Drive, Hershey, PA 17033 (United States); Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H. [Department of Radiology, University of Washington, Box 357115, 1959 NE Pacific Street, Seattle, WA 98195-7115 (United States)

    2013-12-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups' written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups' scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  6. Prospective randomized study of contrast reaction management curricula: Computer-based interactive simulation versus high-fidelity hands-on simulation

    International Nuclear Information System (INIS)

    Wang, Carolyn L.; Schopp, Jennifer G.; Kani, Kimia; Petscavage-Thomas, Jonelle M.; Zaidi, Sadaf; Hippe, Dan S.; Paladin, Angelisa M.; Bush, William H.

    2013-01-01

    Purpose: We developed a computer-based interactive simulation program for teaching contrast reaction management to radiology trainees and compared its effectiveness to high-fidelity hands-on simulation training. Materials and methods: IRB approved HIPAA compliant prospective study of 44 radiology residents, fellows and faculty who were randomized into either the high-fidelity hands-on simulation group or computer-based simulation group. All participants took separate written tests prior to and immediately after their intervention. Four months later participants took a delayed written test and a hands-on high-fidelity severe contrast reaction scenario performance test graded on predefined critical actions. Results: There was no statistically significant difference between the computer and hands-on groups' written pretest, immediate post-test, or delayed post-test scores (p > 0.6 for all). Both groups' scores improved immediately following the intervention (p < 0.001). The delayed test scores 4 months later were still significantly higher than the pre-test scores (p ≤ 0.02). The computer group's performance was similar to the hands-on group on the severe contrast reaction simulation scenario test (p = 0.7). There were also no significant differences between the computer and hands-on groups in performance on the individual core competencies of contrast reaction management during the contrast reaction scenario. Conclusion: It is feasible to develop a computer-based interactive simulation program to teach contrast reaction management. Trainees that underwent computer-based simulation training scored similarly on written tests and on a hands-on high-fidelity severe contrast reaction scenario performance test as those trained with hands-on high-fidelity simulation.

  7. Computational Particle Dynamic Simulations on Multicore Processors (CPDMu) Final Report Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Schmalz, Mark S

    2011-07-24

    Statement of Problem - Department of Energy has many legacy codes for simulation of computational particle dynamics and computational fluid dynamics applications that are designed to run on sequential processors and are not easily parallelized. Emerging high-performance computing architectures employ massively parallel multicore architectures (e.g., graphics processing units) to increase throughput. Parallelization of legacy simulation codes is a high priority, to achieve compatibility, efficiency, accuracy, and extensibility. General Statement of Solution - A legacy simulation application designed for implementation on mainly sequential processors has been represented as a graph G. Mathematical transformations, applied to G, produce a graph representation G' for a high-performance architecture. Key computational and data movement kernels of the application were analyzed/optimized for parallel execution using the mapping G → G', which can be performed semi-automatically. This approach is widely applicable to many types of high-performance computing systems, such as graphics processing units or clusters comprised of nodes that contain one or more such units. Phase I Accomplishments - Phase I research decomposed/profiled computational particle dynamics simulation code for rocket fuel combustion into low and high computational cost regions (respectively, mainly sequential and mainly parallel kernels), with analysis of space and time complexity. Using the research team's expertise in algorithm-to-architecture mappings, the high-cost kernels were transformed, parallelized, and implemented on Nvidia Fermi GPUs. Measured speedups (GPU with respect to single-core CPU) were approximately 20-32X for realistic model parameters, without final optimization. Error analysis showed no loss of computational accuracy. Commercial Applications and Other Benefits - The proposed research will constitute a breakthrough in solution of problems related to efficient

  8. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons, such as the images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between the number of iterations and the simulations was estimated, and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations; moreover, assuming no radiation scattering in the vicinity of the detectors minimized artifacts in the reconstructed image.

  9. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    Science.gov (United States)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

  10. Using computer simulations to probe the structure and dynamics of biopolymers

    International Nuclear Information System (INIS)

    Levy, R.M.; Hirata, F.; Kim, K.; Zhang, P.

    1987-01-01

    The use of computer simulations to study internal motions and thermodynamic properties is receiving increased attention. One important use of the method is to provide a more fundamental understanding of the molecular information contained in various kinds of experiments on these complex systems. In the first part of this paper the authors review recent work in their laboratory concerned with the use of computer simulations for the interpretation of experimental probes of molecular structure and dynamics of proteins and nucleic acids. The interplay between computer simulations and three experimental techniques is emphasized: (1) nuclear magnetic resonance relaxation spectroscopy, (2) refinement of macro-molecular x-ray structures, and (3) vibrational spectroscopy. The treatment of solvent effects in biopolymer simulations is a difficult problem. It is not possible to study systematically the effect of solvent conditions, e.g. added salt concentration, on biopolymer properties by means of simulations alone. In the last part of the paper the authors review a more analytical approach they developed to study polyelectrolyte properties of solvated biopolymers. The results are compared with computer simulations

  11. Digital control computer upgrade at the Cernavoda NPP simulator

    International Nuclear Information System (INIS)

    Ionescu, T.

    2006-01-01

    The Plant Process Computer equips some nuclear power plants, such as the CANDU-600, with centralized control performed by an assembly of two computers known as Digital Control Computers (DCC). The DCCs work in parallel to drive the plant safely at steady state and during normal maneuvers, as well as during abnormal transients, when the plant is automatically steered to a safe state. The centralized control comprises both hardware and software, which must be present in the full-scope simulator and whose configuration is subject to change, with specific requirements, during the life of the plant and of the simulator; these aspects are covered by this subsection.

  12. Dilatation of the heart on postmortem computed tomography (PMCT). Comparison with live CT

    Energy Technology Data Exchange (ETDEWEB)

    Shiotani, Seiji; Kohno, Mototsugu; Ohashi, Noriyoshi; Yamazaki, Kentaro; Nakayama, Hidetsugu; Watanabe, Ko [Tsukuba Medical Center Hospital, Ibaraki (Japan); Itai, Yuji [Tsukuba Univ., Ibaraki (Japan). Inst. of Clinical Medicine

    2003-02-01

    The purpose of this study was to delineate cardiac structures on postmortem computed tomography (PMCT) and to quantitatively demonstrate dilatation of the heart after death. Our subjects were 50 PMCT examinations of non-traumatic deaths and 50 CT examinations of living persons (live CT). We measured the maximal and minimal diameters of the superior vena cava (SVC) at three levels (upper, middle, and lower), and of the inferior vena cava (IVC), pulmonary artery (PA), pulmonary vein (PV), right atrium (RA), and left atrium (LA). Then the product of the maximal and minimal diameters and the eccentricity were calculated. The maximal and minimal diameters of the heart were significantly longer than those on live CT, except for the maximal diameter of the SVC at the upper level and the maximal diameter of the PA. All of the products of maximal by minimal diameter on PMCT were significantly larger than those on live CT. All of the eccentricities decreased significantly after death except for the LA. The heart is dilated on PMCT, and the right side of it dilates toward a round shape. (author)
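
    The two indices reported can be computed from a pair of orthogonal diameters. A minimal sketch, assuming the standard ellipse definition of eccentricity (which the abstract does not spell out) and using invented diameters that mirror the reported trend toward a rounder right heart after death:

      import math

      def indices(d_max: float, d_min: float):
          """Product of diameters and ellipse eccentricity from two diameters."""
          product = d_max * d_min                     # proxy for cross-sectional size
          ecc = math.sqrt(1 - (d_min / d_max) ** 2)   # 0 = circle, near 1 = flattened
          return product, ecc

      # Invented diameters (mm), purely illustrative:
      for label, dmax, dmin in [("live CT RA", 45, 30), ("PMCT RA", 52, 44)]:
          p, e = indices(dmax, dmin)
          print(f"{label}: product {p:.0f} mm^2, eccentricity {e:.2f}")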

  13. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    Bowden, R.S.M.; Hacking, D.

    1978-01-01

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  14. Computer simulation games in population and education.

    Science.gov (United States)

    Moreland, R S

    1988-01-01

    Computer-based simulation games are effective training tools that have several advantages. They enable players to learn in a nonthreatening manner and develop strategies to achieve goals in a dynamic environment. They also provide visual feedback on the effects of players' decisions, encourage players to explore and experiment with options before making final decisions, and develop players' skills in analysis, decision making, and cooperation. Two games have been developed by the Research Triangle Institute for public-sector planning agencies interested in or dealing with developing countries. The UN Population and Development Game teaches players about the interaction between population variables and the national economy and how population policies complement other national policies, such as education. The BRIDGES Education Planning Game focuses on the effects education has on national policies. In both games, the computer simulates the reactions of a fictional country's socioeconomic system to players' decisions. Players can change decisions after seeing their effects on a computer screen and thus can improve their performance in achieving goals.

  15. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.

  16. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles and implementation of simulated annealing as a modern heuristic technique are presented. The simulated annealing algorithm is used in solving the real-life problem of computer laboratory scheduling in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...
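
    To make the technique concrete, the sketch below applies simulated annealing to a toy laboratory-scheduling instance; the problem data, cost function, and cooling schedule are all assumptions for illustration, not details from the paper.

      import math, random

      random.seed(0)
      N_SESSIONS, N_SLOTS, CAPACITY = 12, 4, 3   # sessions, time slots, seats per slot

      def cost(assign):
          """Penalty: total overbooking across all time slots."""
          return sum(max(0, assign.count(s) - CAPACITY) for s in range(N_SLOTS))

      assign = [random.randrange(N_SLOTS) for _ in range(N_SESSIONS)]
      cur, T = cost(assign), 2.0
      while T > 0.01:
          i = random.randrange(N_SESSIONS)            # pick a session to move
          old = assign[i]
          assign[i] = random.randrange(N_SLOTS)       # propose a new slot
          new = cost(assign)
          # Always accept improvements; accept worse moves with Boltzmann probability.
          if new <= cur or random.random() < math.exp((cur - new) / T):
              cur = new
          else:
              assign[i] = old                         # reject: undo the move
          T *= 0.999                                  # geometric cooling
      print("final overbooking penalty:", cur)
      print("slot loads:", [assign.count(s) for s in range(N_SLOTS)])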

  17. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  18. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    Science.gov (United States)

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on the geometric transformation software available in most computer graphics systems is described, along with program features. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  19. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  20. Faster quantum chemistry simulation on fault-tolerant quantum computers

    International Nuclear Information System (INIS)

    Cody Jones, N; McMahon, Peter L; Yamamoto, Yoshihisa; Whitfield, James D; Yung, Man-Hong; Aspuru-Guzik, Alán; Van Meter, Rodney

    2012-01-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay–Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ϵ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log ϵ) or O(log log ϵ); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride. (paper)

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
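
    A minimal discrete-event sketch of that two-part framework, using the simpy library with invented arrival, service, and capacity parameters (the actual model and its distributions are not published in the abstract):

      import random
      import simpy

      random.seed(42)
      waits = []

      def request(env, servers):
          """One service request: queue for a server, then hold it for service."""
          arrived = env.now
          with servers.request() as slot:
              yield slot                                   # wait for a free server
              waits.append(env.now - arrived)
              yield env.timeout(random.expovariate(1/5))   # service time, mean 5 min

      def demand(env, servers):
          """Demand process: requests arrive with exponential interarrival times."""
          while True:
              yield env.timeout(random.expovariate(1/2))   # mean gap 2 min
              env.process(request(env, servers))

      env = simpy.Environment()
      servers = simpy.Resource(env, capacity=3)            # resource constraint
      env.process(demand(env, servers))
      env.run(until=8 * 60)                                # one 8-hour shift
      print(f"{len(waits)} requests, mean wait {sum(waits)/len(waits):.2f} min")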

  2. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
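
    The reported figures can be checked against Amdahl's law, which governs the scaling behavior the abstract describes. A short sketch (the implied parallel fraction is an inference from the reported numbers, not a value from the paper):

      # Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n) for parallel fraction p.
      serial_min = 2.58 * 60          # 2.58 h on one instance, in minutes
      cloud_min = 3.3                 # observed runtime on 100 nodes
      observed = serial_min / cloud_min
      print(f"observed speedup: {observed:.0f}x")   # ~47x, as reported

      # Parallel fraction consistent with that speedup, from inverting Amdahl's law:
      n = 100
      p = (1 - 1 / observed) / (1 - 1 / n)
      print(f"implied parallel fraction: {p:.4f}")  # ~0.989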

  3. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  4. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  5. Computer simulation of high energy displacement cascades

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1990-01-01

    A methodology developed for modeling many aspects of high energy displacement cascades with molecular level computer simulations is reviewed. The initial damage state is modeled in the binary collision approximation (using the MARLOWE computer code), and the subsequent disposition of the defects within a cascade is modeled with a Monte Carlo annealing simulation (the ALSOME code). There are few adjustable parameters, and none are set to physically unreasonable values. The basic configurations of the simulated high energy cascades in copper, i.e., the number, size and shape of damage regions, compare well with observations, as do the measured numbers of residual defects and the fractions of freely migrating defects. The success of these simulations is somewhat remarkable, given the relatively simple models of defects and their interactions that are employed. The reason for this success is that the behavior of the defects is very strongly influenced by their initial spatial distributions, which the binary collision approximation adequately models. The MARLOWE/ALSOME system, with input from molecular dynamics and experiments, provides a framework for investigating the influence of high energy cascades on microstructure evolution. (author)

  6. The advanced computational testing and simulation toolkit (ACTS)

    International Nuclear Information System (INIS)

    Drummond, L.A.; Marques, O.

    2002-01-01

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  7. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  8. Effects of a System Thinking-Based Simulation Program for Congestive Heart Failure.

    Science.gov (United States)

    Kim, Hyeon-Young; Yun, Eun Kyoung

    2018-03-01

    This study evaluated a system thinking-based simulation program for the care of patients with congestive heart failure. Participants were 67 undergraduate nursing students from a nursing college in Seoul, South Korea. The experimental group was given a 4-hour system-thinking program and a 2-hour simulation program, whereas the control group had a 4-hour case study and a 2-hour simulation program. There were significant improvements in critical thinking in both groups, but no significant group differences between educational methods (F = 3.26, P = .076). Problem-solving ability in the experimental group was significantly higher than in the control group (F = 5.04, P = .028). Clinical competency skills in the experimental group were higher than in the control group (t = 2.12, P = .038). A system thinking-based simulation program is a more effective learning method in terms of problem-solving ability and clinical competency skills compared to the existing simulation program. Further research using a longitudinal study is needed to test the long-term effect of the intervention and apply it to the nursing curriculum.

  9. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    Science.gov (United States)

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economics computer simulation and to ascertain how a lecture plus computer simulation game compared, as a teaching method, with more traditional lecture and case-study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  10. SPINET: A Parallel Computing Approach to Spine Simulations

    Directory of Open Access Journals (Sweden)

    Peter G. Kropf

    1996-01-01

    Full Text Available Research in scientific programming enables us to realize more and more complex applications, and, on the other hand, application-driven demands on computing methods and power are continuously growing. Therefore, interdisciplinary approaches become more widely used. The interdisciplinary SPINET project presented in this article applies modern scientific computing tools to biomechanical simulations: parallel computing and symbolic and modern functional programming. The target application is the human spine. Simulations of the spine help us to investigate and better understand the mechanisms of back pain and spinal injury. Two approaches have been used: the first uses the finite element method for high-performance simulations of static biomechanical models, and the second generates a simulation development tool for experimenting with different dynamic models. A finite element program for static analysis has been parallelized for the MUSIC machine. To solve the sparse system of linear equations, a conjugate gradient solver (iterative method) and a frontal solver (direct method) have been implemented. The preprocessor required for the frontal solver is written in the modern functional programming language SML, the solver itself in C, thus exploiting the characteristic advantages of both functional and imperative programming. The speedup analysis of both solvers shows very satisfactory results for this irregular problem. A mixed symbolic-numeric environment for rigid body system simulations is presented. It automatically generates C code from a problem specification expressed by the Lagrange formalism using Maple.

  11. Computer simulation of two-phase flow in nuclear reactors

    International Nuclear Information System (INIS)

    Wulff, W.

    1993-01-01

    Two-phase flow models dominate the requirements of economic resources for the development and use of computer codes which serve to analyze thermohydraulic transients in nuclear power plants. An attempt is made to reduce the effort of analyzing reactor transients by combining purpose-oriented modelling with advanced computing techniques. Six principles are presented on mathematical modeling and the selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited than the two-fluid model for the analysis of two-phase flow in nuclear reactors, because of the latter's closure problems. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost. (orig.)

  12. Simulation and mechanistic investigation of the arrhythmogenic role of the late sodium current in human heart failure.

    Directory of Open Access Journals (Sweden)

    Beatriz Trenor

    Full Text Available Heart failure constitutes a major public health problem worldwide. The electrophysiological remodeling of failing hearts sets the stage for malignant arrhythmias, in which the role of the late Na+ current (I_NaL) is relevant and is currently under investigation. In this study we examined the role of I_NaL in the electrophysiological phenotype of ventricular myocytes, and its proarrhythmic effects in the failing heart. A model for cellular heart failure was proposed using a modified version of the Grandi et al. model for the human ventricular action potential that incorporates the formulation of I_NaL. A sensitivity analysis of the model was performed and simulations of the pathological electrical activity of the cell were conducted. The proposed model for the human I_NaL and the electrophysiological remodeling of myocytes from failing hearts accurately reproduce experimental observations. The sensitivity analysis of the modulation of electrophysiological parameters of myocytes from failing hearts due to ion channel remodeling revealed a role for I_NaL in the prolongation of action potential duration (APD), triangulation of the shape of the AP, and changes in the Ca2+ transient. A mechanistic investigation of intracellular Na+ accumulation and APD shortening with increasing stimulation frequency of failing myocytes revealed a role for the Na+/K+ pump, the Na+/Ca2+ exchanger and I_NaL. The results of the simulations also showed that in failing myocytes, the enhancement of I_NaL increased the reverse rate-dependent APD prolongation and the probability of initiating early afterdepolarizations. The electrophysiological remodeling of failing hearts, and especially the enhancement of I_NaL, prolongs APD and alters the Ca2+ transient, facilitating the development of early afterdepolarizations. An enhanced I_NaL appears to be an important contributor to the electrophysiological phenotype and to the dysregulation of [Ca2+]_i homeostasis of failing myocytes.

  13. Computer simulation of molecular sorption in zeolites

    International Nuclear Information System (INIS)

    Calmiano, Mark Daniel

    2001-01-01

    The work presented in this thesis encompasses the computer simulation of molecular sorption. In Chapter 1 we outline the aims and objectives of this work. Chapter 2 follows, in which an introduction to sorption in zeolites is presented, with discussion of the structure and properties of the main zeolites studied. Chapter 2 concludes with a description of the principles and theories of adsorption. In Chapter 3 we describe the methodology behind the work carried out in this thesis. In Chapter 4 we present our first computational study, that of the sorption of krypton in silicalite. We describe work carried out to investigate low energy sorption sites of krypton in silicalite, where we observe that krypton preferentially sorbs into the straight and sinusoidal channels over the channel intersections. We simulate single step type I adsorption isotherms and use molecular dynamics to study the diffusion of krypton and obtain diffusion coefficients and the activation energy. We compare our results to previous experimental and computational studies, which shows our work to be in good agreement. In Chapter 5 we present a systematic study of the sorption of oxygen and nitrogen in five lithium substituted zeolites using a transferable interatomic potential that we have developed from ab initio calculations. We show increased loading of nitrogen compared to oxygen in all five zeolites studied, as expected, and simulate adsorption isotherms, which we compare to experimental and simulated data in the literature. In Chapter 6 we present work on the sorption of ferrocene in the zeolite NaY. We show that a simulated, low energy sorption site for ferrocene is correctly located by comparing to X-ray powder diffraction results for this same system. The thesis concludes with overall conclusions and a discussion of opportunities for future work. (author)

  14. Factors cost effectively improved using computer simulations of ...

    African Journals Online (AJOL)

    LPhidza

    effectively managed using computer simulations in semi-arid conditions pertinent to much of sub-Saharan Africa. ... small scale farmers to obtain optimal crop yields thus ensuring their food security and livelihood is ... those that simultaneously incorporate and simulate processes involved throughout the course of crop ...

  15. Computer simulation of the effect of dDAVP with saline loading on fluid balance after 24-hour head-down tilt

    Science.gov (United States)

    Srinivasan, R. S.; Simanonok, K. E.; Charles, J. B.

    1994-01-01

    Fluid loading (FL) before Shuttle reentry is a countermeasure currently in use by NASA to improve the orthostatic tolerance of astronauts during reentry and postflight. The fluid load consists of water and salt tablets equivalent to 32 oz (946 ml) of isotonic saline. However, the effectiveness of this countermeasure has been observed to decrease with the duration of spaceflight. The countermeasure's effectiveness may be improved by enhancing fluid retention using analogs of vasopressin such as lypressin (LVP) and desmopressin (dDAVP). In a computer simulation study reported previously, we attempted to assess the improvement in fluid retention obtained by the use of LVP administered before FL. The present study is concerned with the use of dDAVP. In a recent 24-hour, 6 degree head-down tilt (HDT) study involving seven men, dDAVP was found to improve orthostatic tolerance as assessed by both lower body negative pressure (LBNP) and stand tests. The treatment restored Luft's cumulative stress index (cumulative product of magnitude and duration of LBNP) to nearly pre-bedrest level. The heart rate was lower and stroke volume was marginally higher at the same LBNP levels with administration of dDAVP compared to placebo. Lower heart rates were also observed with dDAVP during stand test, despite the lower level of cardiovascular stress. These improvements were seen with only a small but significant increase in plasma volume of approximately 3 percent. This paper presents a computer simulation analysis of some of the results of this HDT study.

  16. CloudMC: a cloud computing application for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-01-01

    This work presents CloudMC, a cloud computing application—developed in Windows Azure®, the platform of the Microsoft® cloud—for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code in which the simulations are based—the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different number of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37 ×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)

  17. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application-developed in Windows Azure®, the platform of the Microsoft® cloud-for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code in which the simulations are based-the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with penelope were performed on different instance (virtual machine) sizes, and for different number of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (speedup of 37 ×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay per usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.

  18. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of nuclear simulator software is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Through the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, owing to both the remarkable improvement in real-time performance and the reduced effort of real-time implementation under standard PC hardware and Real-Time Linux environments.

  19. Formal Analysis of Dynamics Within Philosophy of Mind by Computer Simulation

    NARCIS (Netherlands)

    Bosse, T.; Schut, M.C.; Treur, J.

    2009-01-01

    Computer simulations can be useful tools to support philosophers in validating their theories, especially when these theories concern phenomena showing nontrivial dynamics. Such theories are usually informal, whilst for computer simulation a formally described model is needed. In this paper, a

  20. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulation for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  1. Computer simulation studies in condensed-matter physics 5. Proceedings

    International Nuclear Information System (INIS)

    Landau, D.P.; Mon, K.K.; Schuettler, H.B.

    1993-01-01

    As the role of computer simulations began to increase in importance, we sensed a need for a ''meeting place'' for both experienced simulators and neophytes to discuss new techniques and results in an environment which promotes extended discussion. As a consequence of these concerns, The Center for Simulational Physics established an annual workshop on Recent Developments in Computer Simulation Studies in Condensed-Matter Physics. This year's workshop was the fifth in this series, and the interest which the scientific community has shown demonstrates quite clearly the useful purpose the series has served. The workshop was held at the University of Georgia, February 17-21, 1992, and these proceedings form a record of the workshop, published with the goal of timely dissemination of the papers to a wider audience. The proceedings are divided into four parts. The first part contains invited papers which deal with simulational studies of classical systems and includes an introduction to some new simulation techniques and special purpose computers as well. A separate section of the proceedings is devoted to invited papers on quantum systems, including new results for strongly correlated electron and quantum spin models. The third section is comprised of a single invited description of a newly developed software shell designed for running parallel programs. The contributed presentations comprise the final chapter. (orig.). 79 figs

  2. A compositional reservoir simulator on distributed memory parallel computers

    International Nuclear Information System (INIS)

    Rame, M.; Delshad, M.

    1995-01-01

    This paper presents the application of distributed memory parallel computers to field scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field scale applications such as tracer flood and polymer flood. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented
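
    The halo-exchange pattern implied by "subdomains extended so that data can be shared between adjacent processors for stencil computation" can be sketched in a few lines of mpi4py; this is a minimal 1-D decomposition under assumed sizes, not UTCHEM's actual code:

```python
# Run with e.g.: mpiexec -n 4 python halo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local = np.full(10 + 2, float(rank))   # 10 interior cells plus 2 ghost cells
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary cells with neighbours so the stencil can be applied.
# (With PROC_NULL at the global ends, those ghost cells keep their values.)
comm.Sendrecv(sendbuf=local[1:2], dest=left,
              recvbuf=local[-1:], source=right)
comm.Sendrecv(sendbuf=local[-2:-1], dest=right,
              recvbuf=local[0:1], source=left)

# One Jacobi-style stencil update on the interior cells
local[1:-1] = 0.5 * (local[:-2] + local[2:])
```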

  3. Correlation of radiation dose and heart rate in dual-source computed tomography coronary angiography

    International Nuclear Information System (INIS)

    Laspas, Fotios; Roussakis, Arkadios; Kritikos, Nikolaos; Efthimiadou, Roxani; Kehagias, Dimitrios; Andreou, John; Tsantioti, Dimitra

    2011-01-01

    Background: Computed tomography coronary angiography (CTCA) has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but the relatively high radiation dose remains a major concern. Purpose: To evaluate the relationship between radiation exposure and heart rate (HR) in dual-source CTCA. Material and Methods: Data from 218 CTCA examinations, performed with a dual-source 64-slice scanner, were statistically evaluated. Effective radiation dose, expressed in mSv, was calculated as the product of the dose-length product (DLP) times a conversion coefficient for the chest (mSv = DLP × 0.017). The heart rate range and mean heart rate of each individual during CTCA, expressed in beats per minute (bpm), were also provided by the system. Statistical analysis of effective dose and heart rate data was performed using the Pearson correlation coefficient and a two-sample t-test. Results: Mean HR and effective dose were found to have a borderline positive relationship. Individuals with a mean HR >65 bpm were observed to receive a statistically significantly higher effective dose than those with a mean HR ≤65 bpm. Moreover, a strong correlation between effective dose and a variability of HR of more than 20 bpm was observed. Conclusion: Dual-source CT scanners are considered to have the capability to provide diagnostic examinations even with high HR and arrhythmias. However, it is desirable to keep the mean heart rate below 65 bpm and heart rate fluctuation less than 20 bpm in order to reduce the radiation exposure
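
    For orientation, the quoted conversion works out as in this small sketch; the DLP value is hypothetical, chosen only to show the arithmetic:

```python
# Effective dose from the CT dose-length product (DLP), using the chest
# conversion coefficient quoted in the record (0.017 mSv per mGy*cm).
CHEST_K = 0.017  # mSv / (mGy*cm)

def effective_dose_msv(dlp_mgy_cm: float, k: float = CHEST_K) -> float:
    """Effective dose in mSv as DLP times the region-specific coefficient."""
    return dlp_mgy_cm * k

# Hypothetical example: a CTCA with DLP = 800 mGy*cm
print(effective_dose_msv(800.0))  # -> 13.6 mSv
```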

  4. Computer simulation of ultrasonic waves in solids

    International Nuclear Information System (INIS)

    Thibault, G.A.; Chaplin, K.

    1992-01-01

    A computer model that simulates the propagation of ultrasonic waves has been developed at AECL Research, Chalk River Laboratories. This program is called EWE, short for Elastic Wave Equations, the mathematics governing the propagation of ultrasonic waves. This report contains a brief summary of the use of ultrasonic waves in non-destructive testing techniques, a discussion of the EWE simulation code explaining the implementation of the equations and the types of output received from the model, and an example simulation showing the abilities of the model. (author). 2 refs., 2 figs

  5. Radiation dose management for pediatric cardiac computed tomography: a report from the Image Gently 'Have-A-Heart' campaign.

    Science.gov (United States)

    Rigsby, Cynthia K; McKenney, Sarah E; Hill, Kevin D; Chelliah, Anjali; Einstein, Andrew J; Han, B Kelly; Robinson, Joshua D; Sammet, Christina L; Slesnick, Timothy C; Frush, Donald P

    2018-01-01

    Children with congenital or acquired heart disease can be exposed to relatively high lifetime cumulative doses of ionizing radiation from necessary medical imaging procedures including radiography, fluoroscopic procedures including diagnostic and interventional cardiac catheterizations, electrophysiology examinations, cardiac computed tomography (CT) studies, and nuclear cardiology examinations. Despite the clinical necessity of these imaging studies, the related ionizing radiation exposure could pose an increased lifetime attributable cancer risk. The Image Gently "Have-A-Heart" campaign is promoting the appropriate use of medical imaging studies in children with congenital or acquired heart disease while minimizing radiation exposure. The focus of this manuscript is to provide a comprehensive review of radiation dose management and CT performance in children with congenital or acquired heart disease.

  6. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    International Nuclear Information System (INIS)

    Foster, C.

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked out early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

  7. Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations

    Science.gov (United States)

    Eskandari Nasrabad, A.; Laghaei, R.

    2018-04-01

    Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
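
    For orientation, the classic Cohen-Turnbull free-volume relation has the textbook form below; the paper's modified relation may differ in detail:

```latex
% Cohen-Turnbull free-volume form (textbook version, for orientation only):
\[
  D \;=\; A\,T^{1/2}\exp\!\left(-\frac{\gamma\,v^{*}}{v_{f}}\right)
\]
% v_f: mean free volume per molecule, v^*: critical volume for a diffusive
% jump, \gamma: overlap factor, A: prefactor fixed by the system.
```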

  8. Computer simulation for sodium-concrete reactions

    International Nuclear Information System (INIS)

    Zhang Bin; Zhu Jizhou

    2006-01-01

    In liquid metal cooled fast breeder reactors (LMFBRs), direct contact between sodium and concrete is unavoidable. Due to sodium's high chemical reactivity, sodium reacts violently with concrete, releasing large amounts of hydrogen gas and heat, which threatens the integrity of the containment. This paper developed a program to simulate sodium-concrete reactions comprehensively. It can give the reaction zone temperature, pool temperature, penetration depth, penetration rate, hydrogen flux, reaction heat, and so on. Concrete was considered to be composed of silica and water only in this paper. A variable, the quotient of sodium hydroxide, was introduced into the continuity equation to simulate the chemical reactions more realistically. The product of the net gas flux and the boundary depth was suitably transformed into that of the penetration rate and the boundary depth. The complex chemical kinetics equations were simplified under some hypotheses. All the techniques applied above simplified the computer simulation considerably; in other words, they made the computer simulation feasible. The theoretical models applied in the program and the calculation procedure are described in detail. Good agreement of the overall transient behavior was obtained in the analysis of a series of sodium-concrete reaction experiments. The comparison between the analytical and experimental results showed that the program presented in this paper is credible and reasonable for simulating sodium-concrete reactions. This program can be used for nuclear safety judgement. (authors)

  9. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  10. Computer Graphics Simulations of Sampling Distributions.

    Science.gov (United States)

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
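
    A minimal sketch of the kind of simulation such teaching tools perform, here for the distribution of sample proportions; the parameters are arbitrary:

```python
# Approximate the sampling distribution of a sample proportion by
# repeatedly drawing samples of size n from a Bernoulli(p) population.
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 50, 10_000          # true proportion, sample size, repetitions

# Each row is one simulated sample of n Bernoulli(p) observations
samples = rng.random((trials, n)) < p
phat = samples.mean(axis=1)             # one sample proportion per trial

print(phat.mean(), phat.std())          # ~p and ~sqrt(p*(1-p)/n) ≈ 0.065
```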

  11. Computer simulation of nonequilibrium processes

    International Nuclear Information System (INIS)

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then, how are these concepts to be realize in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, an molecular dynamics. Aplication to the shock process will be discussed

  12. Building an adiabatic quantum computer simulation in the classroom

    Science.gov (United States)

    Rodríguez-Laguna, Javier; Santalla, Silvia N.

    2018-05-01

    We present a didactic introduction to adiabatic quantum computation (AQC) via the explicit construction of a classical simulator of quantum computers. This constitutes a suitable route to introduce several important concepts for advanced undergraduates in physics: quantum many-body systems, quantum phase transitions, disordered systems, spin-glasses, and computational complexity theory.
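
    A toy version of such a classroom simulator can be written in a few lines: track the spectral gap of the interpolating Hamiltonian H(s) = (1-s)H0 + sH1 during the anneal. The driver and problem Hamiltonians below are illustrative choices, not those of the paper:

```python
# Spectral gap of a 3-qubit adiabatic interpolation, by exact diagonalization.
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

def embed(op1, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit system."""
    mats = [op1 if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

n = 3
H0 = -sum(embed(sx, k, n) for k in range(n))    # driver: transverse field
H1 = np.diag(np.linspace(0.0, 1.0, 2 ** n))     # toy "problem" Hamiltonian

for s in np.linspace(0.0, 1.0, 6):
    H = (1 - s) * H0 + s * H1
    e = np.linalg.eigvalsh(H)
    print(f"s={s:.1f}  gap={e[1] - e[0]:.4f}")   # gap shrinks mid-anneal
```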

  13. Quantum computer gate simulations | Dada | Journal of the Nigerian ...

    African Journals Online (AJOL)

    A new interactive simulator for Quantum Computation has been developed for simulation of the universal set of quantum gates and for construction of new gates of up to 3 qubits. The simulator also automatically generates an equivalent quantum circuit for any arbitrary unitary transformation on a qubit. Available quantum ...
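
    State-vector simulation of gates, the core of any such simulator, reduces to matrix-vector products; a minimal sketch, not the simulator's actual code:

```python
# Prepare a Bell state on two qubits by applying H to qubit 0, then CNOT.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)       # control = first qubit

psi = np.zeros(4); psi[0] = 1.0                    # |00>
psi = CNOT @ (np.kron(H, np.eye(2)) @ psi)         # (|00> + |11>) / sqrt(2)
print(psi)
```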

  14. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and the computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from the intensive computational requirements of detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel-computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance

  15. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
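
    Froude scaling, mentioned in the last sentence, fixes how model-scale quantities map to full scale; a small sketch, in which the 1:25 scale and wave period are hypothetical:

```python
# Froude scaling: with geometric scale ratio L (full/model), velocities and
# times scale by sqrt(L) and forces by L**3 (for equal fluid density).
import math

def froude_scale(L):
    return {"velocity": math.sqrt(L), "time": math.sqrt(L), "force": L ** 3}

# Hypothetical 1:25 model: a 1.2 s model wave period maps to 6.0 s full scale
print(1.2 * froude_scale(25)["time"])
```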

  16. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

    Full Text Available Abstract Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods
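
    A minimal excitable-media cellular automaton, a Greenberg-Hastings-style toy far simpler than the quantitative CA described above, illustrates the state-transition idea:

```python
# Rest -> excited (if a neighbour fires) -> refractory -> rest, on a torus.
import numpy as np

REST, EXCITED, REFRACTORY = 0, 1, 2
rng = np.random.default_rng(1)
grid = np.where(rng.random((64, 64)) < 0.02, EXCITED, REST)

def step(g):
    excited = (g == EXCITED).astype(int)
    # count excited neighbours (4-neighbourhood, toroidal wrap)
    nbrs = (np.roll(excited, 1, 0) + np.roll(excited, -1, 0) +
            np.roll(excited, 1, 1) + np.roll(excited, -1, 1))
    new = np.full_like(g, REST)                # refractory cells recover
    new[(g == REST) & (nbrs > 0)] = EXCITED    # resting cells fire
    new[g == EXCITED] = REFRACTORY             # fired cells become refractory
    return new

for _ in range(10):
    grid = step(grid)
print((grid == EXCITED).sum(), "cells excited after 10 steps")
```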

  17. Cluster computing for lattice QCD simulations

    International Nuclear Information System (INIS)

    Coddington, P.D.; Williams, A.G.

    2000-01-01

    Full text: Simulations of lattice quantum chromodynamics (QCD) require enormous amounts of compute power. In the past, this has usually involved sharing time on large, expensive machines at supercomputing centres. Over the past few years, clusters of networked computers have become very popular as a low-cost alternative to traditional supercomputers. The dramatic improvements in performance (and more importantly, the ratio of price/performance) of commodity PCs, workstations, and networks have made clusters of off-the-shelf computers an attractive option for low-cost, high-performance computing. A major advantage of clusters is that since they can have any number of processors, they can be purchased using any sized budget, allowing research groups to install a cluster for their own dedicated use, and to scale up to more processors if additional funds become available. Clusters are now being built for high-energy physics simulations. Wuppertal has recently installed ALiCE, a cluster of 128 Alpha workstations running Linux, with a peak performance of 158 G flops. The Jefferson Laboratory in the US has a 16 node Alpha cluster and plans to upgrade to a 256 processor machine. In Australia, several large clusters have recently been installed. Swinburne University of Technology has a cluster of 64 Compaq Alpha workstations used for astrophysics simulations. Early this year our DHPC group constructed a cluster of 116 dual Pentium PCs (i.e. 232 processors) connected by a Fast Ethernet network, which is used by chemists at Adelaide University and Flinders University to run computational chemistry codes. The Australian National University has recently installed a similar PC cluster with 192 processors. The Centre for the Subatomic Structure of Matter (CSSM) undertakes large-scale high-energy physics calculations, mainly lattice QCD simulations. The choice of the computer and network hardware for a cluster depends on the particular applications to be run on the machine. Our

  18. Computer Networks E-learning Based on Interactive Simulations and SCORM

    Directory of Open Access Journals (Sweden)

    Francisco Andrés Candelas

    2011-05-01

    Full Text Available This paper introduces a new set of compact interactive simulations developed for the constructive learning of computer networks concepts. These simulations, which compose a virtual laboratory implemented as portable Java applets, have been created by combining EJS (Easy Java Simulations) with the KivaNS API. Furthermore, in this work, the skills and motivation level acquired by the students are evaluated and measured when these simulations are combined with Moodle and SCORM (Sharable Content Object Reference Model) documents. This study has been developed to improve and stimulate autonomous constructive learning, in addition to providing timetable flexibility for a Computer Networks subject.

  19. Utility of computed axial tomography angiography in anatomic evaluation of pediatric patients with congenital heart diseases

    International Nuclear Information System (INIS)

    Mosquera, Walter A; Reyes P, Rafael; Aguilera N, Favio M; Breton, Cesar A; Buitrago, Danuby A; Suarez J Ramiro; Castillo, Victor

    2007-01-01

    Although echocardiography and cardiac catheterization are used as first-option tools for the diagnosis of congenital heart diseases, computed tomography angiography is a minimally invasive exam that, through two- and three-dimensional images in real time, gives an adequate approach for patients with this type of pathology who require a rapid and precise evaluation of the extracardiac anatomy. Objective: describe the institutional experience from August 2005 to August 2006 in the use of angiography by tomography as a complementary diagnostic method in the evaluation of pediatric patients with congenital heart diseases. Method: serial descriptive study. 58 pediatric patients with clinical and echocardiographic diagnosis of congenital heart diseases were evaluated with a General Electric Multislice LightSpeed/16 scanner. Results: 58 patients with a history of congenital heart disease were evaluated through CT angiography. Mean age was 2.4 ± 4.03 years. Twenty (33.8%) had a diagnosis of pulmonary atresia, four (6.7%) had tricuspid atresia, eight (13.5%) had double-outlet right ventricle, seven (11.8%) had tetralogy of Fallot, nine (15.2%) had alterations of the aortic arch, seven (11.8%) had coarctation of the aorta, two (3.3%) had interrupted aortic arch, six (10.3%) had persistent ductus arteriosus, four (6.7%) had anomalous venous drainage and three (5.1%) had transposition of the great arteries. High quality images that allowed assessment of the precise vascular anatomy were obtained. Conclusions: computed tomography angiography turned out to be a useful tool in the diagnostic approach to congenital heart diseases, because it allows a three-dimensional anatomic reconstruction. New studies are required to assess the sensitivity, specificity and level of concordance of this technique with the other, invasive diagnostic methods available for the diagnosis of this type of disease

  20. Computer Simulation of Angle-measuring System of Photoelectric Theodolite

    International Nuclear Information System (INIS)

    Zeng, L; Zhao, Z W; Song, S L; Wang, L T

    2006-01-01

    In this paper, a virtual test platform based on malfunction phenomena is designed using the methods of computer simulation and numerical mask. It is used in simulation training for the angle-measuring system of a photoelectric theodolite. Practical application shows that this platform provides good conditions for in-depth simulation training by technicians and presents a useful approach for establishing simulation platforms for other large equipment

  1. Reconstruction of electrocardiogram using ionic current models for heart muscles.

    Science.gov (United States)

    Yamanaka, A; Okazaki, K; Urushibara, S; Kawato, M; Suzuki, R

    1986-11-01

    A digital computer model is presented for the simulation of the electrocardiogram during ventricular activation and repolarization (QRS-T waves). Part of the ventricular septum and the left ventricular free wall of the heart are represented by a two-dimensional array of 730 homogeneous functional units. Ionic current models are used to determine the spatial distribution of the electrical activities of these units at each instant of time during the simulated cardiac cycle. In order to reconstruct the electrocardiogram, the model is expanded three-dimensionally, with an equipotential assumption along the third axis, and then the surface potentials are calculated using the solid angle method. Our digital computer model can be used to improve the understanding of the relationship between body surface potentials and intracellular electrical events.
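
    For reference, the solid angle method evaluates the surface potential from the solid angle subtended by the activation wavefront; a standard textbook form, not quoted from the paper, is:

```latex
% Solid-angle expression: the potential at field point P due to a uniform
% double layer of strength \Phi on wavefront S is proportional to the
% solid angle \Omega that S subtends at P.
\[
  V(P) \;=\; \frac{\Phi}{4\pi}\,\Omega_{S}(P),
  \qquad
  \Omega_{S}(P) \;=\; \int_{S} \frac{\hat{\mathbf{r}}\cdot\mathrm{d}\mathbf{S}}{r^{2}}
\]
```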

  2. GPU Accelerated Surgical Simulators for Complex Morhpology

    DEFF Research Database (Denmark)

    Mosegaard, Jesper; Sørensen, Thomas Sangild

    2005-01-01

    a spring-mass system in order to simulate a complex organ such as the heart. Computations are accelerated by taking advantage of modern graphics processing units (GPUs). Two GPU implementations are presented. They vary in their generality of spring connections and in the speedup factor they achieve...
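
    The underlying update such spring-mass simulators parallelize is simple; a minimal CPU sketch for a 1-D chain, illustrative only and not the paper's GPU kernels:

```python
# Semi-implicit Euler integration of a 1-D chain of Hookean springs.
import numpy as np

n = 8
x = np.linspace(0.0, 1.0, n)          # particle positions along the chain
v = np.zeros_like(x)
k, rest, mass, dt = 50.0, 1.0 / (n - 1), 1.0, 1e-3

x[-1] += 0.2                          # stretch the last spring

for _ in range(1000):
    d = x[1:] - x[:-1]                # spring lengths
    f = k * (d - rest)                # Hooke force (positive = stretched)
    force = np.zeros_like(x)
    force[:-1] += f                   # stretched spring pulls left end right
    force[1:] -= f                    # and right end left
    v += dt * force / mass            # update velocities first,
    x += dt * v                       # then positions (semi-implicit Euler)

print(x)
```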

  3. What do we want from computer simulation of SIMS using clusters?

    International Nuclear Information System (INIS)

    Webb, R.P.

    2008-01-01

    Computer simulation of energetic cluster interactions with surfaces has provided much needed insight into some of the complex processes which occur and are responsible for the desirable as well as undesirable effects which make the use of clusters in SIMS both useful and challenging. Simulations have shown how cluster impacts can cause meso-scale motion of the target material which can result in the relatively gentle up-lift of large intact molecules adsorbed on the surface, in contrast to the behaviour of single atom impacts, which tend to create discrete motion in the surface, often ejecting fragments of adsorbed molecules instead. With the insight provided by simulations, experimentalists can then improve their equipment to best maximise the desired effects. The past 40 years have seen great progress in simulation techniques and computer equipment. Forty years ago, simulations were performed on simple atomic systems of around 300 atoms, employing only simple pair-wise interaction potentials, for times of several hundred femtoseconds. Currently, simulations can be performed on large organic materials employing many-body potentials for millions of atoms for times of many picoseconds. These simulations, however, can take several months of computation time. Even with the degree of realism introduced by these long-time simulations, they are still not perfect and are often not capable of being used in a completely predictive way. Computer simulation is reaching a position whereby any further effort to increase its realism will make it completely intractable to solve in a reasonable time frame, and yet there is an increasing demand from experimentalists for something that can help in a predictive way in experiment design and interpretation. This paper will discuss the problems of computer simulation, what might be possible to achieve in the short term, what is unlikely ever to be possible without a major new breakthrough, and how we might exploit the meso-scale effects in

  4. Validation and computing and performance studies for the ATLAS simulation

    CERN Document Server

    Marshall, Z; The ATLAS collaboration

    2009-01-01

    We present the validation of the ATLAS simulation software project. Software development is controlled by nightly builds and several levels of automatic tests to ensure stability. Computing validation, including CPU time, memory, and disk space required per event, is benchmarked for all software releases. Several different physics processes and event types are checked to thoroughly test all aspects of the detector simulation. The robustness of the simulation software is demonstrated by the production of 500 million events on the Worldwide LHC Computing Grid in the last year.

  5. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    Science.gov (United States)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used for testing the efficiency of selected strategies for allocating the computational resources of the cluster using a greater number of computational cores. Simulation results indicate that if the number of cores used is not equal to a multiple of the total number of cluster node cores, there are allocation strategies which provide more efficient calculations.

  6. Computational simulator of robotic manipulators

    International Nuclear Information System (INIS)

    Leal, Alexandre S.; Campos, Tarcisio P.R.

    1995-01-01

    Robotic applications for industrial plants are discussed and a computational model for a mechanical manipulator with three links is presented. A feed-forward neural network has been used to model the dynamic control of the manipulator. A graphic interface was developed in the C programming language as a virtual world in order to visualize and simulate the arm movements when handling radioactive waste in the environment. (author). 7 refs, 5 figs

  7. Macromod: Computer Simulation For Introductory Economics

    Science.gov (United States)

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  8. An Investigation of Computer-based Simulations for School Crises Management.

    Science.gov (United States)

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  9. The use of micro-computers in the simulation of ion beam optics

    International Nuclear Information System (INIS)

    Spaedtke, P.; Ivens, D.

    1989-01-01

    With computer simulation codes, specific problems of ion beam optics can be studied, which is useful both in design and in the optimization of existing systems. Several such codes have been developed, unfortunately requiring substantial computer resources. Recent advances in mini- and micro-computers have now made it possible to develop simulation codes which can also be run on these small computers. In this paper, some of these codes will be presented and their computing time discussed. (author)

  10. Computer simulation of driven Alfven waves

    International Nuclear Information System (INIS)

    Geary, J.L. Jr.

    1986-01-01

    The first particle simulation study of shear Alfven wave resonance heating is presented. Particle simulation codes self-consistently follow the time evolution of the individual and collective aspects of particle dynamics, as well as wave dynamics, in a fully nonlinear fashion. Alfven wave heating is a possible means of increasing the temperature of magnetized plasmas. A new particle simulation model was developed for this application that incorporates Darwin's formulation of the electromagnetic fields with a guiding center approximation for electron motion perpendicular to the ambient magnetic field. The implementation of this model and the examination of its theoretical and computational properties are presented. With this model, several cases of Alfven wave heating are examined in both uniform and nonuniform simulation systems in a two-dimensional slab. In the inhomogeneous case studies, the kinetic Alfven wave develops in the vicinity of the shear Alfven resonance region

  11. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operating procedures on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyses the operational error rate of operators by means of techniques for human error rate prediction. Problems with the placement of man-machine interfaces in a control room and with the arrangement of instruments can be detected from simulation results. The DIAS system can provide good technical support to the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  12. Computational methods for coupling microstructural and micromechanical materials response simulations

    Energy Technology Data Exchange (ETDEWEB)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  13. Positive Wigner functions render classical simulation of quantum computation efficient.

    Science.gov (United States)

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  14. Assessing Practical Skills in Physics Using Computer Simulations

    Science.gov (United States)

    Walsh, Kevin

    2018-01-01

    Computer simulations have been used very effectively for many years in the teaching of science but the focus has been on cognitive development. This study, however, is an investigation into the possibility that a student's experimental skills in the real-world environment can be judged via the undertaking of a suitably chosen computer simulation…

  15. Computer simulation of stair falls to investigate scenarios in child abuse.

    Science.gov (United States)

    Bertocci, G E; Pierce, M C; Deemer, E; Aguel, F

    2001-09-01

    To demonstrate the usefulness of computer simulation techniques in the investigation of pediatric stair falls. Since stair falls are a common falsely reported injury scenario in child abuse, our specific aim was to investigate the influence of stair characteristics on injury biomechanics of pediatric stair falls by using a computer simulation model. Our long-term goal is to use knowledge of biomechanics to aid in distinguishing between accidents and abuse. A computer simulation model of a 3-year-old child falling down stairs was developed using commercially available simulation software. This model was used to investigate the influence that stair characteristics have on biomechanical measures associated with injury risk. Since femur fractures occur in unintentional and abuse scenarios, biomechanical measures were focused on the lower extremities. The number and slope of steps and stair surface friction and elasticity were found to affect biomechanical measures associated with injury risk. Computer simulation techniques are useful for investigating the biomechanics of stair falls. Using our simulation model, we determined that stair characteristics have an effect on potential for lower extremity injuries. Although absolute values of biomechanical measures should not be relied on in an unvalidated model such as this, relationships between accident-environment factors and biomechanical measures can be studied through simulation. Future efforts will focus on model validation.

  16. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    Science.gov (United States)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.

  17. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex systems. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  18. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  19. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  20. Computer Simulation Surgery for Mandibular Reconstruction Using a Fibular Osteotomy Guide

    Directory of Open Access Journals (Sweden)

    Woo Shik Jeong

    2014-09-01

    Full Text Available In the present study, a fibular osteotomy guide based on a computer simulation was applied to a patient who had undergone mandibular segmental ostectomy due to oncological complications. This patient was a 68-year-old woman who presented to our department with a biopsy-proven squamous cell carcinoma in her left gingival area. This lesion had destroyed the cortical bony structure, and the patient showed attenuation of her soft tissue along the inferior alveolar nerve, indicating perineural spread of the tumor. Prior to surgery, a three-dimensional computed tomography scan of the facial and fibular bones was performed. We then created a virtual computer simulation of the mandibular segmental defect, through which we segmented the fibula to reconstruct the proper angulation of the original mandible. Approximately 2-cm segments were created on the basis of this simulation and applied to the virtually simulated mandibular segmental defect. Thus, we obtained a virtual model of the ideal mandibular reconstruction for this patient with a fibular free flap. We could then use this computer simulation for the subsequent surgery and minimize the bony gaps between the multiple fibular bony segments.

  1. SNOW: a digital computer program for the simulation of ion beam devices

    International Nuclear Information System (INIS)

    Boers, J.E.

    1980-08-01

    A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are simulated on a large rectangular matrix array which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories combined with background electron and/or ion distributions. The simulation methods are described in some detail along with examples of both axially-symmetric and rectangular beams. A detailed description of the input data is presented
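
    The iterative solution of Poisson's equation on a rectangular mesh that SNOW's description refers to can be sketched with a Jacobi relaxation; this is illustrative, not the program's actual solver or units:

```python
# Jacobi relaxation for Poisson's equation on a rectangular grid with a
# point space charge (arbitrary units, permittivity absorbed into rho).
import numpy as np

nx, ny, h = 64, 64, 1.0 / 63
phi = np.zeros((nx, ny))            # potential, fixed to 0 on the boundary
rho = np.zeros((nx, ny))
rho[nx // 2, ny // 2] = 1.0         # a point space charge

for _ in range(2000):               # fixed iteration count for brevity
    # NumPy evaluates the right-hand side before assigning, so this is
    # a full Jacobi sweep, not Gauss-Seidel.
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2] +
                              h * h * rho[1:-1, 1:-1])

print(phi.max())
```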

  2. Large Atmospheric Computation on the Earth Simulator: The LACES Project

    Directory of Open Access Journals (Sweden)

    Michel Desgagné

    2006-01-01

    Full Text Available The Large Atmospheric Computation on the Earth Simulator (LACES project is a joint initiative between Canadian and Japanese meteorological services and academic institutions that focuses on the high resolution simulation of Hurricane Earl (1998. The unique aspect of this effort is the extent of the computational domain, which covers all of North America and Europe with a grid spacing of 1 km. The Canadian Mesoscale Compressible Community (MC2 model is shown to parallelize effectively on the Japanese Earth Simulator (ES supercomputer; however, even using the extensive computing resources of the ES Center (ESC, the full simulation for the majority of Hurricane Earl's lifecycle takes over eight days to perform and produces over 5.2 TB of raw data. Preliminary diagnostics show that the results of the LACES simulation for the tropical stage of Hurricane Earl's lifecycle compare well with available observations for the storm. Further studies involving advanced diagnostics have commenced, taking advantage of the uniquely large spatial extent of the high resolution LACES simulation to investigate multiscale interactions in the hurricane and its environment. It is hoped that these studies will enhance our understanding of processes occurring within the hurricane and between the hurricane and its planetary-scale environment.

  3. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  4. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  5. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Directory of Open Access Journals (Sweden)

    Jakob Jordan

    2018-02-01

    Full Text Available State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  6. Computer simulation of fatigue under diametrical compression

    OpenAIRE

    Carmona, H. A.; Kun, F.; Andrade Jr., J. S.; Herrmann, H. J.

    2006-01-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process at the macro- and micro-levels, varying the relative influence of the mechanisms of damage accumulation over the ...

  7. Computer simulations and the changing face of scientific experimentation

    CERN Document Server

    Duran, Juan M

    2013-01-01

    Computer simulations have become a central tool for scientific practice. Their use has replaced, in many cases, standard experimental procedures. This is not to mention cases where the target system is empirical but there are no techniques for direct manipulation of the system, as in astronomical observation. In such cases, computer simulations have proved to be of central importance. The question about their use and implementation, therefore, is not only a technical one but represents a challenge for the humanities as well. In this volume, scientists, historians, and philosophers joi

  8. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  9. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  10. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  11. simulate_CAT: A Computer Program for Post-Hoc Simulation for Computerized Adaptive Testing

    Directory of Open Access Journals (Sweden)

    İlker Kalender

    2015-06-01

Full Text Available This paper presents computer software developed by the author. The software conducts post-hoc simulations for computerized adaptive testing based on real responses of examinees to paper-and-pencil tests, under different parameters that can be defined by the user. The paper first gives brief information about post-hoc simulations, then describes the working principle of the software and presents a sample simulation with the required input files, and finally describes the output files.
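
For orientation, a post-hoc simulation re-administers a full, real response vector adaptively. A minimal sketch under a Rasch model with maximum-information item selection (an illustration only; the program's actual algorithms, options, and file formats are not shown here):

    import numpy as np

    rng = np.random.default_rng(1)
    n_items = 40
    b = rng.normal(0, 1, n_items)          # Rasch item difficulties (hypothetical)
    true_theta = 0.5                       # stands in for a real examinee's record
    responses = (rng.random(n_items) < 1 / (1 + np.exp(-(true_theta - b)))).astype(int)

    grid = np.linspace(-4, 4, 161)         # ability grid for the posterior
    administered, post = [], np.ones_like(grid)
    for _ in range(10):                    # fixed-length CAT of 10 items
        theta = grid[np.argmax(post)]      # current ability estimate (MAP)
        p = 1 / (1 + np.exp(-(theta - b)))
        info = p * (1 - p)                 # Fisher information at theta
        info[administered] = -1            # never reuse an item
        j = int(np.argmax(info))           # maximum-information selection
        administered.append(j)
        pj = 1 / (1 + np.exp(-(grid - b[j])))
        post *= pj if responses[j] else (1 - pj)   # update from the real response
    print(administered, grid[np.argmax(post)])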

  12. The challenge of quantum computer simulations of physical phenomena

    International Nuclear Information System (INIS)

    Ortiz, G.; Knill, E.; Gubernatis, J.E.

    2002-01-01

    The goal of physics simulation using controllable quantum systems ('physics imitation') is to exploit quantum laws to advantage, and thus accomplish efficient simulation of physical phenomena. In this Note, we discuss the fundamental concepts behind this paradigm of information processing, such as the connection between models of computation and physical systems. The experimental simulation of a toy quantum many-body problem is described

  13. High performance stream computing for particle beam transport simulations

    International Nuclear Information System (INIS)

    Appleby, R; Bailey, D; Higham, J; Salt, M

    2008-01-01

    Understanding modern particle accelerators requires simulating charged particle transport through the machine elements. These simulations can be very time consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed
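
The underlying computation is naturally data-parallel: every particle is propagated by the same element transfer maps. A linear-optics sketch with an assumed toy lattice (not the DIAMOND transfer line, and NumPy vectorization standing in for the GPU):

    import numpy as np

    def drift(L):                     # 2x2 transfer matrix in one transverse plane
        return np.array([[1.0, L], [0.0, 1.0]])

    def thin_quad(kl):                # thin-lens quadrupole of integrated strength kl
        return np.array([[1.0, 0.0], [-kl, 1.0]])

    # assumed toy beamline: drift, focusing quad, drift, defocusing quad
    line = thin_quad(-0.16) @ drift(3.0) @ thin_quad(0.16) @ drift(3.0)

    rng = np.random.default_rng(2)
    particles = rng.normal(0, 1e-3, size=(2, 1_000_000))   # (x, x') pairs
    out = line @ particles            # one product transports all particles at once
    print(out.std(axis=1))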

  14. Computer simulation of variform fuel assemblies using Dragon code

    International Nuclear Information System (INIS)

    Ju Haitao; Wu Hongchun; Yao Dong

    2005-01-01

DRAGON is a cell code developed for the CANDU reactor by the Ecole Polytechnique de Montreal, Canada. Although DRAGON is mainly used to simulate the CANDU super-cell fuel assembly, it is able to simulate other fuel assembly geometries as well. Until now, however, only the NEACRP benchmark problem of the BWR lattice cell had been analyzed apart from the CANDU reactor, and the code still needs to be developed to simulate variform fuel assemblies, especially for the design of advanced reactors. We validated that the cell code DRAGON is useful for simulating various kinds of fuel assembly by analyzing the rod-type PWR fuel assembly and the plate-type MTR fuel assembly. Some other geometries were computed as well. Computational results show that DRAGON is able to analyze variform fuel assembly problems with high precision. (authors)

  15. Teaching Computer Organization and Architecture Using Simulation and FPGA Applications

    OpenAIRE

    D. K.M. Al-Aubidy

    2007-01-01

This paper presents the design concepts and realization of a teaching tool for computer organization and architecture that incorporates micro-operation simulation and FPGA implementation. This teaching tool helps computer engineering and computer science students become practically familiar with computer organization and architecture through the development of their own instruction set, computer programming and interfacing experiments. A two-pass assembler has been designed and implemente...

  16. An introduction to computer simulation methods applications to physical systems

    CERN Document Server

    Gould, Harvey; Christian, Wolfgang

    2007-01-01

Now in its third edition, this book teaches physical concepts using computer simulations. The text incorporates object-oriented programming techniques and encourages readers to develop good programming habits in the context of doing physics. Designed for readers at all levels, An Introduction to Computer Simulation Methods uses Java, currently the most popular programming language. Introduction, Tools for Doing Simulations, Simulating Particle Motion, Oscillatory Systems, Few-Body Problems: The Motion of the Planets, The Chaotic Motion of Dynamical Systems, Random Processes, The Dynamics of Many Particle Systems, Normal Modes and Waves, Electrodynamics, Numerical and Monte Carlo Methods, Percolation, Fractals and Kinetic Growth Models, Complex Systems, Monte Carlo Simulations of Thermal Systems, Quantum Systems, Visualization and Rigid Body Dynamics, Seeing in Special and General Relativity, Epilogue: The Unity of Physics For all readers interested in developing programming habits in the context of doing phy...

  17. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

Full Text Available Computing speed is a significant issue for large-scale flood simulations aimed at real-time response for disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using the OpenACC application was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transport between the GPU and the CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and, thus, has bright application prospects for dynamic inundation risk identification and disaster assessment.
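
To indicate the numerical core that is parallelized, here is a serial, NumPy-vectorized sketch of a first-order Godunov-type finite volume update for the 1-D shallow water equations with a Rusanov flux (a simplification: the paper's model is 2-D, unstructured, and OpenACC-parallelized):

    import numpy as np

    g, N, dx = 9.81, 400, 2.5
    h = np.where(np.arange(N) < N // 2, 2.0, 1.0)   # dam-break initial depth
    hu = np.zeros(N)                                # initial discharge

    def flux(h, hu):
        u = hu / h
        return np.stack([hu, hu * u + 0.5 * g * h * h])

    for _ in range(200):
        U, F = np.stack([h, hu]), flux(h, hu)
        c = np.abs(hu / h) + np.sqrt(g * h)          # local wave speeds
        a = np.maximum(c[:-1], c[1:])                # interface wave speed bound
        Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
        dt = 0.4 * dx / c.max()                      # CFL-limited time step
        U[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
        h, hu = U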

  18. Refining Pragmatically-Appropriate Oral Communication via Computer-Simulated Conversations

    Science.gov (United States)

    Sydorenko, Tetyana; Daurio, Phoebe; Thorne, Steven L.

    2018-01-01

    To address the problem of limited opportunities for practicing second language speaking in interaction, especially delicate interactions requiring pragmatic competence, we describe computer simulations designed for the oral practice of extended pragmatic routines and report on the affordances of such simulations for learning pragmatically…

  19. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  20. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  1. Computer Simulation of the Relationship between Selected Properties of PVD Coatings

    Directory of Open Access Journals (Sweden)

    Śliwa A.

    2016-06-01

Full Text Available The paper presents the application of the Finite Element Method (FEM) to calculate the internal stresses which occur in Ti+TiN, Ti+Ti(CxN1-x) and Ti+TiC coatings obtained in the magnetron PVD process on sintered high-speed steel of the PM HS6-5-3-8 type. For the purpose of computer simulation of internal stresses in the coatings using FEM, a model of the analyzed specimens was worked out and then experimentally verified by comparison of the calculation results with the results of the computer simulation. A careful correlation analysis indicated a particularly strong dependence between internal stresses and microhardness, and between microhardness and erosion resistance, which made it possible to relate the internal stresses obtained from the computer simulation to erosion resistance as a basic measure of functional coating quality. This has essential practical significance because it allows the expected erosion resistance of a coating to be estimated solely from the results of computer simulation for the parameters used in the coating manufacturing process.

  2. Comparison of real and computer-simulated outcomes of LASIK refractive surgery

    Science.gov (United States)

    Cano, Daniel; Barbero, Sergio; Marcos, Susana

    2004-06-01

    Computer simulations of alternative LASIK ablation patterns were performed for corneal elevation maps of 13 real myopic corneas (range of myopia, -2.0 to -11.5 D). The computationally simulated ablation patterns were designed with biconic surfaces (standard Munnerlyn pattern, parabolic pattern, and biconic pattern) or with aberrometry measurements (customized pattern). Simulated results were compared with real postoperative outcomes. Standard LASIK refractive surgery for myopia increased corneal asphericity and spherical aberration. Computations with the theoretical Munnerlyn ablation pattern did not increase the corneal asphericity and spherical aberration. The theoretical parabolic pattern induced a slight increase of asphericity and spherical aberration, explaining only 40% of the clinically found increase. The theoretical biconic pattern controlled corneal spherical aberration. Computations showed that the theoretical customized pattern can correct high-order asymmetric aberrations. Simulations of changes in efficiency due to reflection and nonnormal incidence of the laser light showed a further increase in corneal asphericity. Consideration of these effects with a parabolic pattern accounts for 70% of the clinical increase in asphericity.
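
For scale, the central ablation depth of a standard myopic pattern is often quoted through the thin-lens simplification of the Munnerlyn formula, roughly depth ≈ S·OZ²/3 (micrometres, dioptres, millimetres). A sketch of that rule of thumb (not the study's biconic or customized patterns):

    def munnerlyn_depth_um(correction_diopters, optical_zone_mm):
        # approximate central ablation depth: t ~ |S| * OZ**2 / 3
        return abs(correction_diopters) * optical_zone_mm**2 / 3.0

    print(munnerlyn_depth_um(-5.0, 6.0))   # about 60 micrometres for -5 D over 6 mm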

  3. 4D blood flow mapping using SPIM-microPIV in the developing zebrafish heart

    Science.gov (United States)

    Zickus, Vytautas; Taylor, Jonathan M.

    2018-02-01

Fluid-structure interaction in the developing heart is an active area of research in developmental biology. However, investigation of heart dynamics is mostly limited to computational fluid dynamics simulations using heart wall structure information only, or single-plane blood flow information - so there is a need for 3D + time resolved data to fully understand cardiac function. We present an imaging platform combining selective plane illumination microscopy (SPIM) with micro particle image velocimetry (μPIV) to enable 3D-resolved flow mapping in a microscopic environment, free from many of the sources of error and bias present in traditional epifluorescence-based μPIV systems. By using our new system in conjunction with optical heart beat synchronization, we demonstrate the ability to obtain non-invasive 3D + time resolved blood flow measurements in the heart of a living zebrafish embryo.
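
The PIV step itself reduces to locating the cross-correlation peak between interrogation windows of successive frames. A minimal FFT-based sketch (an illustration; the published processing chain is considerably more involved):

    import numpy as np

    def piv_displacement(win_a, win_b):
        # circular cross-correlation via FFT; the peak position is the displacement
        a, b = win_a - win_a.mean(), win_b - win_b.mean()
        corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # wrap indices so negative displacements are recovered
        return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

    rng = np.random.default_rng(3)
    frame1 = rng.random((32, 32))                       # synthetic particle image
    frame2 = np.roll(frame1, (3, -2), axis=(0, 1))      # flow shifts it by (3, -2)
    print(piv_displacement(frame2, frame1))             # recovers [3, -2]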

  4. Use of computer simulations for the early introduction of nuclear engineering concepts

    International Nuclear Information System (INIS)

    Ougouag, A.M.; Zerguini, T.H.

    1985-01-01

A sophomore-level nuclear engineering (NE) course is being introduced at the University of Illinois. Via computer simulations, this course presents materials covering the most important aspects of the field. It is noted that computer simulations in nuclear engineering are cheaper and safer than experiments, yet they provide an effective teaching tool for the early introduction of advanced concepts. The new course material can be used as a tutorial and for remedial learning. The use of computer simulation motivates learning since students associate computer activities with games. Such a course can help in the dissemination of the proper information to students from different fields, including the liberal arts, and eventually increase undergraduate student enrollment in nuclear engineering

  5. Comprehensive Simulation Lifecycle Management for High Performance Computing Modeling and Simulation, Phase I

    Data.gov (United States)

National Aeronautics and Space Administration — There are significant logistical barriers to entry-level high performance computing (HPC) modeling and simulation (M&S). IllinoisRocstar sets up the infrastructure for...

  6. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing simulations of multi-scale structural analysis.

  7. Potential of Hybrid Computational Phantoms for Retrospective Heart Dosimetry After Breast Radiation Therapy: A Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    Moignier, Alexandra, E-mail: alexandra.moignier@irsn.fr [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France); Derreumaux, Sylvie; Broggio, David; Beurrier, Julien [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France); Chea, Michel; Boisserie, Gilbert [Groupe Hospitalier Pitie Salpetriere, Service de Radiotherapie, Paris (France); Franck, Didier; Aubert, Bernard [Institut de Radioprotection et de Surete Nucleaire, Fontenay-aux-Roses (France); Mazeron, Jean-Jacques [Groupe Hospitalier Pitie Salpetriere, Service de Radiotherapie, Paris (France)

    2013-02-01

Purpose: Current retrospective cardiovascular dosimetry studies are based on a representative patient or simple mathematic phantoms. Here, a process of patient modeling was developed to personalize the anatomy of the thorax and to include a heart model with coronary arteries. Methods and Materials: The patient models were hybrid computational phantoms (HCPs) with an inserted detailed heart model. A computed tomography (CT) acquisition (pseudo-CT) was derived from the HCP and imported into a treatment planning system where treatment conditions were reproduced. Six current patients were selected: 3 were modeled from their CT images (A patients) and the others were modeled from 2 orthogonal radiographs (B patients). The method's performance and limitations were investigated by quantitative comparison between the initial CT and the pseudo-CT; namely, the morphology and the dose calculations were compared. For the B patients, a comparison with 2 kinds of representative patients was also conducted. Finally, dose assessment was focused on the whole coronary artery tree and the left anterior descending coronary artery. Results: When 3-dimensional anatomic information was available, the dose calculations performed on the initial CT and the pseudo-CT were in good agreement. For the B patients, comparison of doses derived from HCPs and representative patients showed that the HCP doses were either better or equivalent. In the left breast radiation therapy context and for the studied cases, mean coronary doses were at least 5-fold higher than mean heart doses. Conclusions: For retrospective dose studies, it is suggested that HCPs offer a better surrogate, in terms of dose accuracy, than representative patients. The use of a detailed heart model eliminates the problem of identifying the coronaries on the patient's CT.

  8. Plant Closings and Capital Flight: A Computer-Assisted Simulation.

    Science.gov (United States)

    Warner, Stanley; Breitbart, Myrna M.

    1989-01-01

    A course at Hampshire College was designed to simulate the decision-making environment in which constituencies in a medium-sized city would respond to the closing and relocation of a major corporate plant. The project, constructed as a role simulation with a computer component, is described. (MLW)

  9. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and the results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the treatment-plan computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment plans, which require Monte Carlo dose calculations on all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected in the simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant
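
The reported optimum reflects a familiar cost trade-off: the Monte Carlo dose calculation scales down with the node count while dose reconstruction and per-node startup do not. A toy cost model with assumed timings (not the study's measured data) reproduces the shape of the curve:

    def plan_time(n_nodes, mc_time=3600.0, recon_time=900.0, overhead=40.0):
        # parallel Monte Carlo + fixed reconstruction + per-node startup cost
        return mc_time / n_nodes + recon_time + overhead * n_nodes

    for n in (1, 5, 10, 15, 20):
        print(n, round(plan_time(n)))   # diminishing returns beyond roughly 10 nodes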

  10. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and the results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of the treatment-plan computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing times in 4D treatment plans, which require Monte Carlo dose calculations on all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimal number of compute nodes selected in the simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  11. Computational plasticity algorithm for particle dynamics simulations

    Science.gov (United States)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2018-01-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
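
The contrast can be condensed to a few lines: an explicit, DEM-style update evaluates the contact force from the current configuration, while an implicit, contact-dynamics-style update solves for the end-of-step state. A sketch for one disc dropped on a rigid floor with an assumed spring-dashpot contact (all constants hypothetical):

    m, g, k, c, dt = 1.0, 9.81, 1.0e5, 200.0, 1.0e-4

    def force(x, v):
        if x >= 0.0:                      # airborne: gravity only
            return -m * g
        return -k * x - c * v - m * g     # penalty contact with viscous damping

    def explicit_step(x, v):              # DEM-style (symplectic Euler)
        v = v + dt * force(x, v) / m
        return x + dt * v, v

    def implicit_step(x, v):              # backward Euler via fixed-point iteration
        xn, vn = x, v
        for _ in range(50):
            vn = v + dt * force(xn, vn) / m
            xn = x + dt * vn
        return xn, vn

    x, v = 0.05, 0.0                      # initial gap above the floor, velocity
    for _ in range(20000):                # drop, bounce, and settle
        x, v = explicit_step(x, v)
    print(x)                              # close to the static overlap -m*g/k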

  12. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    Science.gov (United States)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  13. Topics in computer simulations of statistical systems

    International Nuclear Information System (INIS)

    Salvador, R.S.

    1987-01-01

Several computer simulations studying a variety of topics in statistical mechanics and lattice gauge theories are performed. The first study describes a Monte Carlo simulation performed on Ising systems defined on Sierpinski carpets of dimensions between one and four. The critical coupling and the exponent γ are measured as a function of dimension. The Ising gauge theory in d = 4 - epsilon, for epsilon → 0+, is then studied by performing a Monte Carlo simulation for the theory defined on fractals. A high-statistics Monte Carlo simulation for the three-dimensional Ising model is presented for lattices of sizes 8^3 to 44^3. All the data obtained agree completely, within statistical errors, with the forms predicted by finite-size scaling. Finally, a method to estimate numerically the partition function of statistical systems is developed
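
For reference, the workhorse behind such studies is the Metropolis single-spin-flip update. A compact 3D Ising sketch at a small lattice size (an illustration, not the production code of the thesis):

    import numpy as np

    rng = np.random.default_rng(4)
    L, beta = 8, 0.2216544            # inverse temperature near the 3D critical point
    s = rng.choice([-1, 1], size=(L, L, L))

    for sweep in range(200):
        for _ in range(L**3):
            i, j, k = rng.integers(0, L, 3)
            nb = (s[(i+1) % L, j, k] + s[(i-1) % L, j, k] +
                  s[i, (j+1) % L, k] + s[i, (j-1) % L, k] +
                  s[i, j, (k+1) % L] + s[i, j, (k-1) % L])
            dE = 2.0 * s[i, j, k] * nb            # energy cost of flipping the spin
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j, k] *= -1                  # Metropolis acceptance
    print(abs(s.mean()))                          # magnetization per spin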

  14. A computer simulation model to compute the radiation transfer of mountainous regions

    Science.gov (United States)

    Li, Yuguang; Zhao, Feng; Song, Rui

    2011-11-01

In mountainous regions, the radiometric signal recorded at the sensor depends on a number of factors such as sun angle, atmospheric conditions, surface cover type, and topography. In this paper, a computer simulation model of radiation transfer is designed and evaluated. This model implements Monte Carlo ray-tracing techniques and is specifically dedicated to the study of light propagation in mountainous regions. The radiative processes between sunlight and the objects within the mountainous region are realized by using forward Monte Carlo ray-tracing methods. The performance of the model is evaluated through detailed comparisons with the well-established 3D computer simulation model RGM (Radiosity-Graphics combined Model), based on the same scenes and identical spectral parameters, which show good agreement between the two models' results. By using the newly developed computer model, a series of typical mountainous scenes is generated to analyze the physical mechanism of mountainous radiation transfer. The results show that the effects of the adjacent slopes are important for deep valleys and particularly affect shadowed pixels, and that the topographic effect needs to be considered in mountainous terrain before accurate inferences can be made from remotely sensed data.
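
The shadowing ingredient of such a forward Monte Carlo scheme can be indicated in two dimensions: rays are marched from a surface point toward the sun, with a small angular jitter, and counted as blocked when they dip below the terrain. A toy sketch (the actual model is 3-D and treats the full radiative transfer):

    import numpy as np

    rng = np.random.default_rng(5)
    x = np.linspace(0.0, 10.0, 201)
    elev = 2.0 * np.sin(x)                 # toy ridge-and-valley terrain profile

    def lit_fraction(i, sun_elev_deg=20.0, n_rays=500):
        hits = 0
        for _ in range(n_rays):
            ang = np.deg2rad(sun_elev_deg + rng.normal(0.0, 0.5))  # solar disc jitter
            px, pz = x[i], elev[i]
            blocked = False
            while 0.0 <= px <= 10.0:       # march the ray toward the sun
                px += 0.05 * np.cos(ang)
                pz += 0.05 * np.sin(ang)
                if pz < np.interp(px, x, elev):
                    blocked = True
                    break
            hits += not blocked
        return hits / n_rays

    print(lit_fraction(i=95))              # a valley-bottom point: mostly shadowed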

  15. High performance computer code for molecular dynamics simulations

    International Nuclear Information System (INIS)

    Levay, I.; Toekesi, K.

    2007-01-01

Complete text of publication follows. Molecular Dynamics (MD) simulation is a widely used technique for modeling complicated physical phenomena. Since 2005 we have been developing an MD simulation code for PC computers. The computer code is written in the C++ object-oriented programming language. The aim of our work is twofold: a) to develop a fast computer code for the study of the random walk of guest atoms in a Be crystal, and b) to provide three-dimensional (3D) visualization of the particles' motion. In this case we mimic the motion of the guest atoms in the crystal (diffusion-type motion) and the motion of atoms in the crystal lattice (crystal deformation). Nowadays, it is common to use graphics devices for computationally intensive problems. There are several ways to use this extreme processing performance, but never before has it been so easy to program these devices. The CUDA (Compute Unified Device Architecture) introduced by nVidia Corporation in 2007 is very useful for every processor-hungry application. A unified-architecture GPU includes 96-128 or more stream processors, so the raw calculation performance is 576(!) GFLOPS, ten times faster than the fastest dual-core CPU [Fig. 1]. Our improved MD simulation software uses this new technology, which speeds up the code by a factor of 10 in the critical calculation segment. Although the GPU is a very powerful tool, it has a highly parallel structure. This means that we have to create an algorithm that works on several processors without deadlock. Our code currently uses 256 threads and shared and constant on-chip memory instead of global memory, which is about 100 times slower. It is possible to implement the entire algorithm on the GPU, so the data do not need to be downloaded and uploaded in every iteration. For maximal throughput, every thread runs the same instructions
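
For context, the kernel being accelerated is the pairwise force loop of an MD integrator. A small NumPy sketch of Lennard-Jones forces with velocity-Verlet integration (a stand-in only; the publication's code is C++ with CUDA and concerns guest atoms in a Be crystal):

    import numpy as np

    rng = np.random.default_rng(8)
    grid = np.arange(4) * 1.2                 # 4x4x4 atoms spaced near the LJ minimum
    pos = np.stack(np.meshgrid(grid, grid, grid), -1).reshape(-1, 3).astype(float)
    vel = rng.normal(0.0, 0.1, pos.shape)
    dt = 0.005

    def lj_forces(pos):
        d = pos[:, None, :] - pos[None, :, :]     # all pairwise separation vectors
        r2 = (d**2).sum(-1) + np.eye(len(pos))    # pad diagonal to avoid divide-by-zero
        inv6 = 1.0 / r2**3
        f = (24.0 * inv6 * (2.0 * inv6 - 1.0) / r2)[..., None] * d
        f[np.arange(len(pos)), np.arange(len(pos))] = 0.0   # remove self-interaction
        return f.sum(axis=1)

    f = lj_forces(pos)
    for _ in range(100):                          # velocity-Verlet time stepping
        vel += 0.5 * dt * f
        pos += dt * vel
        f = lj_forces(pos)
        vel += 0.5 * dt * f
    print(0.5 * (vel**2).sum())                   # kinetic energy after 100 steps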

  16. Understanding the requirements of self-expandable stents for heart valve replacement: Radial force, hoop force and equilibrium.

    Science.gov (United States)

    Cabrera, María Sol; Oomens, Cees W J; Baaijens, Frank P T

    2017-04-01

A proper interpretation of the forces developed during stent crimping and deployment is of paramount importance for a better understanding of the requirements for successful heart valve replacement. The present study combines experimental and computational methods to assess the performance of a nitinol stent for tissue-engineered heart valve implantation. To validate the stent model, the mechanical response to parallel plate compression and radial crimping was evaluated experimentally. Finite element simulations showed good agreement with the experimental findings. The computational models were further used to determine the hoop force on the stent and the radial force on a rigid tool during crimping and self-expansion. In addition, stent deployment against ovine and human pulmonary arteries was simulated to determine the hoop force on the stent-artery system and the equilibrium diameter for different degrees of oversizing. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. The value of flat-detector computed tomography during catheterisation of congenital heart disease

    International Nuclear Information System (INIS)

    Gloeckler, Martin; Koch, Andreas; Greim, Verena; Shabaiek, Amira; Dittrich, Sven; Rueffer, Andre; Cesnjevar, Robert; Achenbach, Stephan

    2011-01-01

To analyse the diagnostic utility of flat-detector computed tomography imaging (FD-CT) in patients with congenital heart disease, including the value of image fusion to overlay three-dimensional (3D) reconstructions on fluoroscopic images during catheter-based interventions, we retrospectively analysed 62 consecutive paediatric patients in whom FD-CT was used during catheterisation for congenital heart disease. Expert operators rated the clinical value of FD-CT over conventional fluoroscopic imaging. Added radiation exposure and contrast medium volume were evaluated. During a 12-month period, FD-CT was performed in 62 out of 303 cardiac catheterisations. Median patient age was 3.5 years. In 32/62 cases, FD-CT was used for diagnostic purposes; in 30/62 cases it was used in the context of interventions. Diagnostic utility was never rated as "misleading". It was classified as "not useful" in six cases (9.7%), "useful" in 18 cases (29.0%), "very useful" in 37 cases (59.7%) and "essential" in one case (1.6%). The median added dose-area product was 111.0 μGy·m², and the required additional quantity of contrast medium was 1.6 ml/kg. FD-CT provides useful diagnostic information in most patients investigated for congenital heart disease. The added radiation exposure and contrast medium volume are reasonable. (orig.)

  18. Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-09-16

The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.

  19. Computer simulation as an operational and training aid

    International Nuclear Information System (INIS)

    Lee, D.J.; Tottman-Trayner, E.

    1995-01-01

The paper describes how the rapid development of desktop computing power, the associated fall in prices, and the advancement of computer graphics technology driven by the entertainment industry have enabled the nuclear industry to achieve improvements in operation and training through the use of computer simulation. Applications are focused on the fuel handling operations at Torness Power Station, where visualization through computer modelling is being used to enhance operator awareness and to assist in a number of operational scenarios. It is concluded that there are significant benefits to be gained from the introduction of the facility at Torness as well as other locations. (author)

  20. Integration of adaptive process control with computational simulation for spin-forming

    International Nuclear Information System (INIS)

    Raboin, P. J. LLNL

    1998-01-01

Improvements in spin-forming capabilities through upgrades to a metrology and machine control system and advances in numerical simulation techniques were studied in a two-year project funded by Laboratory Directed Research and Development (LDRD) at Lawrence Livermore National Laboratory. Numerical analyses were benchmarked with spin-forming experiments, and computational speeds increased sufficiently to now permit actual part-forming simulations. Extensive modeling activities examined the simulation speeds and capabilities of several metal forming computer codes for modeling flat plate and cylindrical spin-forming geometries. Shape memory research created the first numerical model to describe this highly unusual deformation behavior in uranium alloys. A spin-forming metrology assessment led to sensor and data acquisition improvements that will facilitate future process accuracy enhancements, such as a metrology frame. Finally, software improvements (SmartCAM) to the manufacturing process numerically integrate the part models with the spin-forming process and with computational simulations

  1. Surgical resource utilization in urban terrorist bombing: a computer simulation.

    Science.gov (United States)

    Hirshberg, A; Stein, M; Walden, R

    1999-09-01

The objective of this study was to analyze the utilization of surgical staff and facilities during an urban terrorist bombing incident. A discrete-event computer model of the emergency room and related hospital facilities was constructed and implemented, based on accumulated data from 12 urban terrorist bombing incidents in Israel. The simulation predicts that the admitting capacity of the hospital depends primarily on the number of available surgeons and defines an optimal staff profile for surgeons, residents, and trauma nurses. The major bottlenecks in the flow of critical casualties are the shock rooms and the computed tomographic scanner but not the operating rooms. The simulation also defines the number of reinforcement staff needed to treat noncritical casualties and shows that radiology is the major obstacle to the flow of these patients. Computer simulation is an important new tool for the optimization of surgical service elements for a multiple-casualty situation.
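
The modeling style can be indicated in a few lines: a discrete-event loop driven by a priority queue of time-stamped events, with surgeons as a counted resource. A toy sketch with assumed arrival and treatment rates (the published model's casualty mix, staff profiles, and facility routing are far richer):

    import heapq, random

    random.seed(6)
    n_free = 4                             # assumed number of available surgeons
    queue, events, waits = [], [], []

    for k in range(30):                    # a burst of casualties
        heapq.heappush(events, (random.expovariate(1 / 5.0), "arrive", k))

    while events:
        t, kind, k = heapq.heappop(events)
        if kind == "arrive":
            queue.append((t, k))
        else:                              # a surgeon finishes and is freed
            n_free += 1
        while n_free and queue:            # start treatment when a surgeon is free
            t0, kk = queue.pop(0)
            n_free -= 1
            waits.append(t - t0)           # waiting time of this casualty
            heapq.heappush(events, (t + random.expovariate(1 / 40.0), "done", kk))

    print(sum(waits) / len(waits))         # mean wait reveals the admitting capacity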

  2. [The research on bidirectional reflectance computer simulation of forest canopy at pixel scale].

    Science.gov (United States)

    Song, Jin-Ling; Wang, Jin-Di; Shuai, Yan-Min; Xiao, Zhi-Qiang

    2009-08-01

Computer simulation uses computer graphics to generate a realistic 3D structural scene of vegetation and simulates the canopy radiation regime using the radiosity method. In the present paper, the authors extend the computer simulation model to simulate forest canopy bidirectional reflectance at the pixel scale. Usually, however, trees are complex structures: they are tall and have many branches, so hundreds of thousands or even millions of facets are needed to build a realistic structural scene for the forest, and it is difficult for the radiosity method to compute so many facets. In order to enable the radiosity method to simulate the forest scene at the pixel scale, the authors proposed to simplify the structure of the forest crowns by abstracting the crowns as ellipsoids. Based on the optical characteristics of the tree components and the characteristics of internal photon energy transport in a real crown, the authors assigned the optical characteristics of the ellipsoid surface facets. In the computer simulation of the forest, following the idea of geometric-optics models, the gap model is used to obtain the forest canopy bidirectional reflectance at the pixel scale. Comparing the computer simulation results with the GOMS model and Multi-angle Imaging SpectroRadiometer (MISR) multi-angle remote sensing data, the simulation results agree with the GOMS simulation results and the MISR BRF. Some problems, however, remain to be solved. The authors conclude that the study has important value for the application of multi-angle remote sensing and the inversion of vegetation canopy structure parameters.

  3. Computer simulations of the random barrier model

    DEFF Research Database (Denmark)

    Schrøder, Thomas; Dyre, Jeppe

    2002-01-01

A brief review of experimental facts regarding ac electronic and ionic conduction in disordered solids is given, followed by a discussion of what is perhaps the simplest realistic model, the random barrier model (symmetric hopping model). Results from large-scale computer simulations are presented...

  4. Evaluation of computed tomography coronary angiography in patients with a high heart rate using 16-slice spiral computed tomography with 0.37-s gantry rotation time

    International Nuclear Information System (INIS)

    Zhang, Shi-Zheng; Hu, Xiu-Hua; Zhang, Qiao-Wei; Huang, Wen-Xin

    2005-01-01

The aim of our study is to evaluate computed tomography (CT) coronary angiography in patients with a high heart rate using 16-slice spiral CT with 0.37-s gantry rotation time. We compare the image quality in patients whose heart rates were over 70 beats per minute (bpm) with that in patients whose heart rates were 70 bpm or less. Sixty patients with various heart rates underwent retrospectively ECG-gated multislice spiral CT (MSCT) coronary angiography. Two experienced observers who were blind to the heart rates of the patients evaluated all the MSCT coronary angiographic images and counted the assessable segments. A total of 620 out of 891 (69.6%) segments were satisfactorily visualized. On average, 10.3 coronary artery segments per patient could be evaluated. In the 36 patients whose heart rates were below 70 bpm [mean 62.2 bpm±5.32 (standard deviation, SD)], the number of assessable segments was 10.72±2.02 (SD). In the other 24 patients whose heart rates were above 70 bpm [mean 78.6 bpm±8.24 (SD)], the corresponding number was 9.75±1.74 (SD). No statistically significant difference was found between the two subgroups (t test, P>0.05). The new generation of 16-slice spiral CT with 0.37-s rotation time can satisfactorily evaluate the coronary arteries of patients with high heart rates (above 70 bpm, up to 102 bpm). (orig.)

  5. Technology computer aided design simulation for VLSI MOSFET

    CERN Document Server

    Sarkar, Chandan Kumar

    2013-01-01

    Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and

  6. Computer simulations of the mechanical properties of metals

    DEFF Research Database (Denmark)

    Schiøtz, Jakob; Vegge, Tejs

    1999-01-01

Atomic-scale computer simulations can be used to gain a better understanding of the mechanical properties of materials. In this paper we demonstrate how this can be done in the case of nanocrystalline copper, and give a brief overview of how simulations may be extended to larger length scales... Nanocrystalline metals are metals with grain sizes in the nanometre range; they have a number of technologically interesting properties such as greatly increased hardness and yield strength. Our simulations show that the deformation mechanisms are different in these materials than in coarse-grained materials...

  7. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool to investigate the types of applications and the computing resources required to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation agree with the performance expected from the L-BSP model.
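
The flavor of the underlying cost model can be shown with a BSP-style superstep estimate, where the makespan is the slowest local computation plus g·h for communication plus the barrier latency L. A sketch with assumed parameter values (an illustration, not SIMPAR itself):

    def superstep_time(work, msgs, g=0.5, L=100.0):
        # BSP cost: max local work + g * max message volume + barrier latency
        return max(work) + g * max(msgs) + L

    # three processors with uneven work and message volumes (hypothetical numbers)
    print(superstep_time(work=[900.0, 1100.0, 1000.0], msgs=[200, 350, 300]))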

  8. Optimizing Cognitive Load for Learning from Computer-Based Science Simulations

    Science.gov (United States)

    Lee, Hyunjeong; Plass, Jan L.; Homer, Bruce D.

    2006-01-01

    How can cognitive load in visual displays of computer simulations be optimized? Middle-school chemistry students (N = 257) learned with a simulation of the ideal gas law. Visual complexity was manipulated by separating the display of the simulations in two screens (low complexity) or presenting all information on one screen (high complexity). The…

  9. Definition, modeling and simulation of a grid computing system for high throughput computing

    CERN Document Server

    Caron, E; Tsaregorodtsev, A Yu

    2006-01-01

In this paper, we study and compare grid and global computing systems and outline the benefits of having a hybrid system called dirac. To evaluate the dirac scheduling for high throughput computing, a new model is presented and a simulator was developed for many clusters of heterogeneous nodes belonging to a local network. These clusters are assumed to be connected to each other through a global network, and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. Next, we compare with a real batch system and obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and describes well the behaviour of a large-scale system. Thus we can study the scheduling of our system called dirac in a high throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralize...
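
The analytical side of that M/M/4 validation is the classical Erlang C result for the mean response time of an M/M/c queue. A sketch of the textbook computation (an illustration, not the dirac simulator):

    from math import factorial

    def mmc_response_time(lam, mu, c=4):
        # Erlang C: probability an arrival waits, then mean wait + service time
        a, rho = lam / mu, lam / (c * mu)
        p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                    + a**c / (factorial(c) * (1.0 - rho)))
        erlang_c = a**c / (factorial(c) * (1.0 - rho)) * p0
        return erlang_c / (c * mu - lam) + 1.0 / mu

    print(mmc_response_time(lam=3.0, mu=1.0))   # 4 servers at 75% utilization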

  10. Sensitivity Analysis of Personal Exposure Assessment Using a Computer Simulated Person

    DEFF Research Database (Denmark)

    Brohus, Henrik; Jensen, H. K.

    2009-01-01

    The paper considers uncertainties related to personal exposure assessment using a computer simulated person. CFD is used to simulate a uniform flow field around a human being to determine the personal exposure to a contaminant source. For various vertical locations of a point contaminant source...... three additional factors are varied, namely the velocity, details of the computer simulated person, and the CFD model of the wind channel. The personal exposure is found to be highly dependent on the relative source location. Variation in the range of two orders of magnitude is found. The exposure...

  11. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. The...

  12. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    Science.gov (United States)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  13. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-01-01

Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was numerically simulated, and the bone remodeling simulation results were compared to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The locations of high and low predicted bone density were comparable to those of the actual specimen. High predicted bone density was greater than
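
The remodeling rule described can be condensed to a density update per element: the stimulus (strain-energy density per unit mass) is compared with a reference value and the apparent density adjusted accordingly. A sketch with assumed constants and a random stand-in for the FE strain-energy field (a real loop re-solves the FE model after every update):

    import numpy as np

    rng = np.random.default_rng(7)
    rho = np.full(1000, 0.8)                 # start from homogeneous density, g/cm^3
    U = rng.gamma(2.0, 0.002, rho.shape)     # element strain-energy density (stand-in
                                             # for the FE solution under applied loads)
    K_REF, B = 0.004, 1.0                    # reference stimulus and rate constant

    for _ in range(10):                      # the ten loading iterations
        stimulus = U / rho                   # strain energy per unit mass
        rho += B * (stimulus - K_REF)        # densify above reference, resorb below
        np.clip(rho, 0.01, 1.8, out=rho)     # physiological bounds on density
    print(rho.min(), rho.max())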

  14. Petascale molecular dynamics simulation using the fast multipole method on K computer

    KAUST Repository

    Ohno, Yousuke; Yokota, Rio; Koyama, Hiroshi; Morimoto, Gentaro; Hasegawa, Aki; Masumoto, Gen; Okimoto, Noriaki; Hirano, Yoshinori; Ibeid, Huda; Narumi, Tetsu; Taiji, Makoto

    2014-01-01

In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulations of crowded cellular environments, which are more realistic than conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with a cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  15. Petascale molecular dynamics simulation using the fast multipole method on K computer

    KAUST Repository

    Ohno, Yousuke

    2014-10-01

In this paper, we report all-atom simulations of molecular crowding - a result from the full node simulation on the "K computer", which is a 10-PFLOPS supercomputer in Japan. The capability of this machine enables us to perform simulations of crowded cellular environments, which are more realistic than conventional MD simulations where proteins are simulated in isolation. Living cells are "crowded" because macromolecules comprise ∼30% of their molecular weight. Recently, the effects of crowded cellular environments on protein stability have been revealed through in-cell NMR spectroscopy. To measure the performance of the "K computer", we performed all-atom classical molecular dynamics simulations of two systems: target proteins in a solvent, and target proteins in an environment of molecular crowders that mimic the conditions of a living cell. Using the full system, we achieved 4.4 PFLOPS during a 520 million-atom simulation with a cutoff of 28 Å. Furthermore, we discuss the performance and scaling of fast multipole methods for molecular dynamics simulations on the "K computer", as well as comparisons with Ewald summation methods. © 2014 Elsevier B.V. All rights reserved.

  16. Radiation dose management for pediatric cardiac computed tomography. A report from the Image Gently 'Have-A-Heart' campaign

    International Nuclear Information System (INIS)

    Rigsby, Cynthia K.; Sammet, Christina L.; McKenney, Sarah E.; Hill, Kevin D.; Chelliah, Anjali; Einstein, Andrew J.; Han, B.K.; Robinson, Joshua D.; Slesnick, Timothy C.; Frush, Donald P.

    2018-01-01

    Children with congenital or acquired heart disease can be exposed to relatively high lifetime cumulative doses of ionizing radiation from necessary medical imaging procedures including radiography, fluoroscopic procedures including diagnostic and interventional cardiac catheterizations, electrophysiology examinations, cardiac computed tomography (CT) studies, and nuclear cardiology examinations. Despite the clinical necessity of these imaging studies, the related ionizing radiation exposure could pose an increased lifetime attributable cancer risk. The Image Gently 'Have-A-Heart' campaign is promoting the appropriate use of medical imaging studies in children with congenital or acquired heart disease while minimizing radiation exposure. The focus of this manuscript is to provide a comprehensive review of radiation dose management and CT performance in children with congenital or acquired heart disease. (orig.)

  17. Radiation dose management for pediatric cardiac computed tomography. A report from the Image Gently 'Have-A-Heart' campaign

    Energy Technology Data Exchange (ETDEWEB)

    Rigsby, Cynthia K.; Sammet, Christina L. [Northwestern University Feinberg School of Medicine, Department of Medical Imaging 9, Ann and Robert H. Lurie Children's Hospital of Chicago, Departments of Radiology and Pediatrics, Chicago, IL (United States); McKenney, Sarah E. [Children's National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); Hill, Kevin D. [Duke University Medical Center, Department of Pediatrics, Durham, NC (United States); Chelliah, Anjali [Columbia University Medical Center and New York-Presbyterian Hospital, Division of Pediatric Cardiology, New York, NY (United States); Einstein, Andrew J. [Columbia University Medical Center and New York-Presbyterian Hospital, Division of Cardiology, Departments of Medicine and Radiology, New York, NY (United States); Han, B.K. [Children's Heart Clinic at The Children's Hospitals and Clinics of Minnesota, Department of Pediatrics, Minneapolis, MN (United States); Robinson, Joshua D. [Northwestern University Feinberg School of Medicine, Division of Pediatric Cardiology, Ann and Robert H. Lurie Children's Hospital of Chicago, Departments of Pediatrics and Radiology, Chicago, IL (United States); Slesnick, Timothy C. [Children's Healthcare of Atlanta, Department of Pediatrics, Emory University School of Medicine, Atlanta, GA (United States); Frush, Donald P. [Duke University Medical Center, Department of Radiology, Durham, NC (United States)

    2018-01-15

    Children with congenital or acquired heart disease can be exposed to relatively high lifetime cumulative doses of ionizing radiation from necessary medical imaging procedures including radiography, fluoroscopic procedures including diagnostic and interventional cardiac catheterizations, electrophysiology examinations, cardiac computed tomography (CT) studies, and nuclear cardiology examinations. Despite the clinical necessity of these imaging studies, the related ionizing radiation exposure could pose an increased lifetime attributable cancer risk. The Image Gently 'Have-A-Heart' campaign is promoting the appropriate use of medical imaging studies in children with congenital or acquired heart disease while minimizing radiation exposure. The focus of this manuscript is to provide a comprehensive review of radiation dose management and CT performance in children with congenital or acquired heart disease. (orig.)

  18. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments, and also remote education of specialists, postgraduates, and students.

  19. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

    In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed and its shortcomings and problems are emphasized. In continuation, the discrete event simulation modelling method, as basis of sugge...

  20. AFFECTIVE COMPUTING AND AUGMENTED REALITY FOR CAR DRIVING SIMULATORS

    Directory of Open Access Journals (Sweden)

    Dragoș Datcu

    2017-12-01

    Car simulators are essential for training and for analyzing the behavior, responses, and performance of the driver. Augmented Reality (AR) is the technology that enables virtual images to be overlaid on views of the real world. Affective Computing (AC) is the technology that helps read emotions by means of computer systems, by analyzing body gestures, facial expressions, speech, and physiological signals. The key aspect of the research lies in investigating novel interfaces that help build situational awareness and emotional awareness, to enable affect-driven remote collaboration in AR for car driving simulators. The problem addressed relates to how to build situational awareness (using AR technology) and emotional awareness (by AC technology), and how to integrate these two distinct technologies [4] into a unique affective framework for training in a car driving simulator.

  1. A computational model to generate simulated three-dimensional breast masses

    Energy Technology Data Exchange (ETDEWEB)

    Sisternes, Luis de; Brankov, Jovan G.; Zysk, Adam M.; Wernick, Miles N., E-mail: wernick@iit.edu [Medical Imaging Research Center, Department of Electrical and Computer Engineering, Illinois Institute of Technology, Chicago, Illinois 60616 (United States); Schmidt, Robert A. [Kurt Rossmann Laboratories for Radiologic Image Research, Department of Radiology, The University of Chicago, Chicago, Illinois 60637 (United States); Nishikawa, Robert M. [Department of Radiology, University of Pittsburgh, Pittsburgh, Pennsylvania 15213 (United States)

    2015-02-15

    Purpose: To develop algorithms for creating realistic three-dimensional (3D) simulated breast masses and embedding them within actual clinical mammograms. The proposed techniques yield high-resolution simulated breast masses having randomized shapes, with user-defined mass type, size, location, and shape characteristics. Methods: The authors describe a method of producing 3D digital simulations of breast masses and a technique for embedding these simulated masses within actual digitized mammograms. Simulated 3D breast masses were generated by using a modified stochastic Gaussian random sphere model to generate a central tumor mass, and an iterative fractal branching algorithm to add complex spicule structures. The simulated masses were embedded within actual digitized mammograms. The authors evaluated the realism of the resulting hybrid phantoms by generating corresponding left- and right-breast image pairs, consisting of one breast image containing a real mass, and the opposite breast image of the same patient containing a similar simulated mass. The authors then used computer-aided diagnosis (CAD) methods and expert radiologist readers to determine whether significant differences can be observed between the real and hybrid images. Results: The authors found no statistically significant difference between the CAD features obtained from the real and simulated images of masses with either spiculated or nonspiculated margins. Likewise, the authors found that expert human readers performed very poorly in discriminating their hybrid images from real mammograms. Conclusions: The authors’ proposed method permits the realistic simulation of 3D breast masses having user-defined characteristics, enabling the creation of a large set of hybrid breast images containing a well-characterized mass, embedded within real breast background. The computational nature of the model makes it suitable for detectability studies, evaluation of computer aided diagnosis algorithms, and

  2. A computational model to generate simulated three-dimensional breast masses

    International Nuclear Information System (INIS)

    Sisternes, Luis de; Brankov, Jovan G.; Zysk, Adam M.; Wernick, Miles N.; Schmidt, Robert A.; Nishikawa, Robert M.

    2015-01-01

    Purpose: To develop algorithms for creating realistic three-dimensional (3D) simulated breast masses and embedding them within actual clinical mammograms. The proposed techniques yield high-resolution simulated breast masses having randomized shapes, with user-defined mass type, size, location, and shape characteristics. Methods: The authors describe a method of producing 3D digital simulations of breast masses and a technique for embedding these simulated masses within actual digitized mammograms. Simulated 3D breast masses were generated by using a modified stochastic Gaussian random sphere model to generate a central tumor mass, and an iterative fractal branching algorithm to add complex spicule structures. The simulated masses were embedded within actual digitized mammograms. The authors evaluated the realism of the resulting hybrid phantoms by generating corresponding left- and right-breast image pairs, consisting of one breast image containing a real mass, and the opposite breast image of the same patient containing a similar simulated mass. The authors then used computer-aided diagnosis (CAD) methods and expert radiologist readers to determine whether significant differences can be observed between the real and hybrid images. Results: The authors found no statistically significant difference between the CAD features obtained from the real and simulated images of masses with either spiculated or nonspiculated margins. Likewise, the authors found that expert human readers performed very poorly in discriminating their hybrid images from real mammograms. Conclusions: The authors’ proposed method permits the realistic simulation of 3D breast masses having user-defined characteristics, enabling the creation of a large set of hybrid breast images containing a well-characterized mass, embedded within real breast background. The computational nature of the model makes it suitable for detectability studies, evaluation of computer aided diagnosis algorithms, and
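
    The central-mass step of such a model can be illustrated compactly. The toy below voxelizes a randomized closed surface whose log-radius is perturbed by low-order spherical harmonics, a simplified stand-in for the stochastic Gaussian random sphere model; the fractal spicule-branching stage is omitted and all parameters are assumed.

    ```python
    import numpy as np
    from scipy.special import sph_harm

    # Voxelize a randomized blob with radius r(theta, phi) = r0 * exp(h),
    # where h is a random low-order spherical-harmonic series -- a simplified
    # stand-in for the stochastic Gaussian random sphere model; the fractal
    # spicule stage is omitted and every parameter below is assumed.

    rng = np.random.default_rng(42)
    N = 64                                  # voxel grid size (assumed)
    r0 = 0.5                                # mean radius, half-grid units
    sigma = 0.15                            # perturbation strength (assumed)

    ax = np.linspace(-1.0, 1.0, N)
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    r = np.sqrt(x**2 + y**2 + z**2) + 1e-12
    theta = np.arctan2(y, x) % (2 * np.pi)      # azimuth in [0, 2*pi)
    phi = np.arccos(np.clip(z / r, -1.0, 1.0))  # polar angle in [0, pi]

    h = np.zeros_like(r)
    for l in range(1, 5):                   # low-order harmonics only
        for m in range(-l, l + 1):
            c = rng.normal(scale=sigma / (l + 1))
            h += c * np.real(sph_harm(m, l, theta, phi))

    mask = r <= r0 * np.exp(h)              # voxels inside the random surface
    print("simulated mass volume:", int(mask.sum()), "voxels")
    ```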

  3. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations of 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, a free visualization software package, was then used to convert the simulation results for loading into the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of the original size) after image processing by ParaView. This increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Magnification, translation, rotation, and selection of translucency (in 10 levels) for each element were performed smoothly and easily using one or two fingers. The requisite skill to control the iPad software smoothly was acquired within an average of 1.8 trials by 12 medical students and 6 neurosurgical residents.
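
    The ParaView conversion step in this pipeline is scriptable. A minimal sketch, assuming ParaView's pvpython interpreter and hypothetical file names (the record does not give the actual inputs or filter chain):

    ```python
    # Run with ParaView's pvpython. A sketch of the kind of conversion step
    # described above: load a simulation result and write a lighter-weight
    # polygonal file for a mobile viewer. File names are hypothetical.
    from paraview.simple import OpenDataFile, ExtractSurface, SaveData

    src = OpenDataFile("presurgical_simulation.vtu")  # hypothetical input
    surf = ExtractSurface(Input=src)                  # keep only surfaces
    SaveData("for_tablet.vtp", proxy=surf)            # compact VTK PolyData
    ```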

  4. The null-event method in computer simulation

    International Nuclear Information System (INIS)

    Lin, S.L.

    1978-01-01

    The simulation of collisions of ions moving under the influence of an external field through a neutral gas at non-zero temperatures is discussed as an example of computer models of processes in which a probe particle undergoes a series of interactions with an ensemble of other particles, such that the frequency and outcome of the events depend on internal properties of those other particles. The introduction of null events removes the need for much complicated algebra, leads to a more efficient simulation, and reduces the likelihood of logical error. (Auth.)
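
    The null-event idea is easiest to see in code: bound the true, state-dependent collision frequency nu(v) by a constant nu_max, sample candidate events at the constant rate, and accept each candidate as a real collision with probability nu(v)/nu_max. The 1-D ion, the field, and the nu(v) below are invented toys, not the paper's model.

    ```python
    import numpy as np

    # Null-event (null-collision) sampling. Rejected "null" events change
    # nothing, so no state-dependent algebra for event times is needed.

    rng = np.random.default_rng(1)
    accel = 1.0                      # q/m * E, toy units

    def nu(v):
        return 0.2 + 0.8 * abs(v) / (1.0 + abs(v))   # hypothetical nu(v)

    nu_max = 1.0                     # must bound nu(v) for every v
    v = t = v_dt_sum = 0.0
    real = 0
    for _ in range(100_000):
        dt = rng.exponential(1.0 / nu_max)        # time to next candidate
        v_dt_sum += (v + 0.5 * accel * dt) * dt   # time-weighted velocity
        v += accel * dt                           # free flight in the field
        t += dt
        if rng.random() < nu(v) / nu_max:         # accept as real collision
            real += 1
            v = rng.normal(scale=0.3)             # resample from gas (toy)

    print(f"mean drift velocity ~ {v_dt_sum / t:.3f}, real collisions: {real}")
    ```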

  5. Computational fluid dynamics for sport simulation

    CERN Document Server

    2009-01-01

    All over the world sport plays a prominent role in society: as a leisure activity for many, as an ingredient of culture, as a business and as a matter of national prestige in such major events as the World Cup in soccer or the Olympic Games. Hence, it is not surprising that science has entered the realm of sports, and, in particular, that computer simulation has become highly relevant in recent years. This is explored in this book by choosing five different sports as examples, demonstrating that computational science and engineering (CSE) can make essential contributions to research on sports topics on both the fundamental level and, eventually, by supporting athletes’ performance.

  6. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle fluxes and of spacecraft orientation are discussed. A computer engineering model for calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and representation of results based on the Virtual Reality Modeling Language (VRML).

  7. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle fluxes and of spacecraft orientation are discussed. A computer engineering model for calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and representation of results based on the Virtual Reality Modeling Language (VRML).
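
    The flavor of such a "simple method" can be conveyed with a textbook current-balance estimate (a generic simplification, not necessarily the authors' model): the floating potential is the value at which the retarded electron thermal current equals the ion ram current. All plasma values below are assumed.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Floating potential of a surface in LEO from a current balance:
    # retarded electron thermal current = ion ram current. A textbook-style
    # estimate with assumed plasma values, not the authors' model.

    e, k = 1.602e-19, 1.381e-23            # charge [C], Boltzmann [J/K]
    m_e = 9.109e-31                         # electron mass [kg]
    n, Te = 1.0e11, 1500.0                  # density [m^-3], Te [K] (assumed)
    v_orb, A = 7600.0, 1.0                  # orbital speed [m/s], area [m^2]

    I_e0 = e * n * A * np.sqrt(k * Te / (2.0 * np.pi * m_e))  # thermal e-
    I_i = e * n * A * v_orb                                    # ram ions

    def net_current(phi):                   # phi < 0 retards electrons
        return I_i - I_e0 * np.exp(e * phi / (k * Te))

    phi_f = brentq(net_current, -5.0, 0.0)
    print(f"floating potential ~ {phi_f:.3f} V")
    ```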

  8. Computational fluid dynamics (CFD) simulation of hot air flow ...

    African Journals Online (AJOL)

    A Computational Fluid Dynamics simulation of air flow distribution, air velocity, and pressure field pattern, as they affect moisture transients in a cabinet tray dryer, is performed using the SolidWorks Flow Simulation (SWFS) 2014 SP 4.0 program. The model used for the drying process in this experiment was designed with Solid ...

  9. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    Science.gov (United States)

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  10. Monte Carlo simulation with the Gate software using grid computing

    International Nuclear Information System (INIS)

    Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.

    2009-03-01

    Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations that need many replicates to obtain good statistical results can easily be executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in generating the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application, thanks to the EGEE Grid (Enabling Grids for E-sciencE), achieving in one week computations that would have taken more than 3 years of processing on a single computer. This work was achieved thanks to a generic object-oriented toolbox called DistMe, which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for random number generator statuses. (authors)
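
    The 'Multiple Replications In Parallel' approach hinges on giving every replicate an independent, non-overlapping pseudo-random stream. A minimal sketch using NumPy's SeedSequence spawning, with a toy photon-counting tally standing in for a real GATE run:

    ```python
    import numpy as np
    from multiprocessing import Pool

    # Each replicate gets an independent, collision-free pseudo-random
    # stream via SeedSequence.spawn; the toy "detected photon" tally below
    # merely stands in for a real GATE replicate.

    N_PHOTONS = 1_000_000

    def replicate(seed_seq):
        rng = np.random.default_rng(seed_seq)
        return int((rng.random(N_PHOTONS) < 0.05).sum())  # toy detections

    if __name__ == "__main__":
        streams = np.random.SeedSequence(2024).spawn(8)   # 8 replicates
        with Pool(processes=8) as pool:
            tallies = pool.map(replicate, streams)
        print("per-replicate tallies:", tallies)
        print("pooled detection estimate:", sum(tallies) / (8 * N_PHOTONS))
    ```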

  11. A simulation of T-wave alternans vectocardiographic representation performed by changing the ventricular heart cells action potential duration.

    Science.gov (United States)

    Janusek, D; Kania, M; Zaczek, R; Zavala-Fernandez, H; Maniewski, R

    2014-04-01

    The presence of T-wave alternans (TWA) in surface ECG signals has been recognized as a marker of electrical instability and is hypothesized to identify patients at increased risk for ventricular arrhythmias. In this paper we present a TWA simulation study. The TWA phenomenon was simulated by changing the duration of the ventricular heart cells' action potential. The TWA magnitude was calculated in the surface ECG using a time-domain method. Both spatially concordant TWA, where during one heart beat all ventricular cells display a short-duration action potential and during the next beat they exhibit a long-duration action potential, and discordant TWA, where at least one region is out of phase, were simulated. The vectocardiographic representation was employed. The obtained results showed a high level of T-loop pattern and location disturbances connected to the discordant TWA simulation, in contrast to the concordant one. This result may be explained by the spatial heterogeneity of the ventricular repolarization process, which could be higher for discordant TWA than for concordant TWA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
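
    The time-domain TWA magnitude amounts to comparing T-wave amplitudes on even versus odd beats. A small sketch on synthetic beat amplitudes with a known 40 uV alternans (illustrative only, not the paper's pipeline):

    ```python
    import numpy as np

    # Time-domain TWA index: |mean T amplitude of even beats - odd beats|.
    # Synthetic beats with a known 40 uV ABAB alternans allow a self-check;
    # a real analysis would use amplitudes measured from the ECG.

    rng = np.random.default_rng(7)
    n_beats = 64
    t_amp = 300.0 + 20.0 * rng.normal(size=n_beats)     # baseline [uV]
    t_amp += np.where(np.arange(n_beats) % 2 == 0, 20.0, -20.0)

    twa = abs(t_amp[0::2].mean() - t_amp[1::2].mean())
    print(f"estimated TWA magnitude: {twa:.1f} uV (true value: 40 uV)")
    ```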

  12. Simulation and computation in health physics training

    International Nuclear Information System (INIS)

    Lakey, S.R.A.; Gibbs, D.C.C.; Marchant, C.P.

    1980-01-01

    The Royal Naval College has devised a number of computer-aided learning programmes applicable to health physics, which include radiation shield design and optimisation, the environmental impact of a reactor accident, exposure levels produced by an inert radioactive gas cloud, and the prediction of radiation detector response in various radiation field conditions. Analogue computers are used on reduced or fast time scales because time-dependent phenomena are not always easily assimilated in real time. The build-up and decay of fission products, the dynamics of intake of radioactive material, and reactor accident dynamics can be effectively simulated. It is essential to relate these simulations to real time, and the College applies a research reactor and an analytical phantom to this end. A special feature of the reactor is a chamber which can be supplied with Argon-41 from reactor exhaust gases to create a realistic gaseous contamination environment. Reactor accident situations are also taught by using role-playing sequences carried out in real time in the emergency facilities associated with the research reactor. These facilities are outlined and the training technique is illustrated with examples of the calculations and simulations. The training needs of the future are discussed, with emphasis on optimisation and cost-benefit analysis. (H.K.)

  13. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2012-11-01

    Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated extraction of relevant features. Finally, we intend to share preliminary computer simulation issues.

  14. GEANT4 simulations for Proton computed tomography applications

    International Nuclear Information System (INIS)

    Yevseyeva, Olga; Assis, Joaquim T. de; Evseev, Ivan; Schelin, Hugo R.; Shtejer Diaz, Katherin; Lopes, Ricardo T.

    2011-01-01

    Proton radiation therapy is a highly precise form of cancer treatment. In existing proton treatment centers, dose calculations are performed based on X-ray computed tomography (CT). Alternatively, one could image the tumor directly with proton CT (pCT). Proton beams in medical applications deal with relatively thick targets like the human head or trunk. Thus, the fidelity of pCT simulations as a tool for proton therapy planning depends in the general case on the accuracy of results obtained for proton interaction with thick absorbers. GEANT4 simulations of proton energy spectra after passing through thick absorbers do not agree well with existing experimental data, as shown previously. The spectra simulated for the Bethe-Bloch domain showed an unexpected sensitivity to the choice of low-energy electromagnetic models during code execution. These observations were made with GEANT4 version 8.2 during our simulations for pCT. This work describes in more detail the simulations of proton passage through gold absorbers of varied thickness. The simulations were done by modifying only the geometry in the Hadron Therapy Example, and for all available choices of the electromagnetic physics models. As the most probable reasons for these effects are some specific feature in the code or some specific implicit parameters in the GEANT4 manual, we continued our study with version 9.2 of the code. Some improvements in comparison with our previous results were obtained. The simulations were performed considering further applications for pCT development. The authors want to thank CNPq, CAPES and 'Fundacao Araucaria' for financial support of this work. (Author)

  15. Influence of heart rate on image quality of 64-slice spiral computed coronary angiography and optimization on reconstruction of phase window

    International Nuclear Information System (INIS)

    Luo Xuemao; Lan Yong; Li Wei; Long Wansheng; Zhang Chaotong; Zhong Xiangyang; Yi Lan

    2009-01-01

    Objective: To evaluate the influence of heart rate on the image quality of 64-slice spiral computed coronary angiography (MSCTCA) and to optimize the image reconstruction window. Methods: According to heart rate, 86 patients were classified into groups: group A, heart rate ≤60 beats per minute (bpm); group B, 61-70 bpm; group C, 71-80 bpm; and group D, >80 bpm. The image quality of MSCTCA was scored in 5 grades, from 1 to 5, according to heart motion artifact. The influences of heart rate and reconstruction phase on the image quality of MSCTCA were evaluated. Results: The average heart rate was 64.4 ± 10.1 bpm. Diagnostic image quality (score >3) was attained in 277 of 344 segments at the best reconstruction interval. There was a significant correlation between average heart rate and image quality, but there was no difference between relative-delay (%) reconstruction and absolute-delay (ms) reconstruction in image quality. Conclusion: Reducing the average heart rate is beneficial for improving image quality. (authors)

  16. Artificial heart for humanoid robot

    Science.gov (United States)

    Potnuru, Akshay; Wu, Lianjun; Tadesse, Yonas

    2014-03-01

    A soft robotic device inspired by the pumping action of a biological heart is presented in this study. Developing an artificial heart for a humanoid robot enables us to make a better biomedical device for ultimate use in humans. As technology continues to advance, the means by which we can implement high-performance, biomimetic artificial organs draw nearer each day. In this paper, we present the design and development of a soft artificial heart that can be used in a humanoid robot to simulate the functions of a human heart using shape memory alloy technology. The robotic heart is designed to pump a blood-like fluid to parts of the robot such as the face, to simulate blushing or anger, by the use of elastomeric substrates and features for the transport of fluids.

  17. Computational biomechanics for medicine from algorithms to models and applications

    CERN Document Server

    Joldes, Grand; Nielsen, Poul; Doyle, Barry; Miller, Karol

    2017-01-01

    This volume comprises the latest developments in both fundamental science and patient-specific applications, discussing topics such as: cellular mechanics; injury biomechanics; biomechanics of heart and vascular system; medical image analysis; and both patient-specific fluid dynamics and solid mechanics simulations. With contributions from researchers world-wide, the Computational Biomechanics for Medicine series of titles provides an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements.

  18. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter R.; de Jong, Ton

    1992-01-01

    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  19. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, with remote ship scenarios and automation of ship operations.

  20. 3D Fluid-Structure Interaction Simulation of Aortic Valves Using a Unified Continuum ALE FEM Model

    Directory of Open Access Journals (Sweden)

    Jeannette H. Spühler

    2018-04-01

    Due to advances in medical imaging, computational fluid dynamics algorithms and high performance computing, computer simulation is developing into an important tool for understanding the relationship between cardiovascular diseases and intraventricular blood flow. The field of cardiac flow simulation is challenging and highly interdisciplinary. We apply a computational framework for automated solutions of partial differential equations using Finite Element Methods where any mathematical description directly can be translated to code. This allows us to develop a cardiac model where specific properties of the heart such as fluid-structure interaction of the aortic valve can be added in a modular way without extensive efforts. In previous work, we simulated the blood flow in the left ventricle of the heart. In this paper, we extend this model by placing prototypes of both a native and a mechanical aortic valve in the outflow region of the left ventricle. Numerical simulation of the blood flow in the vicinity of the valve offers the possibility to improve the treatment of aortic valve diseases as aortic stenosis (narrowing of the valve opening) or regurgitation (leaking) and to optimize the design of prosthetic heart valves in a controlled and specific way. The fluid-structure interaction and contact problem are formulated in a unified continuum model using the conservation laws for mass and momentum and a phase function. The discretization is based on an Arbitrary Lagrangian-Eulerian space-time finite element method with streamline diffusion stabilization, and it is implemented in the open source software Unicorn which shows near optimal scaling up to thousands of cores. Computational results are presented to demonstrate the capability of our framework.

  1. 3D Fluid-Structure Interaction Simulation of Aortic Valves Using a Unified Continuum ALE FEM Model.

    Science.gov (United States)

    Spühler, Jeannette H; Jansson, Johan; Jansson, Niclas; Hoffman, Johan

    2018-01-01

    Due to advances in medical imaging, computational fluid dynamics algorithms and high performance computing, computer simulation is developing into an important tool for understanding the relationship between cardiovascular diseases and intraventricular blood flow. The field of cardiac flow simulation is challenging and highly interdisciplinary. We apply a computational framework for automated solutions of partial differential equations using Finite Element Methods where any mathematical description directly can be translated to code. This allows us to develop a cardiac model where specific properties of the heart such as fluid-structure interaction of the aortic valve can be added in a modular way without extensive efforts. In previous work, we simulated the blood flow in the left ventricle of the heart. In this paper, we extend this model by placing prototypes of both a native and a mechanical aortic valve in the outflow region of the left ventricle. Numerical simulation of the blood flow in the vicinity of the valve offers the possibility to improve the treatment of aortic valve diseases as aortic stenosis (narrowing of the valve opening) or regurgitation (leaking) and to optimize the design of prosthetic heart valves in a controlled and specific way. The fluid-structure interaction and contact problem are formulated in a unified continuum model using the conservation laws for mass and momentum and a phase function. The discretization is based on an Arbitrary Lagrangian-Eulerian space-time finite element method with streamline diffusion stabilization, and it is implemented in the open source software Unicorn which shows near optimal scaling up to thousands of cores. Computational results are presented to demonstrate the capability of our framework.
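
    The core of the unified continuum formulation, one momentum balance with a phase function selecting the constitutive behavior, can be shown in miniature. The sketch below blends a Newtonian fluid stress with a neo-Hookean-like solid stress at a single material point; it illustrates the formulation only, both moduli are assumed, and it is unrelated to the Unicorn implementation.

    ```python
    import numpy as np

    # One Cauchy stress for the whole domain: theta = 1 marks fluid,
    # theta = 0 marks structure, and the stress is blended accordingly.
    # Material constants are assumed, blood-like / soft-tissue-like toys.

    mu_f = 3.5e-3      # fluid dynamic viscosity [Pa s] (assumed)
    mu_s = 1.0e5       # solid shear modulus [Pa] (assumed)

    def cauchy_stress(theta, grad_v, F, p):
        """theta: phase; grad_v: velocity gradient; F: deformation gradient."""
        I = np.eye(3)
        sigma_fluid = mu_f * (grad_v + grad_v.T) - p * I   # Newtonian
        B = F @ F.T                                        # left Cauchy-Green
        sigma_solid = mu_s * (B - I) - p * I               # neo-Hookean-like
        return theta * sigma_fluid + (1.0 - theta) * sigma_solid

    grad_v = np.array([[0.0, 100.0, 0.0],                  # simple shear flow
                       [0.0,   0.0, 0.0],
                       [0.0,   0.0, 0.0]])
    F = np.eye(3) + 0.01 * np.outer([1, 0, 0], [0, 1, 0])  # small solid shear
    print(cauchy_stress(0.8, grad_v, F, p=1.0e4).round(2))
    ```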

  2. Bending and Twisting the Embryonic Heart: A Computational Model for C-Looping Based on Realistic Geometry

    Directory of Open Access Journals (Sweden)

    Yunfei Shi

    2014-08-01

    The morphogenetic process of cardiac looping transforms the straight heart tube into a curved tube that resembles the shape of the future four-chambered heart. Although great progress has been made in identifying the molecular and genetic factors involved in looping, the physical mechanisms that drive this process have remained poorly understood. Recent work, however, has shed new light on this complicated problem. After briefly reviewing the current state of knowledge, we propose a relatively comprehensive hypothesis for the mechanics of the first phase of looping, termed c-looping, as the straight heart tube deforms into a c-shaped tube. According to this hypothesis, differential hypertrophic growth in the myocardium supplies the main forces that cause the heart tube to bend ventrally, while regional growth and contraction in the omphalomesenteric veins (primitive atria) and compressive loads exerted by the splanchnopleuric membrane drive rightward torsion. A computational model based on realistic embryonic heart geometry is used to test this hypothesis. The behavior of the model is in reasonable agreement with available experimental data from control and perturbed embryos, offering support for our hypothesis. The results also suggest, however, that several other mechanisms contribute secondarily to normal looping, and we speculate that these mechanisms play backup roles when looping is perturbed. Finally, some outstanding questions are discussed for future study.

  3. Computer simulation of the natural U 238 and U 235 radioactive series decay

    International Nuclear Information System (INIS)

    Barna, A.; Oncescu, M.

    1980-01-01

    The principles of computer simulation of radionuclide decay - the adoption and codification of the decay scheme - and the principle of adopting a radionuclide chain in a series are applied to the computer simulation of the decay of the natural U-238 and U-235 series radionuclides. Using the computer simulation data for the adopted chains of these two series, the decay characteristic quantities of the series radionuclides, the gamma spectra, and the basic characteristics of each series are determined and compared with the experimental values given in the literature. (author)
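
    Series decay of this kind is governed by the Bateman equations, which can be integrated directly as the linear system dN/dt = A N. A sketch for the first three members of the U-238 chain, with half-lives from standard tables (the paper's decay-scheme codification is not reproduced here):

    ```python
    import numpy as np
    from scipy.linalg import expm

    # dN/dt = A N for a linear decay chain: decay constants on the diagonal,
    # parent -> daughter feeds below it. Chain members and half-lives (U-238,
    # Th-234, Pa-234m; seconds) are from standard tables; this illustrates
    # the series bookkeeping only.

    half_lives = np.array([1.41e17, 2.08e6, 70.2])
    lam = np.log(2.0) / half_lives

    A = np.diag(-lam)
    A[1, 0] = lam[0]      # U-238 -> Th-234
    A[2, 1] = lam[1]      # Th-234 -> Pa-234m

    N0 = np.array([1.0e20, 0.0, 0.0])       # pure U-238 at t = 0
    for t in (1e5, 1e6, 1e7, 1e8):          # seconds
        N = expm(A * t) @ N0                # exact solution of the chain
        print(f"t = {t:.0e} s, activities [Bq]:", np.round(lam * N, 3))
    ```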

  4. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The joint meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaborations in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  5. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, Robert [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Matzen, M. Keith [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC’s business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive

  6. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL]

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational meshes for computational fluid dynamics simulations of departure from nucleate boiling (DNB) in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
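
    The Grid Convergence Index evaluation mentioned above follows the standard three-grid Roache procedure. A sketch with made-up quantity-of-interest values, purely to exercise the formulas:

    ```python
    import math

    # Three-grid Grid Convergence Index (Roache). f1 is the finest-grid
    # value of the quantity of interest, r the constant refinement ratio.
    # The sample values are made up purely to exercise the formulas.

    f1, f2, f3 = 6.63, 6.57, 6.38   # fine, medium, coarse (assumed)
    r = 2.0                          # refinement ratio
    Fs = 1.25                        # safety factor for a three-grid study

    p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)  # observed order
    e21 = abs((f1 - f2) / f1)                               # relative error
    gci_fine = Fs * e21 / (r**p - 1.0)

    print(f"observed order p = {p:.2f}")
    print(f"fine-grid GCI    = {100 * gci_fine:.2f}%")
    ```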

  7. Research on integrated simulation of fluid-structure system by computation science techniques

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1996-01-01

    At the Power Reactor and Nuclear Fuel Development Corporation, research on the integrated simulation of fluid-structure systems by computational science techniques has been carried out. Through its achievements, the verification of plant systems, which has depended on large-scale experiments, is to be replaced by computational science techniques, with the aim of reducing development costs and attaining the optimization of FBR systems. For this purpose, it is necessary to establish the technology for integrally and accurately analyzing complicated phenomena (simulation technology), the technology for applying it to large-scale problems (speed-increasing technology), and the technology for assuring the reliability of the results of analysis when simulation technology is utilized for the permission and approval of FBRs (verifying technology). The simulation of fluid-structure interaction, heat-flow simulation in spaces with complicated forms, and the related technologies are explained. As utilizations of computational science techniques, the elucidation of phenomena by numerical experiment and numerical simulation as a substitute for tests are discussed. (K.I.)

  8. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  9. Computer simulation of gain fluctuations in proportional counters

    International Nuclear Information System (INIS)

    Demir, Nelgun; Tapan, Ilhan

    2004-01-01

    A computer simulation code has been developed in order to examine the fluctuation in gas amplification in wire proportional counters, which are common in detector applications in particle physics experiments. The magnitude of the variance in the gain dominates the statistical portion of the energy resolution. In order to compare simulation and experimental results, the gain and its variation have been calculated numerically for the well-known ALEPH Inner Tracking Detector geometry. The results show that the bias voltage has a strong influence on the variance in the gain. The simulation calculations are in good agreement with experimental results. (authors)
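
    Avalanche gain fluctuation in proportional counters is commonly modeled with a Polya (gamma-distributed) single-electron gain, whose relative variance f = 1/(1+theta) sets the statistical term of the energy resolution. The sketch below uses that generic model with illustrative values; the record does not state which model the code implements.

    ```python
    import numpy as np

    # Polya-distributed single-electron avalanche gain: a gamma law with
    # shape k = 1 + theta and mean G has relative variance f = 1/(1 + theta).
    # G and theta below are illustrative values only.

    rng = np.random.default_rng(3)
    G = 2.0e4                # mean avalanche gain (assumed)
    theta = 0.5              # Polya shape parameter (assumed)

    k = 1.0 + theta
    gains = rng.gamma(shape=k, scale=G / k, size=200_000)

    f = gains.var() / gains.mean() ** 2
    print(f"relative gain variance f = {f:.3f} (theory {1.0 / k:.3f})")

    # For n primary electrons, the avalanche-statistics contribution to the
    # pulse-height variance scales as f / n.
    ```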

  10. Accelerating Climate Simulations Through Hybrid Computing

    Science.gov (United States)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  11. Simulation of electronic structure Hamiltonians in a superconducting quantum computer architecture

    Energy Technology Data Exchange (ETDEWEB)

    Kaicher, Michael; Wilhelm, Frank K. [Theoretical Physics, Saarland University, 66123 Saarbruecken (Germany); Love, Peter J. [Department of Physics, Haverford College, Haverford, Pennsylvania 19041 (United States)

    2015-07-01

    Quantum chemistry has become one of the most promising applications within the field of quantum computation. Simulating the electronic structure Hamiltonian (ESH) in the Bravyi-Kitaev (BK) basis to compute the ground state energies of atoms/molecules reduces the number of qubit operations needed to simulate a single fermionic operation to O(log(n)), as compared to O(n) in the Jordan-Wigner transformation. In this work we present the details of the BK transformation, show an example of implementation in a superconducting quantum computer architecture, and compare it to the most recent quantum chemistry algorithms, suggesting a constant overhead.
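
    The locality difference between the two encodings can be checked with the open-source OpenFermion package (an assumption of this sketch; it is not part of the cited work). A hopping term between distant modes costs a Pauli string of weight O(n) under Jordan-Wigner but only O(log n) under Bravyi-Kitaev:

    ```python
    # Assumes the open-source OpenFermion package (pip install openfermion);
    # it is not part of the cited work. A hopping term between modes 0 and
    # 15 of a 16-mode register is encoded both ways and the heaviest Pauli
    # string is reported.
    from openfermion import FermionOperator, jordan_wigner, bravyi_kitaev

    hop = FermionOperator("0^ 15") + FermionOperator("15^ 0")

    for name, encode in (("Jordan-Wigner", jordan_wigner),
                         ("Bravyi-Kitaev", bravyi_kitaev)):
        qubit_op = encode(hop)
        weight = max(len(term) for term in qubit_op.terms)
        print(f"{name}: heaviest Pauli string acts on {weight} qubits")
    ```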

  12. Application of CT-PSF-based computer-simulated lung nodules for evaluating the accuracy of computer-aided volumetry.

    Science.gov (United States)

    Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji

    2012-07-01

    With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies for determining the accuracy of volumetry software have been performed using phantoms with artificial nodules. These phantom studies are limited, however, in their ability to reproduce nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement obtained between computer-simulated nodules and phantom nodules regarding the volume measurements. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was revealed to be accurate within an error of 20% for nodules >5 mm when the difference between nodule density and background (lung) CT value was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We concluded that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
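
    The essence of PSF-based nodule simulation is convolving an ideal object with the scanner's point spread function before measuring it. The miniature below approximates the measured CT PSF by a Gaussian and applies a half-contrast threshold for volumetry; all numbers are toy values, not the paper's.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Voxelize an ideal sphere, blur it with a PSF (Gaussian stand-in for a
    # measured CT PSF), then measure volume by thresholding. Toy numbers.

    N, voxel = 64, 0.5                      # grid size, voxel pitch [mm]
    radius = 3.0                            # nodule radius [mm]
    ax = (np.arange(N) - N / 2) * voxel
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    nodule = ((x**2 + y**2 + z**2) <= radius**2) * 500.0   # HU contrast

    psf_sigma_mm = 0.8                      # assumed PSF width
    blurred = gaussian_filter(nodule, sigma=psf_sigma_mm / voxel)

    true_vol = (4.0 / 3.0) * np.pi * radius**3
    meas_vol = (blurred > 250.0).sum() * voxel**3   # half-contrast threshold
    print(f"true {true_vol:.1f} mm^3, measured {meas_vol:.1f} mm^3, "
          f"error {100 * (meas_vol - true_vol) / true_vol:+.1f}%")
    ```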

  13. A simulation study of the reaction of human heart to biphasic electrical shocks

    Directory of Open Access Journals (Sweden)

    Gunnar Seemann

    2004-06-01

    Background: This article presents a study which examines the effects of biphasic electrical shocks on human ventricular tissue. The effects of this type of shock are not yet fully understood. Animal experiments showed the superiority of biphasic shocks over monophasic ones in defibrillation. A mathematical computer simulation can increase the knowledge of human heart behavior. Methods: The research presented in this article was done with different models representing a three-dimensional wedge of ventricular myocardium. The electrophysiology was described with the Priebe-Beuckelmann model. The realistic fiber twist, which is specific to human myocardium, was included. Planar electrodes were placed at the ends of the longest side of the virtual cardiac wedge, in a bath medium. They were sources of electrical shocks, which varied in magnitude from 0.1 to 5 V. In a second arrangement, ring electrodes were placed directly on the myocardium to give a better view of secondary electrical sources. The electrical reaction of the tissue was generated with a bidomain model. Results: The reaction of the tissue to the electrical shock was specific to the initially imposed characteristics. Depolarization appeared in the first 5 ms in different locations. A further study of the cardiac tissue behavior revealed which features influence the response of the considered muscle. It was shown that the time needed by the tissue to be totally depolarized is much shorter when a biphasic shock is applied. Each simulation ended only after complete repolarization was achieved. This created the possibility of gathering information from all states corresponding to one cycle of the cardiac rhythm. Conclusions: The differences between the reaction of the homogeneous tissue and a tissue which contains cleavage planes reveal important aspects of the superiority of biphasic pulses. ...

  14. Quality assurance for computed-tomography simulators and the computed-tomography-simulation process: Report of the AAPM Radiation Therapy Committee Task Group No. 66

    International Nuclear Information System (INIS)

    Mutic, Sasa; Palta, Jatinder R.; Butker, Elizabeth K.; Das, Indra J.; Huq, M. Saiful; Loo, Leh-Nien Dick; Salter, Bill J.; McCollough, Cynthia H.; Van Dyk, Jacob

    2003-01-01

    This document presents recommendations of the American Association of Physicists in Medicine (AAPM) for quality assurance of computed-tomography- (CT) simulators and CT-simulation process. This report was prepared by Task Group No. 66 of the AAPM Radiation Therapy Committee. It was approved by the Radiation Therapy Committee and by the AAPM Science Council

  15. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hpson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  16. Inquiry-Based Whole-Class Teaching with Computer Simulations in Physics

    NARCIS (Netherlands)

    Rutten, N.P.G.; van der Veen, Jan T.; van Joolingen, Wouter

    2015-01-01

    In this study we investigated the pedagogical context of whole-class teaching with computer simulations. We examined relations between the attitudes and learning goals of teachers and their students regarding the use of simulations in whole-class teaching, and how teachers implement these

  17. Study on computer-aided simulation procedure for multicomponent separating cascade

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro

    1982-11-01

    The present report reviews the author's study on a computer-aided simulation procedure for a multicomponent separating cascade. In conclusion, two very powerful simulation procedures have been developed for cascades composed of separating elements whose separation factors are very large. They are applicable in cases where interstage flow rates are input variables for the calculation and stage separation factors are given either as constants or as functions of the compositions of the up and down streams. As an application of the new procedure, a computer-aided simulation study has been performed for hydrogen isotope separating cascades by the porous membrane method. A cascade system configuration is developed and pertinent design specifications are determined in an example case of the feed conditions and separation requirements. (author)
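
    Cascade calculations of this kind track the isotopic abundance ratio stage by stage. A generic sketch (not the author's simulation procedure) for the heads stream of a cascade with a large assumed stage separation factor:

    ```python
    # Stage-by-stage abundance-ratio bookkeeping for the enriching section
    # of a simple cascade: R_heads = alpha * R_feed at every stage. Generic
    # cascade arithmetic with assumed numbers, not the author's procedure.

    alpha = 1.8          # overall stage separation factor (assumed large)
    x = 0.00015          # feed mole fraction of the desired isotope (assumed)

    for stage in range(1, 11):
        R = x / (1.0 - x)             # abundance ratio of the stage feed
        R_heads = alpha * R           # heads stream is enriched by alpha
        x = R_heads / (1.0 + R_heads)
        print(f"stage {stage:2d}: heads mole fraction = {x:.6f}")
    ```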

  18. Computer-simulated images of icosahedral, pentagonal and decagonal clusters of atoms

    International Nuclear Information System (INIS)

    Peng JuLin; Bursill, L.A.

    1989-01-01

    The aim of this work was to assess, by computer simulation, the sensitivity of high-resolution electron microscopy (HREM) images for a set of icosahedral and decagonal clusters containing 50-400 atoms. An experimental study of both crystalline and quasi-crystalline alloys of Al(Si)Mn is presented, in which carefully chosen electron-optical conditions were established by computer simulation and then used to obtain high-quality images. It was concluded that while there is a very significant degree of model sensitivity available, direct inversion from image to structure is not a realistic possibility. A reasonable procedure would be to record experimental images of known complex icosahedral alloys in a crystalline phase, then use the computer simulations to identify fingerprint imaging conditions whereby certain structural elements could be identified in images of quasi-crystalline or amorphous specimens. 27 refs., 12 figs., 1 tab

  19. Using Computer Simulations in Chemistry Problem Solving

    Science.gov (United States)

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group research design, equalized by paired groups (n_Exp = n_Ctrl = 78), was used. The students had no previous experience of chemical practical work. Student…

  20. INTRA- AND INTER-OBSERVER RELIABILITY IN SELECTION OF THE HEART RATE DEFLECTION POINT DURING INCREMENTAL EXERCISE: COMPARISON TO A COMPUTER-GENERATED DEFLECTION POINT

    Directory of Open Access Journals (Sweden)

    Bridget A. Duoos

    2002-12-01

    This study was designed to (1) determine the relative frequency of occurrence of a heart rate deflection point (HRDP), when compared to a linear relationship, during progressive exercise, (2) measure the reproducibility of a visual assessment of the HRDP, both within and between observers, and (3) compare visual and computer-assessed deflection points. Subjects consisted of 73 competitive male cyclists with mean age of 31.4 ± 6.3 years, mean height 178.3 ± 4.8 cm and weight 74.0 ± 4.4 kg. Tests were conducted on an electrically-braked cycle ergometer beginning at 25 watts and progressing 25 watts per minute to fatigue. Heart rates were recorded during the last 10 seconds of each stage and at fatigue. Scatter plots of heart rate versus watts were computer-generated and given to 3 observers on two different occasions. A computer program was developed to assess whether the data points were best represented by a single line or two lines; the HRDP represented the intersection of the two lines. Results of this study showed that (1) computer-assessed HRDP showed that 44 of 73 subjects (60.3%) had scatter plots best represented by a straight line, with no HRDP; (2) in those subjects having HRDP, all 3 observers showed significant differences (p = 0.048, p = 0.007, p = 0.001) in the reproducibility of their HRDP selection, and differences in HRDP selection were significant for two of the three comparisons between observers (p = 0.002, p = 0.305, p = 0.0003); (3) computer-generated HRDP was significantly different from visual HRDP for 2 of 3 observers (p = 0.0016, p = 0.513, p = 0.0001). It is concluded that (1) HRDP occurs in a minority of subjects, (2) significant differences exist, both within and between observers, in the selection of HRDP, and (3) differences in agreement between visual and computer-generated HRDP indicate that, when HRDP exists, it should be computer-assessed.
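
    The computer assessment described, one line versus two lines with the HRDP at their intersection, is a two-segment least-squares fit. A sketch on synthetic heart-rate data with a known deflection at 250 W:

    ```python
    import numpy as np

    # Two-segment least-squares fit: try every admissible breakpoint, fit a
    # line to each side, keep the split with the lowest total squared error,
    # and report the intersection of the two lines as the HRDP. Synthetic
    # data with a known deflection at 250 W provide a self-check.

    rng = np.random.default_rng(5)
    watts = np.arange(25, 426, 25, dtype=float)
    hr = np.where(watts <= 250, 70 + 0.30 * watts, 115 + 0.12 * watts)
    hr += rng.normal(scale=1.0, size=watts.size)

    def fit(x, y):
        m, b = np.polyfit(x, y, 1)
        return ((y - (m * x + b)) ** 2).sum(), m, b

    best = None
    for i in range(3, watts.size - 3):       # >= 3 points per segment
        s1, m1, b1 = fit(watts[:i], hr[:i])
        s2, m2, b2 = fit(watts[i:], hr[i:])
        if best is None or s1 + s2 < best[0]:
            best = (s1 + s2, m1, b1, m2, b2)

    _, m1, b1, m2, b2 = best
    hrdp = (b2 - b1) / (m1 - m2)             # intersection of the two lines
    print(f"estimated HRDP at ~{hrdp:.0f} W (true: 250 W)")
    ```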

  1. Variation in effectiveness of a cardiac auscultation training class with a cardiology patient simulator among heart sounds and murmurs.

    Science.gov (United States)

    Kagaya, Yutaka; Tabata, Masao; Arata, Yutaro; Kameoka, Junichi; Ishii, Seiichi

    2017-08-01

    Effectiveness of simulation-based education in cardiac auscultation training is controversial, and may vary among a variety of heart sounds and murmurs. We investigated whether a single auscultation training class using a cardiology patient simulator for medical students provides the competence required for clinical clerkship, and whether students' proficiency after the training differs among heart sounds and murmurs. A total of 324 fourth-year medical students (93-117/year for 3 years) were divided into groups of 6-8 students; each group participated in a three-hour training session using a cardiology patient simulator. After a mini-lecture and facilitated training, each student took two different tests. In the first test, they tried to identify three sounds of Category A (non-split, respiratory split, and abnormally wide split S2s) in random order, after being informed that they were from Category A. They then did the same with sounds of Category B (S3, S4, and S3+S4) and Category C (four heart murmurs). In the second test, they tried to identify only one from each of the three categories in random order without any category information. The overall accuracy rate declined from 80.4% in the first test to 62.0% in the second test (p < 0.001) […] auscultation training. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  2. Adding computationally efficient realism to Monte Carlo turbulence simulation

    Science.gov (United States)

    Campbell, C. W.

    1985-01-01

    Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra so that systems of stable, explicit difference equations can be used to generate the turbulence.
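
    The rational-approximation idea is easy to demonstrate: a first-order (rational) Dryden-style gust spectrum corresponds to a stable, explicit difference equation driven by white noise. A minimal single-axis sketch follows; the airspeed, length scale, and intensity values are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def dryden_gust(n_steps, dt, V=50.0, L=200.0, sigma=1.5, seed=0):
        """Generate a longitudinal gust series from the stable, explicit
        recursion u[k+1] = a*u[k] + b*w[k], the discrete equivalent of a
        first-order (rational) Dryden-style spectrum.
        V: airspeed [m/s], L: turbulence length scale [m],
        sigma: gust intensity [m/s] -- all illustrative values.
        """
        rng = np.random.default_rng(seed)
        a = 1.0 - V * dt / L                   # |a| < 1 keeps the filter stable
        b = sigma * np.sqrt(2.0 * V * dt / L)  # stationary variance ~ sigma^2 for small dt
        u = np.zeros(n_steps)
        for k in range(n_steps - 1):
            u[k + 1] = a * u[k] + b * rng.standard_normal()
        return u

    gust = dryden_gust(10_000, dt=0.01)        # 100 s of gusts at 100 Hz
    ```

    The paper's three-dimensional extension would generate several such series coupled through rational approximations to the cross-spectra; the single-axis filter above shows only the core mechanism.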

  3. Trends in Social Science: The Impact of Computational and Simulative Models

    Science.gov (United States)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  4. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are that solution procedures and algorithms are developed immediately after problem formulations are presented, and that numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  5. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large-scale nuclear power plant poses a difficult optimization problem requiring considerable computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low-cost computing and, when combined with graphical input/output, provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas Cooled reactor. Particular emphasis is placed on the use of computer graphics for information display, parameter and control system optimization, and techniques for using graphical input to define and/or modify the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large-scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation, leaving him free to concentrate on the more creative aspects of his work. (author)

  6. Computational simulation of the biomass gasification process in a fluidized bed reactor

    International Nuclear Information System (INIS)

    Rojas Mazaira, Leorlen Y.; Gamez Rodriguez, Abel; Andrade Gregori, Maria Dolores; Armas Cardona, Raul

    2009-01-01

    In an agro-industrial country such as Cuba, many crop residues, such as rice husks and sugarcane bagasse, are produced, in addition to the forest residues from wooded areas. Gasification technology is an interesting application for all this biomass because of its high efficiency and positive environmental impact. Computer simulation is a useful tool in the study of gasifier operating parameters, because it reduces the number of experiments to be performed and the cost of the research. The work emphasizes the importance of applying computer simulation to predict the hydrodynamic behavior of the fluidized bed and the biomass combustion process for different residues and different operating conditions. A CFD model is proposed for the simulation of the combustion process in a fluidized-bed biomass gasifier; the hydrodynamic parameters of the multiphase flow are characterized through a computer simulator that allows the reactor geometry to be configured and varied, and the influence of magnitudes such as velocity, sand particle diameter, and equivalence ratio is characterized. Experimental results in cylindrical channels are presented to complete the study of the 2D computer simulation. (author)

  7. Computer simulation of hopper flow

    International Nuclear Information System (INIS)

    Potapov, A.V.; Campbell, C.S.

    1996-01-01

    This paper describes two-dimensional computer simulations of granular flow in plane hoppers. The simulations can reproduce an experimentally observed asymmetric unsteadiness for monodispersed particle sizes, but could also eliminate it by adding a small amount of polydispersity. This appears to be a result of the strong packings that may be formed by monodispersed particles and is thus a noncontinuum effect. The internal stress state was also sampled, which, among other things, allows an evaluation of common assumptions made in granular material models. These showed that the internal friction coefficient is far from constant, in contradiction to common models based on plasticity theory, which assume that the material is always at the point of imminent yield. Furthermore, it is demonstrated that rapid granular flow theory, another common modeling technique, is inapplicable to this problem even near the exit where the flow is moving fastest. copyright 1996 American Institute of Physics

  8. Computer simulation of the NASA water vapor electrolysis reactor

    Science.gov (United States)

    Bloom, A. M.

    1974-01-01

    The water vapor electrolysis (WVE) reactor is a spacecraft waste reclamation system for extended-mission manned spacecraft. The WVE reactor's raw material is water, its product oxygen. A computer simulation of the WVE operational processes provided the data required for an optimal design of the WVE unit. The simulation process was implemented with the aid of a FORTRAN IV routine.

  9. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist.

    Directory of Open Access Journals (Sweden)

    Brian Drawert

    2016-12-01

    Full Text Available We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy-to-use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.
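
    Discrete stochastic engines of the kind referenced here are, at their core, Gillespie-type stochastic simulation algorithms (SSA). A minimal sketch of that algorithm is shown below; the reaction network and rate constants are illustrative, and this is plain Python, not the StochSS API.

    ```python
    import numpy as np

    def gillespie_ssa(x0, stoich, rates, t_end, seed=0):
        """Minimal Gillespie SSA for a well-mixed biochemical system.
        x0     : initial copy numbers, shape (n_species,)
        stoich : state-change vectors, shape (n_reactions, n_species)
        rates  : function x -> propensities, shape (n_reactions,)
        """
        rng = np.random.default_rng(seed)
        t, x = 0.0, np.array(x0, dtype=float)
        times, states = [t], [x.copy()]
        while t < t_end:
            a = rates(x)
            a0 = a.sum()
            if a0 <= 0.0:                        # no reaction can fire
                break
            t += rng.exponential(1.0 / a0)       # time to next reaction
            j = rng.choice(len(a), p=a / a0)     # which reaction fires
            x += stoich[j]
            times.append(t); states.append(x.copy())
        return np.array(times), np.array(states)

    # Example: reversible isomerization A <-> B, illustrative rate constants.
    stoich = np.array([[-1, 1], [1, -1]])
    times, states = gillespie_ssa([100, 0], stoich,
                                  lambda x: np.array([0.5 * x[0], 0.3 * x[1]]),
                                  t_end=10.0)
    ```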

  10. Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter V.; Tryggvason, Tryggvi

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution will be introduced for improvement of the predictions of both the energy consumption and the indoor environment. The building energy performance...

  11. Computer simulation of a 3-phase induction motor

    International Nuclear Information System (INIS)

    Memon, N.A.; Unsworth, P.J.

    2004-01-01

    A computer simulation of a 3-phase squirrel-cage induction motor, written in Microsoft QBASIC, is presented for understanding trends and various operational modes of an induction motor. A thyristor-fed, phase-controlled induction motor (three-wire) model has been simulated, in which voltage is applied to the motor stator winding through back-to-back connected thyristors acting as controlled switches in series with the stator. The simulated induction motor system opens up a wide range of investigation/analysis options for research and development work in the field. Key features of the simulation are highlighted to develop a better understanding of the work done. A complete study of an induction motor is presented, covering starting modes in terms of the voltage/current and torque/speed characteristics, together with their graphical representation. Close agreement of the simulation results with the expected outcomes encourages users to go ahead with various hardware development projects based on the study through simulation. (author)

  12. 20170312 - Computer Simulation of Developmental ...

    Science.gov (United States)

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms, with human cells configured in nascent tissue architectures within native microphysiological environments, yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  13. Advances in Computational Fluid-Structure Interaction and Flow Simulation Conference

    CERN Document Server

    Takizawa, Kenji

    2016-01-01

    This contributed volume celebrates the work of Tayfun E. Tezduyar on the occasion of his 60th birthday. The articles it contains were born out of the Advances in Computational Fluid-Structure Interaction and Flow Simulation (AFSI 2014) conference, also dedicated to Prof. Tezduyar and held at Waseda University in Tokyo, Japan on March 19-21, 2014. The contributing authors represent a group of international experts in the field who discuss recent trends and new directions in computational fluid dynamics (CFD) and fluid-structure interaction (FSI). Organized into seven distinct parts arranged by thematic topics, the papers included cover basic methods and applications of CFD, flows with moving boundaries and interfaces, phase-field modeling, computer science and high-performance computing (HPC) aspects of flow simulation, mathematical methods, biomedical applications, and FSI. Researchers, practitioners, and advanced graduate students working on CFD, FSI, and related topics will find this collection to be a defi...

  14. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met

  15. Mathematical and computational modeling and simulation fundamentals and case studies

    CERN Document Server

    Moeller, Dietmar P F

    2004-01-01

    Mathematical and Computational Modeling and Simulation - a highly multi-disciplinary field with ubiquitous applications in science and engineering - is one of the key enabling technologies of the 21st century. This book introduces to the use of Mathematical and Computational Modeling and Simulation in order to develop an understanding of the solution characteristics of a broad class of real-world problems. The relevant basic and advanced methodologies are explained in detail, with special emphasis on ill-defined problems. Some 15 simulation systems are presented on the language and the logical level. Moreover, the reader can accumulate experience by studying a wide variety of case studies. The latter are briefly described within the book but their full versions as well as some simulation software demos are available on the Web. The book can be used for University courses of different level as well as for self-study. Advanced sections are marked and can be skipped in a first reading or in undergraduate courses...

  16. The Fraunhofer Quantum Computing Portal - www.qc.fraunhofer.de: A web-based simulator of quantum computing processes

    OpenAIRE

    Rosé, H.; Asselmeyer-Maluga, T.; Kolbe, M.; Niehörster, F.; Schramm, A.

    2004-01-01

    Fraunhofer FIRST develops a computing service and collaborative workspace providing a convenient tool for the simulation and investigation of quantum algorithms. To extend the twenty-qubit limit of workstation-based simulations to the next qubit decade, we provide a dedicated high-memory Linux cluster with a fast Myrinet interconnection network, together with an adapted parallel simulator engine. This simulation service, supplemented by a collaborative workspace, is usable everywhere via web interfa...

  17. Computer simulation of high resolution transmission electron micrographs: theory and analysis

    International Nuclear Information System (INIS)

    Kilaas, R.

    1985-03-01

    Computer simulation of electron micrographs is an invaluable aid in their proper interpretation and in defining optimum conditions for obtaining images experimentally. Since modern instruments are capable of atomic resolution, simulation techniques employing high precision are required. This thesis makes contributions to four specific areas of this field. First, the validity of a new method for simulating high resolution electron microscope images has been critically examined. Second, three different methods for computing scattering amplitudes in High Resolution Transmission Electron Microscopy (HRTEM) have been investigated as to their ability to include upper Laue layer (ULL) interaction. Third, a new method for computing scattering amplitudes in high resolution transmission electron microscopy has been examined. Fourth, the effect of a surface layer of amorphous silicon dioxide on images of crystalline silicon has been investigated for a range of crystal thicknesses varying from zero to 2 1/2 times that of the surface layer

  18. Computational Physics Simulation of Classical and Quantum Systems

    CERN Document Server

    Scherer, Philipp O. J

    2010-01-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills.

  19. An FPGA computing demo core for space charge simulation

    International Nuclear Information System (INIS)

    Wu, Jinyuan; Huang, Yifei

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
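
    The table-lookup trick is straightforward to emulate in software: split r² into mantissa and exponent, read m^(-3/2) from a table addressed by the leading mantissa bits, and rescale. The sketch below uses floating point where the FPGA core uses fixed-point shifts, and the table size is an illustrative choice.

    ```python
    import numpy as np

    BITS = 10                                   # table addressed by 10 mantissa bits
    _M = 1.0 + np.arange(2 ** BITS) / 2 ** BITS
    LUT = _M ** -1.5                            # f(m) = m**(-3/2) for m in [1, 2)

    def coulomb_force(r_vec, q1q2=1.0):
        """Approximate F = q1q2 * r_vec / |r_vec|**3 via the look-up table.
        With r2 = m * 2**e and m in [1, 2): r2**-1.5 = LUT[m] * 2**(-1.5*e).
        (Float emulation; the FPGA handles the exponent with shifts.)
        """
        r2 = float(np.dot(r_vec, r_vec))
        e = np.floor(np.log2(r2))               # exponent: a bit-scan in hardware
        m = r2 / 2.0 ** e                       # mantissa in [1, 2)
        idx = int((m - 1.0) * 2 ** BITS)        # leading mantissa bits as address
        inv_r3 = LUT[idx] * 2.0 ** (-1.5 * e)
        return q1q2 * np.asarray(r_vec) * inv_r3

    exact = lambda r: np.asarray(r) / np.dot(r, r) ** 1.5
    print(coulomb_force([1.0, 2.0, 2.0]), exact([1.0, 2.0, 2.0]))
    ```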

  1. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  2. Computer simulation system of neural PID control on nuclear reactor

    International Nuclear Information System (INIS)

    Chen Yuzhong; Yang Kaijun; Shen Yongping

    2001-01-01

    A neural network proportional-integral-derivative (PID) controller for a nuclear reactor is designed, and the control process is simulated by computer. The simulation results show that the neural network PID controller can automatically adjust its parameters to an ideal state, and that good control results can be obtained in the reactor control process.
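
    One common realization of such a controller is a single neuron whose inputs are the proportional, integral, and derivative error terms and whose weights, the effective PID gains, adapt online. The sketch below is this generic scheme, not the paper's actual design; the toy plant, learning rate, and gain bounds are all illustrative assumptions.

    ```python
    import numpy as np

    def single_neuron_pid(setpoint, y0, plant, steps=200, dt=1.0, eta=0.01):
        """Adaptive 'neural' PID sketch: the P, I and D error terms feed a
        single neuron whose weights (the effective gains) are updated
        online by a simple gradient-like rule.
        """
        w = np.array([0.3, 0.3, 0.3])                 # initial P, I, D gains
        e_prev, e_int = 0.0, 0.0
        y, history = y0, []
        for _ in range(steps):
            e = setpoint - y
            e_int += e * dt
            x = np.array([e, e_int, (e - e_prev) / dt])  # P, I, D inputs
            u = float(np.dot(w, x))                      # control action
            w = np.clip(w + eta * e * x, 0.0, 5.0)       # online gain adaptation
            y = plant(y, u)                              # advance plant one step
            e_prev = e
            history.append(y)
        return np.array(history)

    # Toy first-order 'reactor power' plant with time constant tau = 5 steps.
    resp = single_neuron_pid(1.0, 0.0, lambda y, u: y + (u - y) / 5.0)
    ```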

  3. Computer simulation of ion recombination in irradiated nonpolar liquids

    International Nuclear Information System (INIS)

    Bartczak, W.M.; Hummel, A.

    1986-01-01

    A review of the results of computer simulations of the diffusion-controlled recombination of ions is presented. Ions generated in clusters of two and three pairs of oppositely charged ions were considered. The recombination kinetics and the ion escape probability at infinite time, with and without an external electric field, were computed. These results are compared with calculations based on the single-pair theory. (author)
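
    The single-pair (Onsager) picture behind such simulations is easy to sketch: one geminate pair performs a random walk in its mutual Coulomb field and either recombines at a reaction radius or escapes. Units, parameter values, and the fixed time step below are illustrative (production codes use adaptive steps near contact); Onsager's analytic escape probability exp(-rc/r0) provides a consistency check.

    ```python
    import numpy as np

    def geminate_survival(r0=10.0, rc=29.0, R=1.0, D=1.0, dt=0.01,
                          n_pairs=5000, n_steps=20000, seed=0):
        """Brownian dynamics of geminate ion pairs in their mutual Coulomb
        field (nominal units of nm; rc is the Onsager radius, where the
        Coulomb energy equals kT). Pairs reaching separation R recombine.
        The surviving fraction approaches the escape probability as
        n_steps grows. All values illustrative.
        """
        rng = np.random.default_rng(seed)
        pos = np.zeros((n_pairs, 3)); pos[:, 2] = r0     # initial separation r0
        alive = np.ones(n_pairs, dtype=bool)
        for _ in range(n_steps):
            if not alive.any():
                break
            r = np.linalg.norm(pos[alive], axis=1, keepdims=True)
            drift = -D * rc / r**3 * pos[alive] * dt     # -D*rc/r^2 along r_hat
            noise = rng.normal(0.0, np.sqrt(2.0 * D * dt), drift.shape)
            pos[alive] += drift + noise
            idx = np.flatnonzero(alive)
            alive[idx[np.linalg.norm(pos[idx], axis=1) <= R]] = False
        return alive.mean()

    # Onsager's infinite-time escape probability for comparison:
    print(geminate_survival(), np.exp(-29.0 / 10.0))
    ```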

  4. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  5. Computer Simulation of Multidimensional Archaeological Artefacts

    Directory of Open Access Journals (Sweden)

    Vera Moitinho de Almeida

    2013-11-01

    Full Text Available The main purpose of this ongoing research is to understand the possible function(s) of archaeological artefacts through Reverse Engineering processes. In addition, we intend to provide new data, as well as possible explanations of the archaeological record according to what is expected about social activities and working processes, by simulating the potentialities of such actions in terms of input-output relationships. Our project focuses on the Neolithic lakeside site of La Draga (Banyoles, Catalonia). In this presentation we will begin by providing a clear overview of the major guidelines used to capture and process 3D digital data of several wooden artefacts. Then, we shall present the use of semi-automated relevant feature extractions. Finally, we intend to share preliminary computer simulation issues.

  6. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
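
    The "time dependence between time steps" is the crux: controller state must be carried from one power flow to the next, which forces the millions of solves to run sequentially. A toy illustration of the QSTS loop is given below; the two-bus 'feeder' and the 30-second regulator delay are illustrative stand-ins, not a real distribution solver.

    ```python
    import numpy as np

    def qsts_run(load_profile, solve_power_flow, dt=1.0):
        """Skeleton of a QSTS simulation: a sequential power flow in which
        controller state (here a regulator tap with a time delay) carries
        over between steps, so the solves cannot simply run in parallel.
        `solve_power_flow` is a stand-in for a real feeder solver.
        """
        tap, timer = 0, 0.0                  # discrete controller state
        voltages = []
        for load in load_profile:            # a year at 1 s = 31,536,000 steps
            v = solve_power_flow(load, tap)
            if v < 0.99:                     # sustained low voltage moves the tap
                timer += dt
                if timer >= 30.0:            # 30 s regulator delay (illustrative)
                    tap, timer = min(tap + 1, 16), 0.0
            else:
                timer = 0.0
            voltages.append(v)
        return np.array(voltages)

    # Toy two-bus 'feeder': voltage sags with load; each tap adds 0.00625 pu.
    profile = 1.0 + 0.3 * np.sin(np.linspace(0.0, 2.0 * np.pi, 86_400))  # one day
    v = qsts_run(profile, lambda load, tap: 1.0 - 0.04 * load + 0.00625 * tap)
    ```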

  7. Detection of intra-cardiac thrombi and congestive heart failure in cats using computed tomographic angiography.

    Science.gov (United States)

    Vititoe, Kyle P; Fries, Ryan C; Joslyn, Stephen; Selmic, Laura E; Howes, Mark; Vitt, Jordan P; O'Brien, Robert T

    2018-04-16

    Arterial thromboembolism is a life-threatening condition in cats, most commonly secondary to cardiac disease. Echocardiography is the reference standard to evaluate for the presence of a thrombus. In humans, computed tomographic (CT) angiography is becoming widely used to detect left atrial thrombi, precluding the use of sedation. The purpose of this prospective, controlled, methods-comparison pilot study was threefold: (1) to describe a new CT angiography protocol used in awake cats with cardiac disease and congestive heart failure; (2) to determine the accuracy of continuous and dynamic acquisition CT angiography in identifying and characterizing cardiac thrombi versus spontaneous echocardiographic contrast, using transthoracic echocardiography as the reference standard; and (3) to identify known negative prognostic factors and thoracic comorbidities that CT angiography may reveal, complementing or superseding the echocardiographic examination. Fourteen cats with heart disease were recruited: 7 with thrombi and 7 with spontaneous echocardiographic contrast. Echocardiography and awake CT angiography were performed using a microdose of contrast. Six of 7 thrombi were identified on CT angiography as filling defects by at least one reviewer, within the left auricle (n = 6) and right heart (n = 1). The highest sensitivity (71.4%) was in the continuous phase and the highest specificity (85.7%) was in the dynamic studies, with fair to moderate interobserver agreement (0.38 and 0.44). CT angiography identified prognostic cardiac information (left atrial enlargement, congestive heart failure, arterial thromboembolism) and comorbidities (suspected idiopathic pulmonary fibrosis, asthma). This study indicates CT angiography can readily identify cardiac thrombi, important prognostic information, and comorbidities, and can be safely performed in cats with cardiac disease and congestive heart failure. © 2018 American College of Veterinary Radiology.

  8. Adult congenital heart disease imaging with second-generation dual-source computed tomography: initial experiences and findings.

    Science.gov (United States)

    Ghoshhajra, Brian B; Sidhu, Manavjot S; El-Sherief, Ahmed; Rojas, Carlos; Yeh, Doreen Defaria; Engel, Leif-Christopher; Liberthson, Richard; Abbara, Suhny; Bhatt, Ami

    2012-01-01

    Adult congenital heart disease patients present a unique challenge to the cardiac imager. Patients may present with both acute and chronic manifestations of their complex congenital heart disease and also require surveillance for sequelae of their medical and surgical interventions. Multimodality imaging is often required to clarify their anatomy and physiology. Radiation dose is of particular concern in these patients with lifelong imaging needs for their chronic disease. The second-generation dual-source scanner is a recently available advanced clinical cardiac computed tomography (CT) scanner. It offers a combination of the high-spatial resolution of modern CT, the high-temporal resolution of dual-source technology, and the wide z-axis coverage of modern cone-beam geometry CT scanners. These advances in technology allow novel protocols that markedly reduce scan time, significantly reduce radiation exposure, and expand the physiologic imaging capabilities of cardiac CT. We present a case series of complicated adult congenital heart disease patients imaged by the second-generation dual-source CT scanner with extremely low-radiation doses and excellent image quality. © 2012 Wiley Periodicals, Inc.

  9. Computer simulation of complexity in plasmas

    International Nuclear Information System (INIS)

    Hayashi, Takaya; Sato, Tetsuya

    1998-01-01

    By making a comprehensive comparative study of many self-organizing phenomena occurring in magnetohydrodynamics and kinetic plasmas, we came up with a hypothetical grand view of self-organization. This assertion is confirmed by a recent computer simulation for a broader science field, specifically, the structure formation of short polymer chains, where the nature of the interaction is completely different from that of plasmas. It is found that the formation of the global orientation order proceeds stepwise. (author)

  10. The value of flat-detector computed tomography during catheterisation of congenital heart disease

    Energy Technology Data Exchange (ETDEWEB)

    Gloeckler, Martin [University Hospital Erlangen, Department of Pediatric Cardiology, Erlangen (Germany); Friedrich-Alexander University Erlangen-Nuernberg, Department of Pediatric Cardiology, Erlangen (Germany); Koch, Andreas; Greim, Verena; Shabaiek, Amira; Dittrich, Sven [University Hospital Erlangen, Department of Pediatric Cardiology, Erlangen (Germany); Rueffer, Andre; Cesnjevar, Robert [University Hospital Erlangen, Department of Congenital Heart Surgery, Erlangen (Germany); Achenbach, Stephan [University Hospital Erlangen, Department of Cardiology, Erlangen (Germany)

    2011-12-15

    To analyse the diagnostic utility of flat-detector computed tomography imaging (FD-CT) in patients with congenital heart disease, including the value of image fusion to overlay three-dimensional (3D) reconstructions on fluoroscopic images during catheter-based interventions. We retrospectively analysed 62 consecutive paediatric patients in whom FD-CT was used during catheterisation of congenital heart disease. Expert operators rated the clinical value of FD-CT over conventional fluoroscopic imaging. Added radiation exposure and contrast medium volume were evaluated. During a 12-month period, FD-CT was performed in 62 out of 303 cardiac catheterisations. Median patient age was 3.5 years. In 32/62 cases, FD-CT was used for diagnostic purposes; in 30/62 cases it was used in the context of interventions. Diagnostic utility was never rated as "misleading". It was classified as "not useful" in six cases (9.7%), "useful" in 18 cases (29.0%), "very useful" in 37 cases (59.7%) and "essential" in one case (1.6%). The median added dose-area product was 111.0 μGy·m², and the required additional quantity of contrast medium was 1.6 ml/kg. FD-CT provides useful diagnostic information in most of the patients investigated for congenital heart disease. The added radiation exposure and contrast medium volume are reasonable. (orig.)

  11. Computational physics. Simulation of classical and quantum systems

    Energy Technology Data Exchange (ETDEWEB)

    Scherer, Philipp O.J. [TU Muenchen (Germany). Physikdepartment T38

    2010-07-01

    This book encapsulates the coverage for a two-semester course in computational physics. The first part introduces the basic numerical methods while omitting mathematical proofs but demonstrating the algorithms by way of numerous computer experiments. The second part specializes in simulation of classical and quantum systems with instructive examples spanning many fields in physics, from a classical rotor to a quantum bit. All program examples are realized as Java applets ready to run in your browser and do not require any programming skills. (orig.)

  12. Computational Simulation of a Water-Cooled Heat Pump

    Science.gov (United States)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  13. Detailed computer simulation of damage accumulation in ion irradiated crystalline targets

    International Nuclear Information System (INIS)

    Jaraiz, M.; Arias, J.; Bailon, L.A.; Barbolla, J.J.

    1993-01-01

    A new version of the collision cascade simulation program MARLOWE is presented. This version incorporates damage build-up in full detail, i.e., every interstitial and vacancy generated is retained throughout the simulation and can become a target in subsequent collisions, unless they recombine at some stage during the implantation. Vacancy-interstitial recombination is simulated by annihilating those pairs whose separation is less than a specified recombination radius. Also, stopped atoms are moved to their nearest lattice interstitial site if it is not occupied. In this way, a fully physical simulation can be carried out in detail, thus preserving a valuable feature of MARLOWE. To overcome the prohibitive computation time and memory required, a scheme has been adopted to handle the data generated as the simulation proceeds in a suitable way. The model is described. Examples of memory and computation time requirements, and of damage accumulation effects on channelling in ion implantation, are also presented. (Author)
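
    The recombination rule itself, annihilating vacancy-interstitial pairs closer than the recombination radius, can be sketched in a few lines. The greedy nearest-neighbour matching and the radius value below are illustrative choices, not MARLOWE's actual implementation.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def recombine(vacancies, interstitials, r_rec=0.8):
        """Annihilate vacancy-interstitial pairs separated by less than the
        recombination radius r_rec (lattice units; value illustrative).
        """
        tree = cKDTree(interstitials)
        used = set()
        surviving_vac = []
        for v in vacancies:
            # interstitials within r_rec, nearest first
            candidates = sorted(tree.query_ball_point(v, r_rec),
                                key=lambda j: np.linalg.norm(interstitials[j] - v))
            for j in candidates:
                if j not in used:
                    used.add(j)              # this Frenkel pair annihilates
                    break
            else:
                surviving_vac.append(v)      # no free partner: vacancy survives
        surviving_int = [p for j, p in enumerate(interstitials) if j not in used]
        return np.array(surviving_vac), np.array(surviving_int)

    rng = np.random.default_rng(0)
    v_left, i_left = recombine(rng.uniform(0, 10, (50, 3)),
                               rng.uniform(0, 10, (50, 3)))
    ```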

  14. PUMN: part I of the WINERY radiation damage computer simulation system

    International Nuclear Information System (INIS)

    Kuspa, J.P.; Edwards, D.R.; Tsoulfanidis, N.

    1976-01-01

    Results of computer work to simulate the response of crystalline materials to radiation are presented. To organize this and future work into a long-range program of investigation, the WINERY Radiation Damage Computer Simulation System is proposed. The WINERY system is designed to solve the entire radiation damage problem, from the incident radiation to the property changes which occur in the material, using a set of interrelated computer programs. One portion of the system, the PUMN program, has been used to obtain important radiation damage results with Fe3Al crystal. PUMN simulates the response of the atoms in a crystal to a knock-on atom. It yields the damage configuration of the crystal by considering the dynamic interaction of all the atoms of the computational cell, up to 1000 atoms. The PUMN program provides the WINERY system with results for the number of displacements, N_d, due to knock-on atoms with various energies. The values of N_d for Fe3Al were obtained at two different energies, 100 and 500 eV, for a variety of initial directions. These values are to be used to form a table of results for use in WINERY

  15. The impact of central lung distance, maximal heart distance, and radiation technique on the volumetric dose of the lung and heart for intact breast radiation

    International Nuclear Information System (INIS)

    Kong, F.-M.; Klein, Eric E.; Bradley, Jeffrey D.; Mansur, David B.; Taylor, Marie E.; Perez, Carlos A.; Myerson, Robert J.; Harms, William B.

    2002-01-01

    Purpose: To investigate the impact of radiographic parameters and radiation technique on the volumetric dose of the lung and heart for intact breast radiation. Methods and Materials: Forty patients with both two-dimensional (2D) and computed tomographic (CT) simulations were enrolled in the study. Central lung distance (CLD), maximal heart distance (MHD), and maximal heart length (MHL) were measured under virtual simulation. Four plans were compared for each patient. Plan A used a traditional 2D tangential setup. Plan B used clinical target volume (CTV) based three-dimensional (3D) planning. Both plans C and D used a combination of a medial breast field with shallow tangents; plan D is a further modification of plan C. Results: Under the traditional tangential setup, the mean ipsilateral lung dose and volume at 20, 30, and 40 Gy correlated linearly with CLD (R = 0.85∼0.91). The mean ipsilateral lung dose (Gy) approximated 4 times the CLD value (cm), whereas the percentage volume (%) of ipsilateral lung at 20, 30, and 40 Gy was about 10 times the CLD (cm). The mean heart dose and percentage volume at 20, 30, and 40 Gy correlated with MHD (R = 0.76∼0.80) and MHL (R = 0.65∼0.75). The mean heart dose (Gy) approximated 3 times the MHD value (cm), and the percentage volume (%) of the heart at 10, 20, 30, and 40 Gy was about 6 times the MHD (cm). Radiation technique impacted lung and heart dose. The 3D tangential plan (plan B) failed to reduce the volumetric dose of lung and heart from that of the 2D plan (plan A). The medial breast techniques (plans C and D) significantly decreased the volume of lung and heart receiving high doses (30 and 40 Gy); plan D further decreased the 20 Gy volumes. With the medial breast technique, the lung and heart dose were not impacted by the original CLD and MHD/MHL. Therefore, the improvement over the tangential technique was more remarkable for patients with CLD ≥ 3.0 cm (p < 0.001). Conclusions: The CLD and MHD impact the volumetric dose of
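
    The linear fits above amount to simple rules of thumb for the traditional tangential setup; a sketch encoding them directly (illustrative only, specific to the 2D tangential technique studied here, and not clinical guidance):

    ```python
    def tangential_dose_estimates(cld_cm, mhd_cm):
        """Rules of thumb from the linear fits reported above, valid only
        for the traditional 2D tangential setup (illustrative, not
        clinical guidance).
        """
        return {
            "mean_ipsilateral_lung_dose_Gy": 4.0 * cld_cm,   # ~4 x CLD
            "lung_volume_pct_at_20_30_40Gy": 10.0 * cld_cm,  # ~10 x CLD
            "mean_heart_dose_Gy": 3.0 * mhd_cm,              # ~3 x MHD
            "heart_volume_pct_at_10_to_40Gy": 6.0 * mhd_cm,  # ~6 x MHD
        }

    print(tangential_dose_estimates(cld_cm=3.0, mhd_cm=2.0))
    ```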

  16. Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer

    Science.gov (United States)

    Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki

    2017-12-01

    This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences. Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), merely as a case study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of cases of MJO events for numerical simulations, in addition to integrating time and horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use. The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.

  17. Computer simulations of atomic collisions in solids with special emphasis on sputtering

    International Nuclear Information System (INIS)

    Andersen, H.H.

    1986-01-01

    Computer simulations of atomic collisions in solids are traditionally divided into fully interacting or molecular dynamics (MD) simulations on the one side and simulations based on the binary collision approximation (BCA) on the other. The historical development of both branches is followed, and other dichotomies, viz. between static and dynamic target models and between models using crystalline and amorphous targets, are introduced. The influence of the main input parameters, viz. interatomic potentials, surface- and bulk-binding energies, and inelasticity, is discussed before selected results are treated. Here, results for non-linear effects, clusters, fluctuations, and angular distributions are presented. The review is concluded with a discussion of the influence of computer developments on future simulations. With 392 refs.

  18. Computed tomography for imaging of pediatric congenital heart disease; Die Computertomographie bei der Bildgebung von Kindern mit kongenitalen Herzvitien

    Energy Technology Data Exchange (ETDEWEB)

    Glaser-Gallion, N.; Stinn, B.; Wildermuth, S.; Leschka, S. [Kantonsspital St.Gallen, Universitaet Zuerich, Institut fuer Radiologie, St. Gallen (Switzerland); Alkadhi, H. [Universitaetsspital Zuerich, Institut fuer Diagnostisch und Interventionelle Radiologie, Zuerich (Switzerland); Lell, M. [Universitaetsklinikum Erlangen, Institut fuer Radiologie, Erlangen (Germany); Goo, H.W. [University of Ulsan, College of Medicine, Asan Medical Center, Department of Radiology, Seoul (Korea, Republic of); Paul, J.F. [Marie Lannelongue Hospital, Radiology Unit, Plessis Robinson (France)

    2011-01-15

    Congenital heart diseases are the most common congenital abnormalities of development. In general, echocardiography and cardiac catheter angiography are considered the gold standard for the evaluation of congenital heart disease. Cardiac magnetic resonance imaging has become an important supplementary imaging modality because of its ability to provide an accurate morphological and functional evaluation. The role of cardiac computed tomography in the imaging of patients with congenital heart disease is becoming increasingly more important due to the development of low radiation dose protocols and improvements in the spatial and temporal resolution. In the preoperative depiction and follow-up after surgical repair of congenital heart diseases, cardiac computed tomography provides detailed information of the heart, the venous and arterial pulmonary circulation as well as systemic arteries. This article reviews the technical aspects of cardiac CT and the modification of examination protocols according to the expected pathology and patient age. The potentials and limitations of the various radiation dose reduction strategies are outlined. (orig.) [German original, translated:] Congenital heart defects are the most common congenital malformations. Echocardiography and catheter angiography are generally considered the gold standard for the work-up of congenital heart disease. Magnetic resonance imaging, owing to its ability to characterize heart defects morphologically and functionally, is regarded as an important complementary modality. With the increasingly dose-saving examination protocols of the latest scanner generations, together with improved temporal and spatial resolution, computed tomography is finding its way into the work-up of congenital heart defects. In preoperative planning and postoperative follow-up it provides a clear depiction of complex malformations, not only of the heart but also of the pulmonary venous and arterial...

  19. Simulation of the effect of rogue ryanodine receptors on a calcium wave in ventricular myocytes with heart failure

    International Nuclear Information System (INIS)

    Lu, Luyao; Xia, Ling; Ye, Xuesong; Cheng, Heping

    2010-01-01

    Calcium homeostasis is considered to be one of the most important factors for the contraction and relaxation of the heart muscle. However, under some pathological conditions, such as heart failure (HF), calcium homeostasis is disordered, and spontaneous waves may occur. In this study, we developed a mathematical model of formation and propagation of a calcium wave based upon a governing system of diffusion–reaction equations presented by Izu et al (2001 Biophys. J. 80 103–20) and integrated non-clustered or 'rogue' ryanodine receptors (rogue RyRs) into a two-dimensional (2D) model of ventricular myocytes isolated from failing hearts in which sarcoplasmic reticulum (SR) Ca2+ pools are partially unloaded. The model was then used to simulate the effect of rogue RyRs on initiation and propagation of the calcium wave in ventricular myocytes with HF. Our simulation results show that rogue RyRs can amplify the diastolic SR Ca2+ leak in the form of Ca2+ quarks, increase the probability of occurrence of spontaneous Ca2+ waves even with smaller SR Ca2+ stores, accelerate Ca2+ wave propagation, and hence lead to delayed afterdepolarizations (DADs) and cardiac arrhythmia in the diseased heart. This investigation suggests that incorporating rogue RyRs in the Ca2+ wave model under HF conditions provides a new view of Ca2+ dynamics that could not be mimicked by adjusting traditional parameters involved in Ca2+ release units and other ion channels, and contributes to understanding the underlying mechanism of HF

  20. Simulation of the effect of rogue ryanodine receptors on a calcium wave in ventricular myocytes with heart failure

    Science.gov (United States)

    Lu, Luyao; Xia, Ling; Ye, Xuesong; Cheng, Heping

    2010-06-01

    Calcium homeostasis is considered to be one of the most important factors for the contraction and relaxation of the heart muscle. However, under some pathological conditions, such as heart failure (HF), calcium homeostasis is disordered, and spontaneous waves may occur. In this study, we developed a mathematical model of formation and propagation of a calcium wave based upon a governing system of diffusion-reaction equations presented by Izu et al (2001 Biophys. J. 80 103-20) and integrated non-clustered or 'rogue' ryanodine receptors (rogue RyRs) into a two-dimensional (2D) model of ventricular myocytes isolated from failing hearts in which sarcoplasmic reticulum (SR) Ca2+ pools are partially unloaded. The model was then used to simulate the effect of rogue RyRs on initiation and propagation of the calcium wave in ventricular myocytes with HF. Our simulation results show that rogue RyRs can amplify the diastolic SR Ca2+ leak in the form of Ca2+ quarks, increase the probability of occurrence of spontaneous Ca2+ waves even with smaller SR Ca2+ stores, accelerate Ca2+ wave propagation, and hence lead to delayed afterdepolarizations (DADs) and cardiac arrhythmia in the diseased heart. This investigation suggests that incorporating rogue RyRs in the Ca2+ wave model under HF conditions provides a new view of Ca2+ dynamics that could not be mimicked by adjusting traditional parameters involved in Ca2+ release units and other ion channels, and contributes to understanding the underlying mechanism of HF.

  1. New Computer Simulations of Macular Neural Functioning

    Science.gov (United States)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  2. Computer simulations for the nano-scale

    International Nuclear Information System (INIS)

    Stich, I.

    2007-01-01

    A review of methods for computations at the nano-scale is presented. The paper should provide a convenient starting point into computations for the nano-scale as well as a more in-depth presentation for those already working in the field of atomic/molecular-scale modeling. The argument is divided into chapters covering the methods for description of the (i) electrons, (ii) ions, and (iii) techniques for efficient solving of the underlying equations. A fairly broad view is taken, covering the Hartree-Fock approximation, density functional techniques and quantum Monte-Carlo techniques for electrons. The customary quantum chemistry methods, such as post-Hartree-Fock techniques, are only briefly mentioned. Description of both classical and quantum ions is presented. The techniques cover Ehrenfest, Born-Oppenheimer, and Car-Parrinello dynamics. The strong and weak points, of both principal and technical nature, are analyzed. In the second part we introduce a number of applications to demonstrate the different approximations and techniques introduced in the first part. They cover a wide range of applications such as non-simple liquids, surfaces, molecule-surface interactions, applications in nanotechnology, etc. These more in-depth presentations, while certainly not exhaustive, should provide information on technical aspects of the simulations, typical parameters used, and ways of analyzing the huge amounts of data generated in these large-scale supercomputer simulations. (author)

  3. Upgrade of the computer-based information systems on USNRC simulators

    International Nuclear Information System (INIS)

    Griffin, J.I.

    1998-01-01

    In late 1995, the U.S. Nuclear Regulatory Commission (USNRC) began a project to upgrade the computer-based information systems on its BWR/6 and B&W simulators. The existing display generation hardware was very old and in need of replacement due to difficulty in obtaining spare parts and technical support. In addition, the display systems currently in use each require a SEL 32/55 computer system, which is also obsolete, running the Real Time Monitor (RTM) operating system. An upgrade of the display hardware and display generation systems not only solves the problem of obsolescence of that equipment but also allows removal of the 32/55 systems, which are used only to support the existing display generation systems. Shortly after purchase of the replacement equipment, it was learned that the vendor was no longer going to support the methodology. Instead of implementing an unsupported concept, it was decided to implement the display system upgrades using the Picasso-3 UIMS (User Interface Management System) and the purchased hardware. This paper describes the upgraded display systems for the BWR/6 and B&W simulators, including the design concept, display development, hardware requirements, the simulator interface software, and problems encountered. (author)

  4. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. This simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering map generation was revised with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe some quality and performance metrics to validate these results, with a performance of up to 55 fps achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
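
    A minimal sketch of the scatter model as described, a CT-derived scatterer map given simplified multiplicative noise and convolved with a PSF, is shown below. The log-normal noise, the separable Gaussian PSF, and all parameter values are illustrative simplifications; a real simulator would also modulate the PSF at the transducer's carrier frequency to produce speckle.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def gaussian_psf(size=15, sigma_ax=1.5, sigma_lat=3.0):
        """Separable Gaussian stand-in for an axial/lateral ultrasound PSF."""
        ax = np.arange(size) - size // 2
        gx = np.exp(-ax**2 / (2 * sigma_ax**2))
        gy = np.exp(-ax**2 / (2 * sigma_lat**2))
        return np.outer(gx, gy)

    def simulate_scatter(ct_slice, psf, noise_sigma=0.4, seed=0):
        """Scatter model sketch: derive a scatterer map from CT intensities,
        apply multiplicative (unit-median log-normal) noise, and convolve
        with the PSF to mimic the transducer response. Values illustrative.
        """
        rng = np.random.default_rng(seed)
        scatterers = ct_slice * rng.lognormal(0.0, noise_sigma, ct_slice.shape)
        return fftconvolve(scatterers, psf, mode="same")

    ct = np.random.default_rng(1).uniform(0.0, 1.0, (128, 128))  # stand-in CT slice
    img = simulate_scatter(ct, gaussian_psf())
    ```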

  5. A computer code package for electron transport Monte Carlo simulation

    International Nuclear Information System (INIS)

    Popescu, Lucretiu M.

    1999-01-01

    A computer code package was developed for solving various electron transport problems by Monte Carlo simulation. It is based on the condensed-history Monte Carlo algorithm. In order to get reliable results over wide ranges of electron energies and target atomic numbers, specific electron transport techniques were implemented, such as Moliere multiscatter angular distributions, the Blunck-Leisegang multiscatter energy distribution, and sampling of individual electron-electron and Bremsstrahlung interactions. Path-length and lateral displacement correction algorithms and a module for computing collision, radiative, and total restricted stopping powers and ranges of electrons are also included. Comparisons of simulation results with experimental measurements are finally presented. (author)
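
    As a rough illustration of the condensed-history idea (many small-angle collisions grouped into one step with a continuous energy loss and a sampled net deflection), the toy Python sketch below transports electrons through a slab. The stopping power and scattering width are invented placeholders, and a Gaussian deflection stands in for the Moliere and Blunck-Leisegang distributions used by the actual code package.

```python
import numpy as np

def transmitted_fraction(E0, slab_cm, n=10000, step=0.01, seed=1):
    """Toy condensed-history walk through a slab: each step applies a
    continuous energy loss and a Gaussian net deflection. The stopping
    power and scattering width below are invented, not material data."""
    rng = np.random.default_rng(seed)
    stopping = lambda E: 2.0 * E**-0.3        # MeV/cm, placeholder
    theta0 = lambda E: 0.05 / E               # rad per step, placeholder
    transmitted = 0
    for _ in range(n):
        E, x, ang = E0, 0.0, 0.0
        while E > 0.05:                       # follow until a cutoff energy
            x += step * np.cos(ang)           # advance along current direction
            if x >= slab_cm:                  # escaped through the far face
                transmitted += 1
                break
            if x < 0.0:                       # backscattered out the entry face
                break
            E -= stopping(E) * step           # condensed (continuous) energy loss
            ang += rng.normal(0.0, theta0(E)) # accumulated multiscatter deflection
    return transmitted / n

print(transmitted_fraction(E0=1.0, slab_cm=0.3))
```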

  6. Investigations into radiation damages of reactor materials by computer simulation

    International Nuclear Information System (INIS)

    Bronnikov, V.A.

    2004-01-01

    Data are presented on the state of work in European countries in the field of computer simulation of radiation damage in reactor materials, in the context of the international projects ITEM (European Database for Multiscale Modelling) and SIRENA (Simulation of Radiation Effects in Zr-Nb alloys), the latter covering computer simulation of stress corrosion arising from contact of Zr-Nb alloys with iodine. Computer codes for the simulation of radiation effects in reactor materials were developed. The European Database for Multiscale Modelling (EDAM) was organized using the results of the investigations carried out in the ITEM project.

  7. Two-dimensional FSI simulation of closing dynamics of a tilting disc mechanical heart valve.

    Science.gov (United States)

    Govindarajan, V; Udaykumar, H S; Herbertson, L H; Deutsch, S; Manning, K B; Chandran, K B

    2010-03-01

    The fluid dynamics during valve closure, resulting in high-shear flows and large particle residence times, have been implicated in platelet activation and thrombus formation in mechanical heart valves. Our previous studies with bi-leaflet valves have shown that large shear stresses induced in the gap between the leaflet edge and the valve housing result in relatively high platelet activation levels, whereas flow between the leaflets results in shed vortices not conducive to platelet damage. In this study we compare the closing dynamics of a tilting disc valve with those of a bi-leaflet valve. The two-dimensional fluid-structure interaction analysis of tilting disc valve closure mechanics is performed with a fixed-grid Cartesian mesh flow solver with local mesh refinement, together with a Lagrangian particle dynamics analysis to compute the potential for platelet activation. Throughout the simulation the flow remains in the laminar regime, and the flow through the gap is marked by the development of a shear layer that separates from the leaflet downstream of the valve. Zones of recirculation are observed in the gap between the leaflet edge and the valve housing in the major orifice region of the tilting disc valve and are seen to migrate towards the minor orifice region. A jet flow is observed at the minor orifice region, and a vortex is formed that sheds in the direction of fluid motion, as observed in experiments using PIV measurements. The activation parameter computed for the tilting disc valve at the time of closure was found to be 2.7 times greater than that of the bi-leaflet mechanical valve, concentrated in the vicinity of the minor orifice region, mainly due to the migration of vortical structures from the major to the minor orifice region during leaflet rebound in the closing phase.
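
    The activation parameter referred to above is commonly modeled as shear stress accumulated over residence time along Lagrangian particle paths. The Python sketch below computes such a linear stress-time accumulation for populations of toy trajectories; the gamma-distributed stress histories are purely illustrative stand-ins for solver output.

```python
import numpy as np

def activation_parameter(tau_history, dt):
    """Linear stress-accumulation model: integrate shear stress (Pa) over
    residence time along one Lagrangian platelet path sampled every dt (s)."""
    return float(np.sum(np.asarray(tau_history)) * dt)

# Toy trajectory populations for two valve designs (gamma-distributed stresses).
rng = np.random.default_rng(0)
tilting_disc = [activation_parameter(rng.gamma(2.0, 5.0, 200), 1e-4) for _ in range(1000)]
bileaflet = [activation_parameter(rng.gamma(2.0, 2.0, 200), 1e-4) for _ in range(1000)]
print(np.mean(tilting_disc) / np.mean(bileaflet))  # ratio analogous to the reported 2.7x
```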

  8. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
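
    A minimal Python sketch of the ingredients named above, namely Monte Carlo cost samples under a parametric policy, a CVaR estimator, and an optional penalty term for constraints, follows. The toy cost model and all names are hypothetical.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Conditional Value-at-Risk: mean cost in the worst (1 - alpha) tail
    of the simulated execution costs."""
    costs = np.sort(np.asarray(costs))
    return costs[int(np.ceil(alpha * len(costs))):].mean()

def objective(theta, simulate_costs, lam=1.0, penalty=None):
    """Parametric-policy objective: expected cost plus a weighted CVaR term,
    with an optional penalty for constraint violations (names hypothetical)."""
    costs = simulate_costs(theta)          # Monte Carlo cost samples under policy theta
    value = costs.mean() + lam * cvar(costs)
    if penalty is not None:
        value += penalty(theta)            # e.g., quadratic penalty on violated bounds
    return value

# Toy demonstration: a one-parameter policy with noisy execution costs.
rng = np.random.default_rng(0)
toy_costs = lambda theta: theta**2 + rng.normal(1.0, 0.5, size=5000)
print(objective(0.3, toy_costs))
```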

  9. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin Nasaruddin

    2013-09-01

    This paper simulates and analyzes noise in multimedia transmission over a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by optimal design. In developing multimedia transmission for computer networks, a simulation tool is essential for analyzing the effectiveness of the various transmitted services. In this paper, implementation models are proposed to analyze multimedia transmission in representative OCDMA computer networks using MATLAB Simulink tools. Simulation results of the models are discussed, including output spectra of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOCs is practically evaluated. Furthermore, system performance is evaluated considering avalanche photodiode (APD) noise and thermal noise. The results show that system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters in the design and implementation of multimedia transmission in OCDMA computer networks.
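
    The defining property of the strict VW-OOCs mentioned above is that periodic auto- and cross-correlations are bounded by one. The short Python check below verifies this for two hypothetical codewords of length 31 and weights 3 and 4; the mark positions are illustrative, not taken from the paper.

```python
import numpy as np

def max_crosscorrelation(a, b):
    """Maximum periodic cross-correlation over all cyclic shifts of two
    0/1 codewords, the quantity a strict VW-OOC bounds by one."""
    return max(int(np.dot(a, np.roll(b, s))) for s in range(len(b)))

def max_autocorrelation(a):
    """Maximum periodic autocorrelation over all nonzero cyclic shifts."""
    return max(int(np.dot(a, np.roll(a, s))) for s in range(1, len(a)))

n = 31
c1 = np.zeros(n, int); c1[[0, 1, 4]] = 1          # weight-3 codeword (hypothetical)
c2 = np.zeros(n, int); c2[[0, 2, 8, 15]] = 1      # weight-4 codeword (hypothetical)
print(max_autocorrelation(c1), max_autocorrelation(c2), max_crosscorrelation(c1, c2))
# a strict VW-OOC requires all three values to be <= 1
```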

  11. Computer simulation of the self-sputtering of uranium

    International Nuclear Information System (INIS)

    Robinson, M.T.

    1983-01-01

    The sputtering of polycrystalline α-uranium by uranium ions of energies below 10 keV has been studied in the binary collision approximation using the computer simulation program MARLOWE. Satisfactory agreement of the computed sputtering yields with the small amount of available experimental data was achieved using the Moliere interatomic potential, a semilocal inelastic loss function, and a planar surface binding barrier, all with conventional parameters. The model is used to discuss low-energy sputtering processes and the energy and angular distributions of the reflected primaries and the sputtered target particles

  12. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  13. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  14. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by the image quality of the projections. We are developing an analytical simulation platform describing projections to investigate how the quantum-interaction physics in each component of a CT system affects image quality in projections. This simulator will be very useful for economical design or optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. Work remaining before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment
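
    As a toy instance of one stage of such a signal-transfer chain, the Python sketch below forms a noisy projection from line integrals via Beer-Lambert attenuation and Poisson quantum noise. The photon count, gain, and object are assumptions for illustration; the full cascaded model described above would add scatter and pixel-crosstalk kernels.

```python
import numpy as np

def noisy_projection(mu_line_integrals, photons_in=1e5, gain=1.0, seed=0):
    """Toy projection-formation stage: Beer-Lambert attenuation of the
    incident fluence, Poisson quantum noise, then a detector gain stage."""
    rng = np.random.default_rng(seed)
    expected = photons_in * np.exp(-mu_line_integrals)  # Beer-Lambert law
    detected = rng.poisson(expected)                    # quantum (Poisson) noise
    return gain * detected

# Line integrals (dimensionless mu*t) along one detector row through a simple object.
t = np.linspace(-1, 1, 128)
mu_t = np.where(np.abs(t) < 0.5, 2.0, 0.2)
projection = noisy_projection(mu_t)
```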

  15. Modeling and simulation the computer science of illusion

    CERN Document Server

    Raczynski, Stanislaw

    2006-01-01

    Simulation is the art of using tools - physical or conceptual models, or computer hardware and software, to attempt to create the illusion of reality. The discipline has in recent years expanded to include the modelling of systems that rely on human factors and therefore possess a large proportion of uncertainty, such as social, economic or commercial systems. These new applications make the discipline of modelling and simulation a field of dynamic growth and new research. Stanislaw Raczynski outlines the considerable and promising research that is being conducted to counter the problems of

  16. Computational simulation of laser heat processing of materials

    Science.gov (United States)

    Shankar, Vijaya; Gnanamuthu, Daniel

    1987-04-01

    A computational model simulating the laser heat treatment of AISI 4140 steel plates with a CW CO2 laser beam has been developed on the basis of the three-dimensional, time-dependent heat equation, subject to the appropriate boundary conditions. The solution method is based on Newton iteration applied to a triple approximate factorization of the equation. The method is implicit and time-accurate; maintaining time accuracy in the numerical formulation is noted to be critical for the simulation of finite-length workpieces with a finite laser beam dwell time.
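
    As a structural analogue (not the authors' 3D factorized scheme), the 1D backward-Euler sketch below shows why implicit stepping suits this problem: each time step solves a linear system, so the step size can follow the physical dwell time rather than an explicit stability limit. All material parameters are placeholders.

```python
import numpy as np

def backward_euler_heat(u0, alpha, dx, dt, steps):
    """1D backward-Euler heat conduction: solve (I - dt*alpha*L) u^{n+1} = u^n
    each step. Unconditionally stable, so dt can follow the physical dwell
    time. The two end temperatures are held fixed (Dirichlet conditions)."""
    n = len(u0)
    r = alpha * dt / dx**2
    A = (np.diag((1 + 2 * r) * np.ones(n))
         + np.diag(-r * np.ones(n - 1), 1)
         + np.diag(-r * np.ones(n - 1), -1))
    A[0, :] = 0; A[-1, :] = 0
    A[0, 0] = A[-1, -1] = 1          # identity rows pin the boundary values
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, u)    # one implicit (backward-Euler) time step
    return u

x = np.linspace(0.0, 1.0, 101)
u0 = np.where(np.abs(x - 0.5) < 0.05, 1500.0, 300.0)   # localized laser heating, K
u = backward_euler_heat(u0, alpha=1e-5, dx=x[1] - x[0], dt=0.1, steps=50)
```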

  17. Load/resource matching for period-of-record computer simulation

    International Nuclear Information System (INIS)

    Lindsey, E.D. Jr.; Robbins, G.E. III

    1991-01-01

    The Southwestern Power Administration (Southwestern), an agency of the Department of Energy, is responsible for marketing the power and energy produced at Federal hydroelectric power projects developed by the U.S. Army Corps of Engineers in the southwestern United States. In order to maximize benefits from limited resources, to evaluate proposed changes in the operation of existing projects, and to determine the feasibility and marketability of proposed new projects, Southwestern utilizes a period-of-record computer simulation model created in the 1960s. Southwestern is constructing a new computer simulation model to take advantage of changes in computers, policy, and procedures. Within any hydroelectric power reservoir system, the ability of the resources to match the load demand is critical and presents complex problems. Therefore, the method used to compare available energy resources to energy load demands is a very important aspect of the new model. Southwestern has developed an innovative method which compares a resource duration curve with a load duration curve, adjusting the resource duration curve to make the most efficient use of the available resources
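
    The duration-curve comparison can be pictured with the short Python sketch below: both hourly series are sorted into descending duration curves, and the shortfall where the resource curve falls below the load curve is summed. This element-wise comparison is a simplification for illustration (Southwestern's method further adjusts the resource curve), and the hourly data are synthetic.

```python
import numpy as np

def duration_curve(series):
    """Sort hourly values in descending order: the classic duration curve."""
    return np.sort(np.asarray(series))[::-1]

def unserved_energy(load_mw, resource_mw):
    """Compare load and resource duration curves; positive gaps are hours
    where the resource curve falls below the load curve."""
    gap = duration_curve(load_mw) - duration_curve(resource_mw)
    return np.maximum(gap, 0.0).sum()

rng = np.random.default_rng(0)
load = 500 + 150 * rng.random(8760)        # hypothetical hourly load, MW
hydro = 450 + 250 * rng.random(8760)       # hypothetical hourly hydro capability, MW
print(unserved_energy(load, hydro))
```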

  18. Multislice Spiral Computed Tomography of the Heart: Technique, Current Applications, and Perspective

    International Nuclear Information System (INIS)

    Mahnken, Andreas H.; Wildberger, Joachim E.; Koos, Ralf; Guenther, Rolf W.

    2005-01-01

    Multislice spiral computed tomography (MSCT) is a rapidly evolving, noninvasive technique for cardiac imaging. Knowledge of the principles of electrocardiogram-gated MSCT and its limitations in clinical routine is needed to optimize image quality. Therefore, the basic technical principle, including essentials of image postprocessing, is described. Cardiac MSCT imaging was initially focused on coronary calcium scoring, MSCT coronary angiography, and analysis of left ventricular function. Recent studies also evaluated the ability of cardiac MSCT to visualize myocardial infarction and assess valvular morphology. In combination with experimental approaches toward the assessment of aortic valve function and myocardial viability, cardiac MSCT holds the potential for a comprehensive examination of the heart using a single examination technique

  19. COMPARATIVE STUDY OF TERTIARY WASTEWATER TREATMENT BY COMPUTER SIMULATION

    Directory of Open Access Journals (Sweden)

    Stefania Iordache

    2010-01-01

    The aim of this work is to assess conditions for implementation of a Biological Nutrient Removal (BNR) process in the Wastewater Treatment Plant (WWTP) of Moreni city (Romania). In order to meet increasingly strict environmental regulations, the wastewater treatment plant under study must update and modernize its treatment process. A comparative study was undertaken of the quality of effluents that could be obtained by implementation of biological nutrient removal processes such as A2/O (Anaerobic/Anoxic/Oxic) and VIP (Virginia Initiative Plant) as wastewater tertiary treatments. In order to assess the efficiency of the proposed treatment schemes, based on the data monitored at the studied WWTP, computer models of biological nutrient removal configurations based on the A2/O and VIP processes were built. Computer simulation was carried out using a well-known simulator, BioWin by EnviroSim Associates Ltd. The simulation yielded data that can be used in the design of a tertiary treatment stage at the Moreni WWTP, in order to increase operating efficiency.

  20. Computational Simulation on Electrowinning for Used LiCl-KCl salts

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, Sung June; Kim, Pyeong Hwa; Hwang, Il Soon [KAERI, Daejeon (Korea, Republic of); Park, Jae Yeong [Korea Institute of Nuclear Safety, Daejoen (Korea, Republic of)

    2016-05-15

    The purification consists of electrowinning with a liquid metal cathode and selective oxidation under chemical equilibrium, using a metal chloride as the oxidizing agent. Actinides and rare earth elements are deposited into the liquid cathode during electrowinning, and rare earth elements are selectively extracted back into the molten salt; however, co-deposited Li reacts with the oxidizing agent before the rare earth elements that the selective oxidation is intended to remove. Moreover, if the termination point of actinide deposition in electrowinning were clearly known, the amount of reacting rare earth elements as well as Li could be decreased and throughput could be enhanced. For pyroprocessing research, computational simulation is important to conserve limited resources and research environments. This study presents computational modeling of electrowinning with a Bi cathode using the electrochemical simulation code REFIN. The study shows that it is possible to simulate the electrochemical behavior of at least seven elements (excluding electrode and electrolyte materials) in real time. In order to enhance the accuracy of the simulation results, a combination of REFIN with CFD modeling of the two immiscible liquids is suggested, to calculate the diffusion boundary layer thickness as well.

  1. Proceedings of joint meeting of the 6th simulation science symposium and the NIFS collaboration research 'large scale computer simulation'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-03-01

    The Joint Meeting of the 6th Simulation Science Symposium and the NIFS Collaboration Research 'Large Scale Computer Simulation' was held on December 12-13, 2002 at the National Institute for Fusion Science, with the aim of promoting interdisciplinary collaborations in various fields of computer simulation. The meeting, attended by more than 40 people, consisted of 11 invited and 22 contributed papers, whose topics extended not only to fusion science but also to related fields such as astrophysics, earth science, fluid dynamics, molecular dynamics, and computer science. (author)

  2. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, a significantly higher rate than the 73% concordance (19 of 26 patients) obtained by review of 2D images only (p < 0.05). The 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.

  3. Cardiorespiratory endurance evaluation using heart rate analysis during ski simulator exercise and the Harvard step test in elementary school students.

    Science.gov (United States)

    Lee, Hyo Taek; Roh, Hyo Lyun; Kim, Yoon Sang

    2016-01-01

    [Purpose] Efficient management using exercise programs with various benefits should be provided by educational institutions for children in their growth phase. We analyzed the heart rates of children during ski simulator exercise and the Harvard step test to evaluate cardiopulmonary endurance by calculating their post-exercise recovery rate. [Subjects and Methods] The subjects (n = 77) were categorized into a normal-weight group and an overweight/obese group by body mass index. They performed each exercise for 3 minutes. Cardiorespiratory endurance was calculated using the Physical Efficiency Index formula. [Results] The ski simulator exercise and the Harvard step test showed a significant difference in the heart rates of the 2 body-mass-index-based groups at each minute. The normal-weight group and the ski-simulator exercise yielded higher Physical Efficiency Index levels. [Conclusion] This study showed that a simulator exercise can produce a cumulative load even when performed at low intensity, and that it can be effectively utilized as exercise equipment, since it resulted in higher Physical Efficiency Index levels than the Harvard step test. If schools can stimulate students' interest and thereby sustain exercise participation, the ski simulator exercise can be used in programs designed to improve and strengthen students' physical fitness.
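
    For reference, the long-form Harvard step test version of the Physical Efficiency Index is straightforward to compute. The snippet below uses the commonly cited formula (exercise duration in seconds times 100, divided by twice the sum of three 30-second recovery pulse counts); the study's exact variant may differ.

```python
def physical_efficiency_index(duration_s, p1, p2, p3):
    """Long-form Harvard step test PEI: duration (s) x 100 divided by
    2 x the sum of three 30-second recovery pulse counts (taken at
    1-1.5, 2-2.5, and 3-3.5 minutes after exercise)."""
    return duration_s * 100 / (2 * (p1 + p2 + p3))

print(physical_efficiency_index(180, 60, 55, 50))  # ~54.5 for a 3-minute bout
```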

  4. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  5. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms.

    Science.gov (United States)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.

  6. A computer simulation of the argument from disagreement

    NARCIS (Netherlands)

    Gustafsson, J.E.; Peterson, M.B.

    2012-01-01

    In this paper we shed new light on the Argument from Disagreement by putting it to test in a computer simulation. According to this argument widespread and persistent disagreement on ethical issues indicates that our moral opinions are not influenced by any moral facts, either because no such facts

  7. Cone beam tomography of the heart using single-photon emission-computed tomography

    International Nuclear Information System (INIS)

    Gullberg, G.T.; Christian, P.E.; Zeng, G.L.; Datz, F.L.; Morgan, H.T.

    1991-01-01

    The authors evaluated cone beam single-photon emission computed tomography (SPECT) of the heart. A new cone beam reconstruction algorithm was used to reconstruct data collected from short-scan acquisitions (of slightly more than 180 degrees) with the detector anteriorly traversing a noncircular orbit. The less-than-360-degree acquisition was used to minimize the attenuation artifacts that result from reconstructing posterior projections of 201Tl emissions from the heart. The algorithm includes a new method for reconstructing truncated projections of background tissue activity that eliminates reconstruction ring artifacts. Phantom and patient results are presented which compare a high-resolution cone beam collimator (50-cm focal length; 6.0-mm full width at half maximum [FWHM] at 10 cm) to a low-energy general purpose (LEGP) parallel-hole collimator (8.2-mm FWHM at 10 cm) which is 1.33 times more sensitive. The cone beam tomographic results are free of reconstruction artifacts and show improved spatial and contrast resolution over that obtained with the LEGP parallel-hole collimator. The limited angular sampling restrictions and truncation problems associated with cone beam tomography do not prevent obtaining diagnostic information. However, even though these preliminary results are encouraging, a thorough clinical study is still needed to investigate the specificity and sensitivity of cone beam tomography

  8. Scientific and computational challenges of the fusion simulation project (FSP)

    International Nuclear Information System (INIS)

    Tang, W M

    2008-01-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics

  9. MO-G-17A-04: Internal Dosimetric Calculations for Pediatric Nuclear Imaging Applications, Using Monte Carlo Simulations and High-Resolution Pediatric Computational Models

    Energy Technology Data Exchange (ETDEWEB)

    Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)

    2014-06-15

    Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms into GATE simulations. The resolution of the phantoms was set to 2 mm³. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals commonly used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas, and brain are the most critical organs, in which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The S-factors for each target organ are calculated in Gy/(MBq·s), with the highest dose being absorbed in the kidneys and pancreas (9.29×10^10 and 0.15×10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children's computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and
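
    The reported S-factors plug into the standard MIRD relation, absorbed dose equals cumulated activity times S. A hedged Python sketch with purely illustrative numbers:

```python
def absorbed_dose_mGy(administered_MBq, residence_time_s, s_factor_Gy_per_MBq_s):
    """MIRD-style estimate: dose = cumulated activity x S-factor, with the
    cumulated activity approximated as administered activity times an
    effective residence time. All values here are purely illustrative."""
    cumulated_MBq_s = administered_MBq * residence_time_s
    return cumulated_MBq_s * s_factor_Gy_per_MBq_s * 1e3   # Gy -> mGy

print(absorbed_dose_mGy(300.0, 3600.0, 1e-9))  # hypothetical kidney S-factor
```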

  11. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  12. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Phillips, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wampler, Cheryl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meisner, Robert [National Nuclear Security Administration (NNSA), Washington, DC (United States)

    2010-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality, and scientific details); to quantify critical margins and uncertainties; and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  13. The transesophageal echocardiography simulator based on computed tomography images.

    Science.gov (United States)

    Piórkowski, Adam; Kempny, Aleksander

    2013-02-01

    Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate a TEE examination. This research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of CT2TEE, a Web-based TEE simulator, is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. The interaction with the user is also addressed as an important aspect.

  14. Assessment of coronary arteries in infants by 64-detector-row multislice spiral computed tomography

    International Nuclear Information System (INIS)

    Tahara, Masahiro; Waki, Chiaki; Komatsu, Hiroaki; Hayashi, Tomohiro; Sato, Tomoyasu

    2008-01-01

    Heart rate is one of the most important factors for optimal visualization in cardiac CT. We investigated the relation between heart rate and the visibility of the coronary arteries with 64-detector-row multislice spiral computed tomography (MSCT). Three simulated coronary artery stenosis models (3, 4, and 5 mm) were attached to a moving heart phantom and scanned using 64-detector-row MSCT. The heart rate of the phantom was varied between 60 and 150 beats per minute (bpm). The visibility of the simulated coronary arteries was assessed by comparing cardiac half reconstruction (CHR) with multi-sector reconstruction (MSR). Contrast-enhanced 64-detector-row MSCT was then performed in 16 patients under 3 years of age with congenital heart disease and Kawasaki disease, without heart rate control. The visibility of coronary artery segments was graded on a three-point scale. Simulated coronary artery patency was detected in the moving phantom at the maximum heart rate (150 bpm) with MSR. The minimum lumen diameter was 0.75 mm. Electrocardiogram (ECG)-gated cardiac CT was performed in 9 patients, and non-ECG-gated cardiac CT was performed in 7 patients. The origin and proximal course of the coronary arteries were visually evaluated in all 9 patients with ECG-gated acquisition. 64-detector-row MSCT with ECG-gated acquisition is able to visualize the origin and proximal course of the coronary arteries in infants under 3 years of age. (author)

  15. CMR reference values for left ventricular volumes, mass, and ejection fraction using computer-aided analysis : The Framingham Heart Study

    NARCIS (Netherlands)

    Chuang, Michael L.; Gona, Philimon; Hautvast, Gilion L.T.F.; Salton, Carol J.; Breeuwer, Marcel; O'Donnell, Christopher J.; Manning, Warren J.

    Purpose To determine sex-specific reference values for left ventricular (LV) volumes, mass, and ejection fraction (EF) in healthy adults using computer-aided analysis and to examine the effect of age on LV parameters. Materials and Methods We examined data from 1494 members of the Framingham Heart

  16. COMPUTER SIMULATION IN MECHANICS TEACHING AND LEARNING: A CASE STUDY ON STUDENTS’ UNDERSTANDING OF FORCE AND MOTION

    Directory of Open Access Journals (Sweden)

    Dyah Permata Sari

    2015-12-01

    The objective of this research was to develop a force and motion simulation based on the open-source Easy Java Simulation. The process of computer simulation development followed the ADDIE model. Based on the Analysis and Design phases, the Development phase used the open-source Easy Java Simulation (EJS) to develop a computer simulation with physics content relevant to the subtopic. Computing and communication technology continue to make an increasing impact on all aspects of education. EJS is a powerful didactic resource that gives us the ability to focus our students' attention on the principles of physics. Using EJS, a computer simulation was created through which the motion of a particle under the action of a specific force can be studied. In the Implementation phase, the computer simulation was used in the teaching and learning process. To describe the improvements in the students' understanding of the force and motion concepts, we used a t-test to evaluate each of the four phases. These results indicated that the use of the computer simulation could improve students' force and motion conceptual competence regarding Newton's second law of motion.
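
    The physics content of such a simulation reduces to integrating Newton's second law. A minimal Python analogue of the EJS model (semi-implicit Euler integration of a particle under a constant force; all values illustrative):

```python
# Semi-implicit Euler integration of Newton's second law, F = m*a, for a
# particle under a constant force; the kind of model the EJS simulation
# animates for students. All parameter values are illustrative.
m, F, dt = 1.0, 2.0, 0.01          # kg, N, s
x, v = 0.0, 0.0
for step in range(300):            # simulate 3 seconds
    a = F / m                      # Newton's second law
    v += a * dt                    # update velocity first (semi-implicit Euler)
    x += v * dt                    # then position
print(x, v)                        # x is close to F*t**2/(2*m) = 9.0 m; v = 6.0 m/s
```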

  17. Six-degree-of-freedom missile simulation using the ADI AD 100 digital computer and ADSIM simulation language

    Science.gov (United States)

    Zwaanenburg, Koos

    1989-01-01

    The use of an AD 100 computer and the ADSIM language in the six-degree-of-freedom digital simulation of an air-to-ground missile is illustrated. The missile is launched from a moving platform, typically a helicopter, and is capable of striking a mobile target up to 10 kilometers away. The missile could be any tactical missile. The performance numbers of the AD 100 show that it is possible to implement a high performance missile model in a real-time simulation without the problems associated with an implementation on a general purpose computer using FORTRAN.

  18. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  20. Computer simulation of strain-induced ordering in interstitial solutions based on the b.c.c. Ta lattice

    International Nuclear Information System (INIS)

    Blanter, M.S.; Khachaturyan, A.G.

    1980-01-01

    A computer simulation is made of strain-induced ordering of interstitial atoms within octahedral interstices in the Ta host lattice. The calculation technique makes it possible to take into account the infinite-range strain-induced interaction. Computer simulation of the ordering process makes it possible to model the sequence of structural changes that occur during ordering and to find the equilibrium structures of the stable interstitial superstructures. The structures of the high-temperature ordered phases obtained by the method of static concentration waves coincide with those obtained by computer simulation. However, computer simulation can also predict the structures of low-temperature ordered phases, which cannot be obtained by the method of concentration waves. Comparison of the computer simulation results with the structures of observed ordered phases demonstrates good agreement. (author)
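
    To make the idea of simulated ordering concrete, the Python sketch below runs a Metropolis hopping simulation of interstitials on a 2D lattice. A short-range nearest-neighbor repulsion stands in for the infinite-range strain-induced interaction treated in the paper, so the sketch only illustrates the loop structure of such a simulation.

```python
import numpy as np

def order_interstitials(n=32, coverage=0.25, V=1.0, kT=0.2, sweeps=200, seed=0):
    """Metropolis hopping of interstitials on an n x n lattice with a
    nearest-neighbor repulsion V (a stand-in for the strain-induced
    interaction). Returns the occupation array after `sweeps` sweeps."""
    rng = np.random.default_rng(seed)
    occ = (rng.random((n, n)) < coverage).astype(int)

    def site_energy(i, j):
        return V * (occ[(i + 1) % n, j] + occ[(i - 1) % n, j]
                    + occ[i, (j + 1) % n] + occ[i, (j - 1) % n])

    for _ in range(sweeps * n * n):
        i1, j1, i2, j2 = rng.integers(0, n, 4)     # random source and target sites
        if occ[i1, j1] == 1 and occ[i2, j2] == 0:
            occ[i1, j1] = 0                        # remove before measuring energies
            dE = site_energy(i2, j2) - site_energy(i1, j1)
            if dE <= 0 or rng.random() < np.exp(-dE / kT):
                occ[i2, j2] = 1                    # accept the hop
            else:
                occ[i1, j1] = 1                    # reject: restore the atom
    return occ

lattice = order_interstitials()
```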

  1. The acoustical history of Hagia Sophia revived through computer simulations

    DEFF Research Database (Denmark)

    Rindel, Jens Holger; Weitze, C.A.; Christensen, Claus Lynge

    2002-01-01

    The present paper deals with acoustic computer simulations of Hagia Sophia, which is characterized not only by being one of the largest worship buildings in the world, but also by having served, in its 1500-year history, three purposes: as a church, as a mosque, and today as a museum. The investigation is done as part of the EU project CAHRISMA.

  2. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  3. Computational Enhancements for Direct Numerical Simulations of Statistically Stationary Turbulent Premixed Flames

    KAUST Repository

    Mukhadiyev, Nurzhan

    2017-05-01

    Combustion at extreme conditions, such as a turbulent flame at high Karlovitz and Reynolds numbers, is still a vast and uncertain field for researchers. Direct numerical simulation of a turbulent flame is a superior tool to unravel detailed information that is not accessible to the most sophisticated state-of-the-art experiments. However, the computational cost of such simulations remains a challenge even for modern supercomputers, as the physical size, the level of turbulence intensity, and the chemical complexity of the problems continue to increase. As a result, there is a strong demand for computational cost reduction methods as well as for acceleration of existing methods. The main scope of this work was the development of computational and numerical tools for high-fidelity direct numerical simulations of premixed planar flames interacting with turbulence. The first part of this work was the development of the KAUST Adaptive Reacting Flow Solver (KARFS). KARFS is a high-order compressible reacting flow solver using detailed chemical kinetic mechanisms; it is capable of running on various types of heterogeneous computational architectures. In this work, it was shown that KARFS runs efficiently on both CPUs and GPUs. The second part of this work concerned numerical tools for direct numerical simulations of planar premixed flames, such as linear turbulence forcing and dynamic inlet control. Previous DNS of premixed turbulent flames injected velocity fluctuations at an inlet; the injected turbulence decayed significantly before reaching the flame, which made it necessary to inject stronger fluctuations than needed. A solution to this issue is to maintain the turbulence strength on the way to the flame using turbulence forcing. Therefore, linear turbulence forcing was implemented in KARFS to enhance turbulence intensity. Linear turbulence forcing developed previously by other groups was corrected with a net-added-momentum removal mechanism to prevent mean
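
    A minimal sketch of linear turbulence forcing with the mean-momentum correction mentioned above: each velocity component is amplified in proportion to its fluctuation about the volume mean, and subtracting the mean first guarantees zero net momentum input. The coefficient, grid, and initial field are illustrative.

```python
import numpy as np

def linear_forcing(u, v, w, A, dt):
    """Linear turbulence forcing: add A * u' * dt to each velocity component,
    where u' is the fluctuation about the volume mean. Subtracting the mean
    before forcing adds zero net momentum, the correction noted above."""
    for comp in (u, v, w):
        fluctuation = comp - comp.mean()   # zero-mean part of the field
        comp += A * fluctuation * dt       # amplify fluctuations in place
    return u, v, w

rng = np.random.default_rng(0)
u, v, w = (rng.standard_normal((32, 32, 32)) for _ in range(3))
u, v, w = linear_forcing(u, v, w, A=0.5, dt=1e-3)
print(abs(u.mean()))                       # mean momentum essentially unchanged
```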

  4. Computer simulation of fatigue under diametrical compression

    International Nuclear Information System (INIS)

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-01-01

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametrical compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macro and micro levels, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges, below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings
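
    The interplay of damage accumulation and healing described above can be caricatured in a few lines of Python: damage grows with a power of the load each cycle and relaxes at a healing rate, yielding finite lifetimes at intermediate loads and an emergent fatigue limit below which the specimen survives indefinitely. All rates and exponents are invented for illustration.

```python
def lifetime(load, damage_rate=1e-3, healing_rate=5e-5, max_cycles=10**6):
    """Toy damage-accumulation model with healing: damage grows with the
    load each cycle and relaxes slowly; failure occurs when damage reaches 1.
    Below a threshold load, healing balances accumulation (a fatigue limit)."""
    d = 0.0
    for n in range(1, max_cycles):
        d += damage_rate * load**4     # accumulation over the load history
        d -= healing_rate * d          # healing of microcracks
        if d >= 1.0:
            return n                   # macroscopic failure at cycle n
    return float("inf")                # survived: load is below the fatigue limit

for load in (0.9, 0.7, 0.5, 0.3):      # load as a fraction of tensile strength
    print(load, lifetime(load))
```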

  5. Computer Simulation of the UMER Gridded Gun

    CERN Document Server

    Haber, Irving; Friedman, Alex; Grote, D P; Kishek, Rami A; Reiser, Martin; Vay, Jean-Luc; Zou, Yun

    2005-01-01

    The electron source in the University of Maryland Electron Ring (UMER) injector employs a grid 0.15 mm from the cathode to control the current waveform. Under nominal operating conditions, the grid voltage during the current pulse is sufficiently positive relative to the cathode potential to form a virtual cathode downstream of the grid. Three-dimensional computer simulations have been performed that use the mesh refinement capability of the WARP particle-in-cell code to examine a small region near the beam center in order to illustrate some of the complexity that can result from such a gridded structure. These simulations have been found to reproduce the hollowed velocity space that is observed experimentally. The simulations also predict a complicated time-dependent response to the waveform applied to the grid during the current turn-on. This complex temporal behavior appears to result directly from the dynamics of the virtual cathode formation and may therefore be representative of the expected behavior in...
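
    For a feel of the space-charge physics behind the virtual cathode, the one-dimensional Child-Langmuir law gives the space-charge-limited current density across the 0.15 mm cathode-grid gap. The grid voltage below is an assumed, illustrative value; the actual UMER waveform and the three-dimensional WARP results are far richer than this estimate.

        from math import sqrt

        EPS0 = 8.854e-12                     # vacuum permittivity [F/m]
        E_CHARGE, M_E = 1.602e-19, 9.109e-31  # electron charge [C], mass [kg]

        def child_langmuir_j(v_gap, d):
            """1D space-charge-limited current density [A/m^2] for a planar
            gap of width d [m] at potential difference v_gap [V]."""
            return (4 * EPS0 / 9) * sqrt(2 * E_CHARGE / M_E) * v_gap**1.5 / d**2

        print(child_langmuir_j(v_gap=30.0, d=0.15e-3))   # ~1.7e4 A/m^2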

  6. In-cylinder diesel spray combustion simulations using parallel computation: A performance benchmarking study

    International Nuclear Information System (INIS)

    Pang, Kar Mun; Ng, Hoon Kiat; Gan, Suyin

    2012-01-01

    Highlights: ► A performance benchmarking exercise is conducted for diesel combustion simulations. ► The reduced chemical mechanism shows its advantages over the base and skeletal models. ► High efficiency and a great reduction of CPU runtime are achieved with the 4-node solver. ► Increasing the ISAT memory from 0.1 to 2 GB reduces the CPU runtime by almost 35%. ► Combustion and soot processes are predicted well with minimal computational cost. - Abstract: In the present study, in-cylinder diesel combustion simulation was performed with parallel processing on an Intel Xeon Quad-Core platform to allow both the fluid dynamics and the chemical kinetics of the surrogate diesel fuel model to be solved simultaneously on multiple processors. Here, the Cartesian Z-coordinate was selected as the most appropriate partitioning algorithm, since it computationally bisects the domain such that the dynamic load associated with fuel particle tracking is evenly distributed during parallel computations. Other variables examined included the number of compute nodes, chemistry sizes, and in situ adaptive tabulation (ISAT) parameters. Based on the performance benchmarking test conducted, a parallel configuration of 4 compute nodes was found to reduce the computational runtime most efficiently, with a parallel efficiency of up to 75.4% achieved. The simulation results also indicated that the accuracy level was insensitive to the number of partitions or the partitioning algorithm. The effect of reducing the number of species on computational runtime was observed to be more significant than reducing the number of reactions. In addition, the study showed that an increase in the ISAT maximum storage to 2 GB reduced the computational runtime by 50%. Also, an ISAT error tolerance of 10^-3 was chosen to strike a balance between results accuracy and computational runtime. The optimised parameters in parallel processing and ISAT, as well as the use of the in-house reduced chemistry model, allowed accurate
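
    The parallel-efficiency figure quoted above follows from the usual strong-scaling definition: efficiency = speedup / node count, with speedup = T_serial / T_parallel. The timings in the snippet are invented placeholders chosen only to reproduce the reported 75.4% and show the arithmetic.

        def parallel_efficiency(t_serial, t_parallel, n_nodes):
            """Classic strong-scaling efficiency: speedup per node."""
            speedup = t_serial / t_parallel
            return speedup / n_nodes

        # e.g. a case needing 100 h on 1 node and 33.16 h on 4 nodes:
        print(parallel_efficiency(100.0, 33.16, 4))   # ~0.754, i.e. 75.4%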

  7. The Simulation of an Oxidation-Reduction Titration Curve with Computer Algebra

    Science.gov (United States)

    Whiteley, Richard V., Jr.

    2015-01-01

    Although the simulation of an oxidation/reduction titration curve is an important exercise in an undergraduate course in quantitative analysis, that exercise is frequently simplified to accommodate computational limitations. With the use of readily available computer algebra systems, however, such curves for complicated systems can be generated…
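
    The same kind of curve can also be generated with ordinary numerical tools. The sketch below uses the standard "inverse master equation" trick for a hypothetical Fe2+/Ce4+ titration: sweep the electrode potential E and solve for the titrant volume, which requires no iteration. The formal potentials and concentrations are illustrative assumptions, not values from the article.

        import numpy as np

        E1, E2 = 0.767, 1.70        # formal potentials: Fe3+/Fe2+, Ce4+/Ce3+ [V]
        C_FE, V0, C_CE = 0.05, 50.0, 0.10   # analyte mol/L and mL; titrant mol/L
        NERNST = 0.05916            # V per decade at 25 C, one-electron couples

        E = np.linspace(0.55, 1.85, 500)
        f_fe3 = 1 / (1 + 10 ** ((E1 - E) / NERNST))   # fraction of Fe as Fe3+
        f_ce3 = 1 / (1 + 10 ** ((E - E2) / NERNST))   # fraction of Ce as Ce3+
        # Electron balance: moles of Fe3+ formed = moles of Ce3+ formed,
        # so the titrant volume corresponding to each potential E is:
        v_titrant = C_FE * V0 * f_fe3 / (C_CE * f_ce3)          # [mL]
        print(v_titrant[np.argmin(np.abs(E - (E1 + E2) / 2))])  # ~25 mL at equivalence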

  8. Computer simulation of laboratory leaching and washing of tank waste sludges

    International Nuclear Information System (INIS)

    Meng, C.D.; MacLean, G.T.; Landeene, B.C.

    1994-01-01

    The process simulator ESP (Environmental Simulation Program) was used to simulate laboratory caustic leaching and washing of core samples from Tanks B-110, C-109, and C-112. The results of the laboratory tests and the computer simulations are compared. The results from both agreed reasonably well for elements contained in solid phases included in the ESP Public data bank. The use of the GEOCHEM data bank and/or a custom Hanford data bank should improve the agreement, making ESP a useful process simulator for aqueous-based processing.
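
    Behind such simulations sits a simple stage-wise mass balance for fully soluble species: each ideal wash retains only the fraction of dissolved material held in the interstitial liquid left with the solids. A minimal sketch, with invented volumes, assuming a well-mixed, fully soluble species (the thermodynamic data banks matter precisely where this ideal picture breaks down):

        def mass_after_washing(m0, heel_volume, wash_volume, n_stages):
            """Soluble mass remaining after n ideal wash/decant stages,
            each leaving heel_volume of liquid with the solids."""
            retained = heel_volume / (heel_volume + wash_volume)
            return m0 * retained ** n_stages

        # 100 g of dissolved sodium salts, 1 L interstitial heel, 5 L washes:
        print(mass_after_washing(100.0, 1.0, 5.0, n_stages=3))   # ~0.46 g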

  9. Solving wood chip transport problems with computer simulation.

    Science.gov (United States)

    Dennis P. Bradley; Sharon A. Winsauer

    1976-01-01

    Efficient chip transport operations are difficult to achieve due to frequent and often unpredictable changes in distance to market, chipping rate, time spent at the mill, and equipment costs. This paper describes a computer simulation model that allows a logger to design an efficient transport system in response to these changing factors.
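
    The core of such a transport model is the truck cycle time. A deliberately simplified, deterministic version is sketched below with invented parameter values; the actual model in the paper also handles the unpredictable variation the abstract mentions.

        import math

        def trucks_to_keep_chipper_busy(chip_rate_tph, truck_capacity_t,
                                        speed_mph, distance_mi, mill_time_h):
            """Smallest fleet for which a truck is always at the chipper."""
            load_time = truck_capacity_t / chip_rate_tph            # h per load
            cycle = load_time + 2 * distance_mi / speed_mph + mill_time_h
            return math.ceil(cycle / load_time)

        print(trucks_to_keep_chipper_busy(20, 25, 45, 30, 0.5))     # -> 3 trucks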

  10. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
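
    The logic of such a simulation is straightforward to reproduce. The sketch below scores one randomly generated session with the three methods named above and compares them against the true cumulative duration; the session length, event count, and interval size are illustrative choices, not the parameter grid of the study.

        import numpy as np

        rng = np.random.default_rng(1)
        T, dt, interval = 600.0, 0.1, 10.0    # session [s], resolution, interval
        t = np.arange(0.0, T, dt)
        event = np.zeros(t.size, dtype=bool)
        for start in rng.uniform(0, T - 5, size=20):   # twenty 5-s events
            event[(t >= start) & (t < start + 5)] = True

        idx = (t // interval).astype(int)              # interval index per sample
        n = idx.max() + 1
        step = round(interval / dt)
        partial = np.array([event[idx == i].any() for i in range(n)])   # partial-interval
        whole = np.array([event[idx == i].all() for i in range(n)])     # whole-interval
        momentary = event[np.arange(1, n + 1) * step - 1]               # end-of-interval sample

        print(f"true {event.mean():.3f}  partial {partial.mean():.3f}  "
              f"whole {whole.mean():.3f}  momentary {momentary.mean():.3f}")
        # Partial-interval recording overestimates, whole-interval recording
        # underestimates, and momentary time sampling is roughly unbiased.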

  11. Computational simulation of the creep-rupture process in filamentary composite materials

    Science.gov (United States)

    Slattery, Kerry T.; Hackett, Robert M.

    1991-01-01

    A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
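
    A stripped-down statistical analogue of this kind of simulation is the equal-load-sharing fiber bundle with stress-dependent breakdown times, sketched below with illustrative parameters. It omits the finite element stress solution but preserves the Monte Carlo structure: random failure times, load redistribution among survivors, and a distribution of times-to-failure across repeated runs.

        import numpy as np

        def bundle_lifetime(n_fibers, load_per_fiber, rho, rng):
            """Equal-load-sharing bundle: each fiber fails after an
            exponential waiting time whose rate grows as stress**rho."""
            intact, t = n_fibers, 0.0
            while intact > 0:
                stress = load_per_fiber * n_fibers / intact   # survivors share load
                total_rate = intact * stress**rho
                t += rng.exponential(1.0 / total_rate)        # next fiber break
                intact -= 1
            return t

        rng = np.random.default_rng(0)
        times = [bundle_lifetime(200, 0.3, 4.0, rng) for _ in range(100)]
        print(np.mean(times), np.std(times))   # time-to-failure statistics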

  12. Post-processing computational fluid dynamic simulations of gas turbine combustor

    International Nuclear Information System (INIS)

    Sturgess, G.J.; Inko-Tariah, W.P.C.; James, R.H.

    1986-01-01

    The flowfield in combustors for gas turbine engines is extremely complex. Numerical simulation of such flowfields using computational fluid dynamics techniques has much to offer the design and development engineer. It is a difficult task, but one which is now being attempted routinely in the industry. The results of such simulations yield enormous amounts of information, from which the responsible engineer has to synthesize a comprehensive understanding of the complete flowfield and the processes contained therein. The complex picture so constructed must be distilled down to the essential information upon which rational development decisions can be made. The only way this can be accomplished successfully is by extensive post-processing of the calculation. Post-processing of a simulation relies heavily on computer graphics and requires the enhancement provided by color. The application of one such post-processor is presented, and the strengths and weaknesses of various display techniques are illustrated.

  13. Quantum simulation of superconductors on quantum computers. Toward the first applications of quantum processors

    Energy Technology Data Exchange (ETDEWEB)

    Dallaire-Demers, Pierre-Luc

    2016-10-07

    Quantum computers are the ideal platform for quantum simulations. Given enough coherent operations and qubits, such machines can be leveraged to simulate strongly correlated materials, where intricate quantum effects give rise to counter-intuitive macroscopic phenomena such as high-temperature superconductivity. Many phenomena of strongly correlated materials are encapsulated in the Fermi-Hubbard model. In general, no closed-form solution is known for lattices of more than one spatial dimension, but they can be numerically approximated using cluster methods. To model long-range effects such as order parameters, a powerful method to compute the cluster's Green's function consists in finding its self-energy through a variational principle. As shown in this thesis, this makes it possible to study various phase transitions at finite temperature in the Fermi-Hubbard model. However, a classical cluster solver quickly hits an exponential wall in the memory (or computation time) required to store the computation variables. We show theoretically that the cluster solver can be mapped to a subroutine on a quantum computer whose quantum memory usage scales linearly with the number of orbitals in the simulated cluster, while the number of measurements scales quadratically. We also provide a gate decomposition of the cluster Hamiltonian and a simple planar architecture for a quantum simulator that can also be used to simulate more general fermionic systems. We briefly analyze the Trotter-Suzuki errors and estimate the scaling properties of the algorithm for more complex applications. A quantum computer with a few tens of qubits could therefore simulate the thermodynamic properties of complex fermionic lattices inaccessible to classical supercomputers.
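
    The exponential-wall-versus-linear-scaling contrast in the abstract can be made concrete with a two-line estimate. The constant factors below (16 bytes per complex amplitude, a small assumed ancilla overhead) are illustrative assumptions, not figures from the thesis.

        def classical_state_bytes(n_orbitals, bytes_per_amplitude=16):
            """Dense state vector over n fermionic modes: 2**n amplitudes."""
            return bytes_per_amplitude * 2 ** n_orbitals

        def qubits_needed(n_orbitals, ancilla_overhead=2):
            """Linear scaling claimed for the quantum cluster solver."""
            return ancilla_overhead * n_orbitals

        for n in (16, 32, 48):
            print(n, classical_state_bytes(n) / 1e9, "GB classically vs",
                  qubits_needed(n), "qubits")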

  15. Eighteenth Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVIII

    2006-01-01

    This volume represents a "status report" emanating from presentations made during the 18th Annual Workshop on Computer Simulation Studies in Condensed Matter Physics at the Center for Simulational Physics at the University of Georgia in March 2005. It provides a broad overview of the most recent advances in the field, spanning the range from statistical physics to soft condensed matter and biological systems. Results on nanostructures and materials are included, as are several descriptions of advances in quantum simulations and quantum computing, as well as methodological advances.

  16. Study on motion artifacts in coronary arteries with an anthropomorphic moving heart phantom on an ECG-gated multidetector computed tomography unit

    International Nuclear Information System (INIS)

    Greuter, Marcel J.W.; Dorgelo, Joost; Tukker, Wim G.J.; Oudkerk, Matthijs

    2005-01-01

    Acquisition time plays a key role in the quality of cardiac multidetector computed tomography (MDCT) and is directly related to the rotation time of the scanner. The purpose of this study is to examine the influence of heart rate and a multisector reconstruction algorithm on the image quality of coronary arteries of an anthropomorphic adjustable moving heart phantom on an ECG-gated MDCT unit. The heart phantom and a coronary artery phantom were used on an MDCT unit with a rotation time of 500 ms. The movement of the heart was determined by analysis of the images taken at different phases. The results indicate that the movement of the coronary arteries on the heart phantom is comparable to that in a clinical setting. The influence of the heart rate on image quality and artifacts was determined by analysis of several heart rates between 40 and 80 bpm, where the movement of the heart was synchronized using a retrospective ECG-gated acquisition protocol. The resulting reformatted volume-rendering images of the moving heart and the coronary arteries were qualitatively compared as a function of the heart rate. The evaluation was performed on three independent series by two independent radiologists for the image quality of the coronary arteries and the presence of artifacts. The evaluation shows that at heart rates above 50 bpm the influence of motion artifacts in the coronary arteries becomes apparent. In addition, the influence of a dedicated multisector reconstruction technique on image quality was determined. The results show that the image quality of the coronary arteries is not related to the heart rate alone, and that the influence of the multisector reconstruction technique becomes significant above 70 bpm. Therefore, this study shows that one cannot state an actual acquisition time per heart cycle, but only a mathematical acquisition time. (orig.)
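
    The distinction between a mathematical and an actual acquisition time can be illustrated with the nominal half-scan arithmetic for the 500 ms rotation used here. The sketch below gives only the idealized figure; the effective value depends on the synchrony between heart rate and gantry rotation, which is precisely the point the study makes.

        def mathematical_acquisition_time_ms(rotation_ms=500.0, n_sectors=1):
            """Nominal temporal window: half a rotation for single-sector
            half-scan reconstruction, divided among N heart cycles for an
            N-sector algorithm (idealized; ignores rate/rotation synchrony)."""
            return (rotation_ms / 2.0) / n_sectors

        for n in (1, 2, 3, 4):
            print(n, "sectors:", mathematical_acquisition_time_ms(n_sectors=n), "ms")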

  17. Computational physics simulation of classical and quantum systems

    CERN Document Server

    Scherer, Philipp O J

    2013-01-01

    This textbook presents basic and advanced computational physics in a very didactic style, with clear and simple mathematical descriptions of many of the most important algorithms and techniques in the field. The first part of the book discusses the basic numerical methods. A large number of exercises and computer experiments allows the reader to study the properties of these methods. The second part concentrates on simulation of classical and quantum systems. It uses a rather general concept for the equation of motion, which can be applied to ordinary and partial differential equations. Several classes of integration methods are discussed, including not only the standard Euler and Runge-Kutta methods but also multistep methods and the class of Verlet methods, which is introduced by studying the motion in Liouville space. Besides the classical methods, inverse interpolation is discussed, together with the p...

  18. Computer simulations of disordering kinetics in irradiated intermetallic compounds

    International Nuclear Information System (INIS)

    Spaczer, M.; Caro, A.; Victoria, M.; Diaz de la Rubia, T.

    1994-01-01

    Molecular-dynamics computer simulations of collision cascades in intermetallic Cu₃Au, Ni₃Al, and NiAl have been performed to study the nature of the disordering processes in the collision cascade. The choice of these systems was suggested by the quite accurate description of their thermodynamic properties obtained using embedded-atom-type potentials. Since melting occurs in the core of the cascades, interesting effects appear as a result of the superposition of the loss (and subsequent recovery) of the crystalline order and the evolution of the chemical order, the two processes developing on different time scales. In our previous simulations of Ni₃Al and Cu₃Au [T. Diaz de la Rubia, A. Caro, and M. Spaczer, Phys. Rev. B 47, 11483 (1993)] we found a significant difference between the time evolution of the chemical short-range order (SRO) and the crystalline order in the cascade core for both alloys, namely the complete loss of the crystalline structure but only partial chemical disordering. Recent computer simulations of NiAl show the same phenomena. To understand these features we study the liquid phase of these three alloys and present simulation results concerning the dynamical melting of small samples, examining the atomic mobility, the relaxation time, and the saturation value of the chemical short-range order. An analytic model for the time evolution of the SRO is given.

  19. COMPUTER SIMULATION THE MECHANICAL MOVEMENT BODY BY MEANS OF MATHCAD

    Directory of Open Access Journals (Sweden)

    Leonid Flehantov

    2017-03-01

    Full Text Available This paper considers a technique for using the computer mathematics system MathCAD to implement a mathematical model of the mechanical motion of a body thrown at an angle to the horizon, and its use in educational computer simulation experiments when teaching the fundamentals of mathematical modeling. The advantages of MathCAD as an environment for implementing mathematical models at the second stage of higher education are noted. The paper describes the creation of a computer simulation model that allows a comprehensive analysis of the body's mechanical motion as the input parameters of the model are varied: the acceleration of gravity, the initial and final positions of the body, the initial velocity and launch angle, and the geometric dimensions of the body and the target. The technique is aimed at students' effective assimilation of basic knowledge and skills in mathematical modeling: it helps them master the theoretical principles of mathematical modeling and related disciplines, develops logical thinking, strengthens motivation and cognitive interest, and builds research skills, thereby creating conditions for the effective formation of the professional competence of future specialists.
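
    The underlying model is the classical projectile problem, so it is easy to reproduce outside MathCAD. A minimal sketch in Python, with illustrative launch parameters and no air resistance, matching the model as described:

        import numpy as np

        G = 9.81                                  # acceleration of gravity [m/s^2]

        def trajectory(v0, alpha_deg, y0=0.0, n=200):
            """x(t), y(t) of a body thrown at angle alpha with speed v0."""
            alpha = np.radians(alpha_deg)
            vy0 = v0 * np.sin(alpha)
            t_flight = (vy0 + np.sqrt(vy0**2 + 2 * G * y0)) / G
            t = np.linspace(0.0, t_flight, n)
            return v0 * np.cos(alpha) * t, y0 + vy0 * t - 0.5 * G * t**2

        x, y = trajectory(v0=20.0, alpha_deg=45.0)
        print(x[-1])   # range ~ v0**2 * sin(2*alpha) / G = 40.8 m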

  20. Development of a Computer Simulation Game Using a Reverse Engineering Approach

    Science.gov (United States)

    Ozkul, Ahmet

    2012-01-01

    Business simulation games are widely used in the classroom to provide students with experiential learning opportunities on business situations in a dynamic fashion. When properly designed and implemented, the computer simulation game can be a useful educational tool by integrating separate theoretical concepts and demonstrating the nature of…