WorldWideScience

Sample records for fluka based simulation

  1. Hadron production simulation by FLUKA

    CERN Document Server

    Battistoni, G; Ferrari, A; Ranft, J; Roesler, S; Sala, P R

    2013-01-01

    For the purposes of accelerator based neutrino experiments, the simulation of parent hadron production plays a key role. In this paper a quick overview of the main ingredients of the PEANUT event generator implemented in the FLUKA Monte Carlo code is given, together with some benchmarking examples.

  2. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA funded project is proceeding to develop a Monte Carlo-based computer simulation of the radiation environment in space. With actual funding only initially in place at the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages. The radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end-product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will possess the ability to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  3. An investigation of the neutron flux in bone-fluorine phantoms comparing accelerator based in vivo neutron activation analysis and FLUKA simulation data

    Energy Technology Data Exchange (ETDEWEB)

    Mostafaei, F.; McNeill, F.E.; Chettle, D.R.; Matysiak, W.; Bhatia, C.; Prestwich, W.V.

    2015-01-01

    We have tested the Monte Carlo code FLUKA for its ability to assist in the development of a better system for the in vivo measurement of fluorine. We used it to create a neutron flux map of the inside of the in vivo neutron activation analysis irradiation cavity at the McMaster Accelerator Laboratory. The cavity is used in a system that has been developed for assessment of fluorine levels in the human hand. This study was undertaken to (i) assess the FLUKA code, (ii) find the optimal hand position inside the cavity and assess the effects on precision of a hand being in a non-optimal position and (iii) to determine the best location for our γ-ray detection system within the accelerator beam hall. Simulation estimates were performed using FLUKA. Experimental measurements of the neutron flux were performed using Mn wires. The activation of the wires was measured inside (1) an empty bottle, (2) a bottle containing water, (3) a bottle covered with cadmium and (4) a dry powder-based fluorine phantom. FLUKA was used to simulate the irradiation cavity, and used to estimate the neutron flux in different positions both inside, and external to, the cavity. The experimental results were found to be consistent with the Monte Carlo simulated neutron flux. Both experiment and simulation showed that there is an optimal position in the cavity, but that the effect on the thermal flux of a hand being in a non-optimal position is less than 20%, which will result in a less than 10% effect on the measurement precision. FLUKA appears to be a code that can be useful for modeling of this type of experimental system.
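
    The Mn-wire measurements above rely on the standard activation relation A = NσΦ(1 − e^(−λt)). As a minimal sketch of the inversion (wire mass, activity and irradiation time below are purely illustrative, not values from the paper):

```python
import math

def flux_from_wire_activity(activity_bq, n_atoms, sigma_cm2, half_life_s, t_irr_s):
    """Infer a thermal neutron flux (n/cm^2/s) from the end-of-irradiation
    activity of an activation wire: A = N * sigma * phi * (1 - exp(-lambda*t))."""
    lam = math.log(2.0) / half_life_s
    return activity_bq / (n_atoms * sigma_cm2 * (1.0 - math.exp(-lam * t_irr_s)))

# Illustrative numbers only: a 10 mg Mn wire, 13.3 b Mn-55 thermal capture
# cross section, Mn-56 half-life of 2.58 h, 1 h irradiation.
N_A = 6.022e23
n_mn = 0.010 / 54.94 * N_A            # atoms of Mn-55 in 10 mg of manganese
phi = flux_from_wire_activity(
    activity_bq=5.0e3, n_atoms=n_mn,
    sigma_cm2=13.3e-24, half_life_s=2.58 * 3600, t_irr_s=3600)
print(f"inferred flux: {phi:.3e} n/cm2/s")
```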

  4. Simulation of the response functions of an extended range neutron multisphere spectrometer using FLUKA

    Science.gov (United States)

    Wang, Pan-Feng; Ding, Ya-Dong; Wang, Qing-Bin; Ma, Zhong-Jian; Guo, Si-Ming; Li, Guan-Jia

    2015-07-01

    In this paper, the distribution of radiation field in the CSNS spectrometer hall at Dongguan, China, was simulated by the FLUKA program. The results show that the radiation field of the high energy proton accelerator is dominated by neutron radiation, with a broad range of neutron energies, spanning about eleven orders of magnitude. Simulation and calculation of the response functions of four Bonner spheres with a simplified model is done with FLUKA and MCNPX codes respectively, proving the feasibility of the FLUKA program for this application and the correctness of the calculation method. Using the actual model, we simulate and calculate the energy response functions of Bonner sphere detectors with polyethylene layers of different diameters, including detectors with lead layers, using the FLUKA code. Based on the simulation results, we select eleven detectors as the basic structure for an Extended Range Neutron Multisphere Spectrometer (ERNMS).
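
    The simulated response functions enter the measurement as a folding of each sphere's response with the neutron spectrum: the reading of sphere i is M_i = Σ_j R_ij φ_j. A toy sketch with a hypothetical 3×3 response matrix (real matrices come from the FLUKA/MCNPX calculations):

```python
import numpy as np

# Hypothetical response matrix: counts per unit fluence for each sphere
# (rows) in each energy bin (columns); real values come from the simulations.
R = np.array([
    [0.9, 0.4, 0.1],   # bare counter: thermal-dominated response
    [0.5, 0.8, 0.3],   # small polyethylene sphere
    [0.1, 0.5, 0.9],   # large sphere with lead layer: fast/high-energy
])
phi = np.array([2.0, 1.0, 0.5])  # assumed fluence in each energy bin

# Each sphere's count rate folds the spectrum with its response function.
readings = R @ phi
print(readings)   # one reading per sphere
```

Spectrum unfolding is the inverse problem: recovering φ from the readings, which is why a well-conditioned set of response functions (here, eleven detectors) matters.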

  5. Simulation of the response functions of an extended range neutron multisphere spectrometer using FLUKA

    Institute of Scientific and Technical Information of China (English)

    WANG Pan-Feng; DING Ya-Dong; WANG Qing-Bin; MA Zhong-Jian; GUO Si-Ming; LI Guan-Jia

    2015-01-01

    In this paper, the distribution of radiation field in the CSNS spectrometer hall at Dongguan, China, was simulated by the FLUKA program. The results show that the radiation field of the high energy proton accelerator is dominated by neutron radiation, with a broad range of neutron energies, spanning about eleven orders of magnitude. Simulation and calculation of the response functions of four Bonner spheres with a simplified model is done with FLUKA and MCNPX codes respectively, proving the feasibility of the FLUKA program for this application and the correctness of the calculation method. Using the actual model, we simulate and calculate the energy response functions of Bonner sphere detectors with polyethylene layers of different diameters, including detectors with lead layers, using the FLUKA code. Based on the simulation results, we select eleven detectors as the basic structure for an Extended Range Neutron Multisphere Spectrometer (ERNMS).

  6. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although three general-purpose Monte Carlo (MC) simulation tools, Geant4, FLUKA and PHITS, have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physics models, the preset value of the ionization potential, and the definition of the maximum step size. Artifact-free MC simulation therefore requires an optimized parameter list for each simulation system. Several authors have already proposed such lists, but those studies were performed with simple systems, such as a water phantom alone. Since particle beams undergo transport, nuclear interaction and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine the optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with the broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA results, and the optimal parameters were then determined. The PDD profile and the proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with a simple system. This led to the conclusion that the physics models, particle transport mechanics and geometry-based descriptions need accurate customization in planning computational experiments for artifact-free MC simulation.

  7. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    Science.gov (United States)

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field, as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically relevant cases will be presented, both in terms of absorbed dose and biological dose calculations, describing the various available features.
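
    The CT-to-voxel-phantom translation mentioned above rests on mapping Hounsfield units (HU) to materials and densities. A hedged sketch of one common piecewise-linear HU-to-density interpolation; the calibration points below are illustrative, not Flair's actual tables (real conversions are scanner-specific):

```python
import bisect

# Illustrative (HU, g/cm^3) calibration pairs -- assumed, not from Flair.
CALIB = [(-1000, 0.001), (0, 1.000), (1000, 1.60), (3000, 2.80)]

def hu_to_density(hu):
    """Linearly interpolate mass density from a CT Hounsfield value,
    clamping to the ends of the calibration table."""
    hus = [h for h, _ in CALIB]
    hu = max(min(hu, hus[-1]), hus[0])          # clamp to table range
    i = max(bisect.bisect_right(hus, hu) - 1, 0)
    if i == len(CALIB) - 1:
        return CALIB[-1][1]
    (h0, d0), (h1, d1) = CALIB[i], CALIB[i + 1]
    return d0 + (d1 - d0) * (hu - h0) / (h1 - h0)

print(hu_to_density(-1000))  # air-like voxel
print(hu_to_density(0))      # water
```

A second lookup of the same kind typically assigns a material composition per HU interval; the density map alone is shown here for brevity.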

  8. The FLUKA code: An accurate simulation tool for particle therapy

    CERN Document Server

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  9. FLUKA simulations of neutron transport in the Dresden Felsenkeller

    Energy Technology Data Exchange (ETDEWEB)

    Grieger, Marcel [Helmholtz-Zentrum Dresden-Rossendorf (HZDR), Dresden (Germany); Technische Universitaet Dresden (Germany); Bemmerer, Daniel; Mueller, Stefan E.; Szuecs, Tamas [Helmholtz-Zentrum Dresden-Rossendorf (HZDR), Dresden (Germany); Zuber, Kai [Technische Universitaet Dresden (Germany)

    2015-07-01

    A new underground ion accelerator with 5 MV acceleration potential is currently being prepared for installation in the Dresden Felsenkeller. The Felsenkeller site consists of altogether nine mutually connected tunnels. It is shielded from cosmic radiation by a 45 m thick rock overburden, enabling uniquely sensitive experiments. In order to exclude any possible effect by the new accelerator in tunnel VIII on the existing low-background γ-counting facility in tunnel IV, Monte Carlo simulations of neutron transport are being performed. A realistic neutron source field is developed, and the resulting additional neutron flux at the γ-counting facility is modeled by FLUKA simulations.

  10. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to offer significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters in the MC simulation codes GATE, PHITS and FLUKA that was established for the uniform scanning proton beam needs to be re-evaluated; that is, the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was therefore to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently transferred identically to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm{sup 3}, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm{sup 3} voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and optimal parameters are determined from the accuracy of the proton range, suppressed dose deviation, and minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation

  11. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizing parameters and the percentage depth dose (PDD) have not been reported for the GATE and PHITS codes; here they are studied for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated with the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physical and transport model. The dependence of the PDDs on the customizing parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE and PHITS, respectively. Conclusion: We evaluated the dependence of the PDDs obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizing parameters by using the whole computational model of the treatment nozzle. The optimal parameters for the simulation were then defined by referring to the calculation results. The physical model, particle transport mechanics and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
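
    The R90 proton ranges quoted above are read off the distal falloff of a PDD curve. A small sketch of how such a range can be interpolated from sampled depth-dose data (the toy PDD below is invented, not commissioning data):

```python
def distal_range(depths, doses, frac=0.90):
    """Depth (same units as `depths`) at which the dose falls to `frac` of
    the maximum on the distal side of the Bragg peak, by linear interpolation."""
    d_max = max(doses)
    i_peak = doses.index(d_max)
    target = frac * d_max
    for i in range(i_peak, len(doses) - 1):
        if doses[i] >= target >= doses[i + 1]:
            t = (doses[i] - target) / (doses[i] - doses[i + 1])
            return depths[i] + t * (depths[i + 1] - depths[i])
    return None  # falloff not captured by the sampled depths

# Toy PDD (depth in mm, arbitrary dose units) -- illustrative only.
depths = [250, 260, 265, 268, 270, 272, 275]
doses  = [60,  80,  95,  100, 70,  30,  5]
print(distal_range(depths, doses))  # R90 lies between 268 and 270 mm
```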

  12. FLUKA simulations for the optimization of the Beam Loss Monitors

    CERN Document Server

    Brugger, M; Ferrari, A; Magistris, M; Santana-Leitner, M; Vlachoudis, V; CERN. Geneva. AB Department

    2006-01-01

    The collimation system in the beam cleaning insertion IR7 of the Large Hadron Collider (LHC) is expected to clean the primary halo and the secondary radiation of a beam with unprecedented energy and intensity. Accidental beam losses can therefore entail severe consequences for the hardware of the machine. Thus, protection mechanisms, e.g. beam abort, must be instantaneously triggered by a set of Beam Loss Monitors (BLMs). The readings in the BLMs couple the losses from various collimators, thus rendering the identification of any faulty unit rather complex. In the present study the detailed geometry of IR7 is upgraded with the insertion of the BLMs, and the FLUKA Monte Carlo transport code is used to estimate the individual contribution of every collimator to the showers detected in each BLM.
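
    Since each monitor reading couples losses from several collimators, disentangling them is a linear inverse problem: with simulated per-collimator signatures, the loss pattern can be fitted to the readings. A toy sketch with an assumed signature matrix (the real columns would come from the FLUKA shower simulations):

```python
import numpy as np

# Hypothetical signature matrix S[i][k]: signal in BLM i per unit loss on
# collimator k. Four monitors, three collimators -- values are invented.
S = np.array([
    [1.00, 0.30, 0.10],
    [0.40, 1.00, 0.20],
    [0.10, 0.50, 1.00],
    [0.05, 0.20, 0.60],
])
losses_true = np.array([3.0, 1.0, 0.2])   # assumed losses per collimator
blm_readings = S @ losses_true            # what the monitors would record

# Disentangle the coupled readings: least-squares fit of the loss pattern.
losses_est, *_ = np.linalg.lstsq(S, blm_readings, rcond=None)
print(np.round(losses_est, 6))
```

With noisy readings the fit is no longer exact, and a non-negativity constraint on the losses would be the natural refinement.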

  13. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    Science.gov (United States)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground-based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for the last 40+ years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground-based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  14. Simulation of Heavy-Ion Beam Losses with the SixTrack-FLUKA Active Coupling

    CERN Document Server

    Hermes, Pascal; Cerutti, Francesco; Ferrari, Alfredo; Jowett, John; Lechner, Anton; Mereghetti, Alessio; Mirarchi, Daniele; Ortega, Pablo; Redaelli, Stefano; Salvachua, Belen; Skordis, Eleftherios; Valentino, Gianluca; Vlachoudis, Vasilis

    2016-01-01

    The LHC heavy-ion program aims to further increase the stored ion beam energy, putting high demands on the LHC collimation system. Accurate simulations of the ion collimation efficiency are crucial to validate the feasibility of new proposed configurations and beam parameters. In this paper we present a generalized framework of the SixTrack-FLUKA coupling to simulate the fragmentation of heavy-ions in the collimators and their motion in the LHC lattice. We compare heavy-ion loss maps simulated on the basis of this framework with the loss distributions measured during heavy-ion operation in 2011 and 2015.

  15. FLUKA Monte Carlo Simulations about Cosmic Rays Interactions with Kaidun Meteorite

    Directory of Open Access Journals (Sweden)

    Turgay Korkut

    2013-01-01

    A meteorite called Kaidun fell on December 3, 1980, in Yemen (15° 0′N, 48° 18′E). Investigations on this large-sized meteorite are ongoing today. In this paper, interactions between cosmic rays and the Earth's atmosphere and between cosmic rays and the Kaidun meteorite were modeled using the cosmic-ray generator of the FLUKA Monte Carlo code. Isotope distributions and produced particles were given after these interactions. Also, simulation results were compared for these two types of interactions.

  16. FLUKA simulations and measurements for a dump for a 250 GeV/c hadron beam

    CERN Document Server

    Agosteo, S; Para, A; Silari, Marco; Ulrici, L

    2001-01-01

    FLUKA is a Monte Carlo code transporting hadron and lepton cascades from several TeV down to a few keV (thermal energies for neutrons). The code is widely employed in various applications, such as particle detector design, shielding, radiation therapy and high energy physics experiments. The FLUKA results were compared with experimental data of dose equivalent and spectral fluence of neutrons produced by a 250 GeV/c proton/pion beam impinging on a beam dump installed in one of the secondary beam lines of the CERN super proton synchrotron (SPS). The dump is a shielding structure made of iron/concrete, designed to absorb completely the high-energy beam. The actual geometry of the dump was modeled in the simulations and the scoring of neutron track length was performed at various locations around it. Importance sampling and Russian Roulette were used as variance reduction techniques. The simulation results were compared with experimental measurements performed with a Bonner sphere system for neutron spectrometry...

  17. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    Science.gov (United States)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which the Monte Carlos are particularly suited is the study of secondary radiation produced as albedoes in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy ion interactions below 3 GeV/A.
The ROOT interface is being developed in conjunction with the

  18. FLUKA Simulation of Particle Fluences to ALICE due to LHC Injection Kicker Failures

    CERN Document Server

    Shetty, N V; Di Mauro, A; Lechner, A; Leogrande, E; Uythoven, J

    2014-01-01

    The counter-rotating beams of the LHC are injected in insertion regions which also accommodate the ALICE and LHCb experiments. An assembly of beam absorbers ensures the protection of machine elements in case of injection kicker failures, which can affect either the injected or the stored beam. In the first years of LHC operation, secondary particle showers due to beam impact on the injection beam stopper caused damage to the MOS injectors of the ALICE silicon drift detector as well as high-voltage trips in other ALICE subdetectors. In this study, we present FLUKA [1,2] simulations of particle fluences to the ALICE cavern for injection failures encountered during operation. Two different cases are reported, one where the miskicked beam is fully intercepted and one where the beam grazes the beam stopper.

  19. Assessment of the production of medical isotopes using the Monte Carlo code FLUKA: Simulations against experimental measurements

    Energy Technology Data Exchange (ETDEWEB)

    Infantino, Angelo, E-mail: angelo.infantino@unibo.it [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Oehlke, Elisabeth [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada); Department of Radiation Science & Technology, Delft University of Technology, Postbus 5, 2600 AA Delft (Netherlands); Mostacci, Domiziano [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Schaffer, Paul; Trinczek, Michael; Hoehr, Cornelia [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada)

    2016-01-01

    The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, {sup 18}F, {sup 13}N, {sup 94}Tc, {sup 44}Sc, {sup 68}Ga, {sup 86}Y, {sup 89}Zr, {sup 52}Mn, {sup 61}Cu and {sup 55}Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of {sup 55}Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.
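
    The yields being compared above follow the usual end-of-bombardment activity relation A = P(1 − e^(−λt_irr)), where P is the in-target production rate. A minimal sketch with illustrative numbers (the production rate below is assumed, not a TR13 value):

```python
import math

def eob_activity(rate_per_s, half_life_s, t_irr_s):
    """End-of-bombardment activity (Bq) for a constant production rate P:
    A = P * (1 - exp(-lambda * t_irr)), saturating at A = P."""
    lam = math.log(2.0) / half_life_s
    return rate_per_s * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative: F-18 (half-life 109.8 min), hypothetical production rate,
# 30 min irradiation.
P = 1.0e9   # atoms/s produced in the target (assumed)
A = eob_activity(P, half_life_s=109.8 * 60, t_irr_s=30 * 60)
print(f"A(EOB) = {A:.3e} Bq")
```

Simulated-to-measured comparisons like those in the record reduce to ratios of such activities (or of the underlying production rates).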

  20. FLUKA and PENELOPE simulations of 10 keV to 10 MeV photons in LYSO and soft tissue

    Science.gov (United States)

    Chin, M. P. W.; Böhlen, T. T.; Fassò, A.; Ferrari, A.; Ortega, P. G.; Sala, P. R.

    2014-02-01

    Monte Carlo simulations of electromagnetic particle interactions and transport by FLUKA and PENELOPE were compared. Incident photon beams of 10 keV to 10 MeV impinged on a LYSO crystal and a soft-tissue phantom. Central-axis as well as off-axis depth doses agreed within 1 s.d.; no systematic under- or over-estimate of the pulse height spectra was observed from 100 keV to 10 MeV for both materials, agreement was within 5%. Simulation of photon and electron transport and interactions at this level of precision and reliability is of significant impact, for instance, on treatment monitoring of hadrontherapy where a code like FLUKA is needed to simulate the full suite of particles and interactions (not just electromagnetic). At the interaction-by-interaction level, apart from known differences in condensed history techniques, two-quanta positron annihilation at rest was found to differ between the two codes. PENELOPE produced a 511 keV sharp line, whereas FLUKA produced visible acolinearity, a feature recently implemented to account for the momentum of shell electrons.

  21. FLUKA and PENELOPE simulations of 10 keV to 10 MeV photons in LYSO and soft tissue

    CERN Document Server

    Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Sala, P R

    2014-01-01

    Monte Carlo simulations of electromagnetic particle interactions and transport by FLUKA and PENELOPE were compared. Incident photon beams of 10 keV to 10 MeV impinged on a LYSO crystal and a soft-tissue phantom. Central-axis as well as off-axis depth doses agreed within 1 s.d.; no systematic under- or overestimate of the pulse height spectra was observed from 100 keV to 10 MeV for both materials, agreement was within 5%. Simulation of photon and electron transport and interactions at this level of precision and reliability is of significant impact, for instance, on treatment monitoring of hadrontherapy where a code like FLUKA is needed to simulate the full suite of particles and interactions (not just electromagnetic). At the interaction-by-interaction level, apart from known differences in condensed history techniques, two-quanta positron annihilation at rest was found to differ between the two codes. PENELOPE produced a 511 keV sharp line, whereas FLUKA produced visible acolinearity, a feature recently implemen...

  22. FLUKA simulations of the response of tissue-equivalent proportional counters to ion beams for applications in hadron therapy and space.

    Science.gov (United States)

    Böhlen, T T; Dosanjh, M; Ferrari, A; Gudowska, I; Mairani, A

    2011-10-21

    For both cancer therapy with protons and ions (hadron therapy) and space radiation environments, the spatial energy deposition patterns of the radiation fields are of importance for quantifying the resulting radiation damage in biological structures. Tissue-equivalent proportional counters (TEPC) are the principal instruments for measuring imparted energy on a microscopic scale and for characterizing energy deposition patterns of radiation. Moreover, the distribution of imparted energy can serve as a complementary quantity to particle fluences of the primary beam and secondary fragments for characterizing a radiation field on a physical basis for radiobiological models. In this work, the Monte Carlo particle transport code FLUKA is used for simulating energy depositions in TEPC by ion beams. The capability of FLUKA in predicting imparted energy and derived quantities, such as lineal energy, for microscopic volumes is evaluated by comparing it with a large set of TEPC measurements for different ion beams with atomic numbers ranging from 1 to 26 and energies from 80 up to 1000 MeV/n. The influence of different physics configurations in the simulation is also discussed. It is demonstrated that FLUKA can simulate energy deposition patterns of ions in TEPC cavities accurately and that it provides an adequate description of the main features of the spectra.
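
    The lineal energy evaluated in this record is the energy imparted divided by the mean chord length of the sensitive site; for a spherical TEPC cavity, Cauchy's formula gives l̄ = 4V/S = 2d/3. A one-function sketch (the 2 μm site diameter below is illustrative):

```python
def lineal_energy_keV_per_um(energy_imparted_keV, site_diameter_um):
    """Lineal energy y = epsilon / l_bar, with the mean chord length of a
    spherical site from Cauchy's formula: l_bar = 4V/S = 2d/3."""
    mean_chord_um = 2.0 * site_diameter_um / 3.0
    return energy_imparted_keV / mean_chord_um

# Illustrative: 1 keV imparted in a 2 um simulated tissue sphere.
print(lineal_energy_keV_per_um(1.0, 2.0))  # -> 0.75 keV/um
```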

  3. FLUKA Simulations of DPA in 6H-SiC Reactor Blanket Material Induced by Different Radiation Fields Frequently Mentioned in Literature

    Science.gov (United States)

    Korkut, Turgay; Korkut, Hatun

    2013-02-01

    Silicon carbide (SiC) is used extensively for the production of high-tech semiconductor devices. Today, the use of this material in radiation environments such as fusion reactors is generating excitement in the nuclear industry. The specific radiation types and energies to which semiconductors are frequently exposed are of great value for high-tech device studies. In this paper, we used the FLUKA simulation code to investigate radiation-induced effects in 6H-SiC for protons, neutrons, photons and electrons of different energies. Based on the simulation results, we analyzed displacement-per-atom values across a very broad range of radiation types and energies.
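    Displacement-per-atom values of the kind reported above are conventionally obtained from the Norgett-Robinson-Torrens (NRT) model; the record does not spell out the exact procedure used, so the following is a generic NRT sketch with an assumed placeholder displacement threshold energy, not a 6H-SiC reference value:

    ```python
    def nrt_displacements(damage_energy_eV, threshold_eV=25.0):
        """Norgett-Robinson-Torrens (NRT) estimate of the Frenkel pairs
        produced by a recoil whose energy goes into elastic (non-ionizing)
        collisions.  threshold_eV is the displacement threshold E_d; the
        default here is a generic placeholder, not a 6H-SiC reference value."""
        T, E_d = damage_energy_eV, threshold_eV
        if T < E_d:
            return 0.0                # below threshold: no stable displacement
        if T < 2.0 * E_d / 0.8:
            return 1.0                # single-displacement regime
        return 0.8 * T / (2.0 * E_d)  # linear cascade regime

    def dpa(total_displacements, atoms_in_volume):
        """Displacements per atom in the scored volume."""
        return total_displacements / atoms_in_volume
    ```

    Summing `nrt_displacements` over all recoils scored in a region and normalizing with `dpa` gives the quantity the paper reports.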

  4. Use experience of FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    In order to carry out shield design calculations for the Large Hadron Collider (LHC), currently being planned at CERN, the CERN radiation group uses FLUKA (a Monte Carlo high-energy radiation transport code). An outline of FLUKA is given here, together with experience of using it for the LHC-B detector shield design calculation within the LHC plan. FLUKA can be regarded as the world's leading high-energy radiation transport code in terms of its physics models, its Monte Carlo calculation techniques and its ease of use. The Japan Atomic Energy Research Institute (JAERI) has obtained the right to use FLUKA for target neutronics and facility shielding design at the neutron science research center, and it appears to be an effective design tool for these future designs. However, because FLUKA is only partially open and the code cannot be independently verified, establishing the validity of designs based on it may pose a significant problem. (K.G.)

  5. FLUKA shielding calculations for the FAIR project; FLUKA-Abschirmrechnungen fuer das FAIR-Projekt

    Energy Technology Data Exchange (ETDEWEB)

    Fehrenbacher, Georg [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Kozlova, Ekaterina; Radon, Torsten [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Abt. Beschleuniger-Strahlenschutz, Darmstadt (Germany); Sokolov, Alexey [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Abt. Strahlenschutz und Sicherheit, Darmstadt (Germany)

    2015-07-01

    FAIR is an international accelerator project under construction at the GSI Helmholtz center for heavy ion research in Darmstadt. The Monte Carlo program FLUKA is used to study radiation protection problems. The contribution deals with the general application possibilities of FLUKA and, for FAIR, with radiation protection planning. The necessity of simulating radiation transport through shielding several meters thick, and of determining the equivalent doses outside the shielding with sufficient accuracy, is demonstrated using two examples employing variance reduction. Results of simulation calculations for activation estimates in accelerator facilities are presented.

  6. Update on the Status of the FLUKA Monte Carlo Transport Code

    Science.gov (United States)

    Pinsky, L.; Anderson, V.; Empl, A.; Lee, K.; Smirnov, G.; Zapp, N.; Ferrari, A.; Tsoulou, K.; Roesler, S.; Vlachoudis, V.; Battistoni, G.; Cerutti, F.; Gadioli, M. V.; Garzelli, M.; Muraro, S.; Rancati, T.; Sala, P.; Ballarini, R.; Ottolenghi, A.; Parini, V.; Scannicchio, D.; Pelliccioni, M.; Wilson, T. L.

    2004-01-01

    The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is continually updated and improved by its authors. Here we review the progress achieved over the last year on the physics models. From the point of view of hadronic physics, most of the effort is still in the field of nucleus-nucleus interactions. The currently available version of FLUKA already includes the internal capability to simulate inelastic nuclear interactions from lab kinetic energies of 100 MeV/A up to the highest accessible energies, by means of the DPMJET-II.5 event generator for interactions above 5 GeV/A and rQMD for energies below that. The new developments concern, at high energy, the embedding of the DPMJET-III generator, which represents a major change with respect to the DPMJET-II structure. This will also make it possible to achieve better consistency between the nucleus-nucleus sector and the original FLUKA model for hadron-nucleus collisions. Work is also in progress to implement a third event generator model based on the Master Boltzmann Equation approach, in order to extend the energy range from 100 MeV/A down to the threshold for these reactions. In addition to these extended physics capabilities, the program's input and scoring capabilities are continually being upgraded. In particular we want to mention the upgrades in the geometry packages, now capable of reaching higher levels of abstraction. Work is also proceeding to provide direct import of the FLUKA output files into ROOT for analysis and to deploy a user-friendly GUI input interface.

  7. CERN Technical Training 2008: Learning for the LHC! FLUKA Workshop 2008: 23-27 June 2008

    CERN Multimedia

    2008-01-01

    http://www.cern.ch/Fluka2008 FLUKA is a fully integrated particle physics Monte-Carlo simulation package. It has many applications in high energy experimental physics and engineering, shielding, detector and telescope design, cosmic ray studies, dosimetry, medical physics and radio-biology. More information, as well as related publications, can be found on the official FLUKA website (http://www.fluka.org). This year, the CERN FLUKA Team, in collaboration with INFN and SC/RP, is organizing a FLUKA beginners course, held at CERN for the first time. Previous one-week courses were given in Frascati (Italy), twice in Houston (Texas, US), Pavia (Italy), as well as in Legnaro (Italy). At CERN, continuous lectures are provided in the framework of locally scheduled ‘FLUKA User Meetings’ (http://www.cern.ch/info-fluka-discussion). This new dedicated one-week CERN training course will be an opportunity for new users to learn the basics of FLUKA, as well as offering the possibility to broaden their knowledge about t...

  9. Beam Loss Studies for the CERN PS Booster Using FLUKA

    CERN Document Server

    Damjanovic, S; Mikulec, B; Sapinski, M

    2013-01-01

    In view of future upgrade plans, the beam loss monitor (BLM) coverage of the four PS Booster (PSB) rings was reviewed. FLUKA studies at Linac4 injection and PSB extraction energies were performed to simulate the loss patterns. The results of these studies, presented in this paper, have led to the proposal to double the number of beam loss monitors in the PSB.

  10. The Hadronic Models for Cosmic Ray Physics: the FLUKA Code Solutions

    Energy Technology Data Exchange (ETDEWEB)

    Battistoni, G.; Garzelli, M.V.; Gadioli, E.; Muraro, S.; Sala, P.R.; Fasso, A.; Ferrari, A.; Roesler, S.; Cerutti, F.; Ranft, J.; Pinsky, L.S.; Empl, A.; Pelliccioni, M.; Villari, R.; /INFN, Milan /Milan U. /SLAC /CERN /Siegen U. /Houston U. /Frascati /ENEA, Frascati

    2007-01-31

    FLUKA is a general purpose Monte Carlo transport and interaction code used for fundamental physics and for a wide range of applications. These include Cosmic Ray Physics (muons, neutrinos, EAS, underground physics), both for basic research and applied studies in space and atmospheric flight dosimetry and radiation damage. A review of the hadronic models available in FLUKA and relevant for the description of cosmic ray air showers is presented in this paper. Recent updates concerning these models are discussed. The FLUKA capabilities in the simulation of the formation and propagation of EM and hadronic showers in the Earth's atmosphere are shown.

  11. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J [St. Jude Children’s Research Hospital, Memphis, TN (United States); Stewart, R [University of Washington, Seattle, WA. (United States)

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3, helium-4 ions and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent {sup 60}Co γ-ray dose for representative proton beams incident on cells in aerobic and anoxic environments. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases laterally away from the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
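    The equivalent-dose bookkeeping described above (dose-weighted RBE coefficients folded into the transport) reduces to a weighted sum over dose contributions. A minimal sketch, with illustrative RBE values standing in for the MCDS-derived tables used in the study:

    ```python
    def rbe_weighted_dose(events):
        """Co-60 equivalent dose for DSB induction: sum of physical dose
        contributions d_i weighted by particle- and energy-dependent RBE_i.
        `events` is an iterable of (dose_Gy, rbe) pairs; in the study the
        RBE values come from MCDS tables, here they are user-supplied."""
        return sum(d * rbe for d, rbe in events)

    def dose_weighted_rbe(events):
        """Dose-averaged RBE of a mixed field: sum(RBE_i*d_i) / sum(d_i)."""
        return rbe_weighted_dose(events) / sum(d for d, _ in events)

    # Illustrative mixed field near the Bragg peak: a proton component plus
    # a small high-RBE heavy-fragment component (numbers are made up).
    field = [(1.8, 1.15), (0.2, 1.8)]
    ```

    Scoring these sums per voxel instead of globally yields the spatial RBE maps the paper discusses.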

  12. FLUKA: A Multi-Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  13. Minimizing the background radiation in the new neutron time-of-flight facility at CERN FLUKA Monte Carlo simulations for the optimization of the n_TOF second experimental line

    CERN Document Server

    Bergström, Ida; Elfgren, Erik

    2013-06-11

    At the particle physics laboratory CERN in Geneva, Switzerland, the Neutron Time-of-Flight facility has recently started the construction of a second experimental line. The new neutron beam line will unavoidably induce radiation in both the experimental area and in nearby accessible areas. Computer simulations for the minimization of the background were carried out using the FLUKA Monte Carlo simulation package. The background radiation in the new experimental area needs to be kept to a minimum during measurements. This was studied with focus on the contributions from backscattering in the beam dump. The beam dump was originally designed for shielding the outside area using a block of iron covered in concrete. However, the backscattering was never studied in detail. In this thesis, the fluences (i.e. the flux integrated over time) of neutrons and photons were studied in the experimental area while the beam dump design was modified. An optimized design was obtained by stopping the fast neutrons in a high Z mat...

  14. Determination and Fabrication of New Shield Super Alloys Materials for Nuclear Reactor Safety by Experiments and Cern-Fluka Monte Carlo Simulation Code, Geant4 and WinXCom

    Science.gov (United States)

    Aygun, Bünyamin; Korkut, Turgay; Karabulut, Abdulhalik

    2016-05-01

    Despite the possibility of fossil fuel depletion, energy needs keep increasing, and the use of radiation tends to increase with them. Recently, the security-focused debate about planned nuclear power plants has continued. The objective of this thesis is to prevent radiation from spreading from nuclear reactors into the environment. To this end, we produced new, higher-performance shielding materials that retain radiation effectively during reactor operation. The additives used in the new shielding materials include iron (Fe), rhenium (Re), nickel (Ni), chromium (Cr), boron (B), copper (Cu), tungsten (W), tantalum (Ta) and boron carbide (B4C). The results of these experiments indicated that the materials are good shields against gammas and neutrons. The powder metallurgy technique was used to produce the new shielding materials. The CERN FLUKA Monte Carlo simulation code, Geant4 and WinXCom were used to determine the component percentages of the high-temperature-resistant materials for shielding high-level fast neutrons and gammas. Superalloys were produced, and then experimental fast-neutron dose-equivalent measurements and gamma-radiation absorption measurements of the new shielding materials were carried out. The resulting products can be used safely not only in reactors but also in nuclear medicine treatment rooms, for the storage of nuclear waste, in nuclear research laboratories, and against cosmic radiation in space vehicles.

  15. Helium ions at the heidelberg ion beam therapy center: comparisons between FLUKA Monte Carlo code predictions and dosimetric measurements

    Science.gov (United States)

    Tessonnier, T.; Mairani, A.; Brons, S.; Sala, P.; Cerutti, F.; Ferrari, A.; Haberer, T.; Debus, J.; Parodi, K.

    2017-08-01

    In the field of particle therapy helium ion beams could offer an alternative for radiotherapy treatments, owing to their interesting physical and biological properties intermediate between protons and carbon ions. We present in this work the comparisons and validations of the Monte Carlo FLUKA code against in-depth dosimetric measurements acquired at the Heidelberg Ion Beam Therapy Center (HIT). Depth dose distributions in water with and without ripple filter, lateral profiles at different depths in water and a spread-out Bragg peak were investigated. After experimentally-driven tuning of the less known initial beam characteristics in vacuum (beam lateral size and momentum spread) and simulation parameters (water ionization potential), comparisons of depth dose distributions were performed between simulations and measurements, which showed overall good agreement with range differences below 0.1 mm and dose-weighted average dose-differences below 2.3% throughout the entire energy range. Comparisons of lateral dose profiles showed differences in full-width-half-maximum lower than 0.7 mm. Measurements of the spread-out Bragg peak indicated differences with simulations below 1% in the high dose regions and 3% in all other regions, with a range difference less than 0.5 mm. Despite the promising results, some discrepancies between simulations and measurements were observed, particularly at high energies. These differences were attributed to an underestimation of dose contributions from secondary particles at large angles, as seen in a triple Gaussian parametrization of the lateral profiles along the depth. However, the results allowed us to validate FLUKA simulations against measurements, confirming its suitability for 4He ion beam modeling in preparation of clinical establishment at HIT. Future activities building on this work will include treatment plan comparisons using validated biological models between proton and helium ions, either within a Monte Carlo

  16. Experiments and FLUKA simulations of $^{12}C$ and $^{16}O$ beams for therapy monitoring by means of in-beam Positron Emission Tomography

    CERN Document Server

    Sommerer; Ferrari, A

    2007-01-01

    Since 1997, more than 350 patients have been treated at the experimental C-12 ion therapy facility at the Gesellschaft fuer Schwerionenforschung (GSI), Darmstadt, Germany. The therapy is monitored with a dedicated positron emission tomograph, fully integrated into the treatment site. The measured beta+ activity arises from inelastic nuclear interactions between the beam particles and the nuclei of the patient's tissue. Because the monitoring is done during the irradiation, the method is called in-beam PET. The underlying principle of this monitoring is a comparison between the measured activity and a simulated one. The simulations are presently done with the PETSIM code, which is dedicated to C-12 beams. In future ion therapy centers like the Heidelberger Ionenstrahl Therapiezentrum (HIT), Heidelberg, Germany, besides C-12 also proton, $^3$He and O-16 beams will be used for treatment, and the therapy will be monitored by means of in-beam PET. Because PETSIM is not easily extendable to other ions, a code capable ...

  17. FLUKA Calculation of the Neutron Albedo Encountered at Low Earth Orbits

    CERN Document Server

    Claret, Arnaud; Combier, Natacha; Ferrari, Alfredo; Laurent, Philippe

    2014-01-01

    This paper presents Monte-Carlo simulations based on the Fluka code aiming to calculate the contribution of the neutron albedo at a given date and altitude above the Earth chosen by the user. The main input parameters of our model are the solar modulation affecting the spectra of cosmic rays, and the date of the Earth's geomagnetic field. The results consist of a two-parameter distribution, the neutron energy and the angle to the tangent plane of the sphere containing the orbit of interest, and are provided by geographical position above the Earth at the chosen altitude. This model can be used to predict the temporal variation of the neutron flux encountered along the orbit, and thus constrain the determination of the instrumental background noise of space experiments in low Earth orbit.

  18. FLUKA studies of hadron-irradiated scintillating crystals for calorimetry at the High-Luminosity LHC

    Science.gov (United States)

    Quittnat, Milena; CMS Collaboration

    2015-02-01

    Calorimetry at the High-Luminosity LHC (HL-LHC) will be performed in a harsh radiation environment with high hadron fluences. The upgraded CMS electromagnetic calorimeter design and suitable scintillating materials are a focus of current research. In this paper, first results using the Monte Carlo simulation program FLUKA are compared to measurements performed with proton-irradiated LYSO, YSO and cerium fluoride crystals. Based on these results, an extrapolation to the behavior of an electromagnetic sampling calorimeter, using one of the inorganic scintillators above as an active medium, is performed for the upgraded CMS experiment at the HL-LHC. Characteristic parameters such as the induced ambient dose, fluence spectra for different particle types and the residual nuclei are studied, and the suitability of these materials for a future calorimeter is surveyed. Particular attention is given to the creation of isotopes in an LYSO-tungsten calorimeter that might contribute a prohibitive background to the measured signal.

  20. The FLUKA Model of IR8

    CERN Document Server

    Appleby, R B

    2010-01-01

    The study of machine induced background (MIB), the radiation environment and beam dynamics of the LHC requires a detailed model of the machine tunnel, elements and electromagnetic fields. In this note, a specially created model of IR8 in FLUKA is described, including the tunnel, vacuum chambers, magnets, collimators, injection elements and shielding. The inclusion of all relevant machine elements in the LSS of IR8 results in a very flexible model suitable for a large variety of calculations and studies. The validation of the model is discussed, and some example applications described.

  1. ALIFE: A Geometry Editor and Parser for FLUKA

    CERN Document Server

    CERN. Geneva

    1998-01-01

    ALIFE is an editor and parser for the FLUKA geometry and detector definitions written in the Tcl/Tk programming language. Its syntax simplifies the preparation and maintenance of FLUKA input cards for geometry and materials as well as tracking and scoring options. It supports a modular structure for the geometry definition providing a very efficient way for several users to work on the same geometry.

  2. The Application of FLUKA to Dosimetry and Radiation Therapy

    Science.gov (United States)

    Wilson, Thomas L.; Andersen, Victor; Pinsky, Lawrence; Ferrari, Alfredo; Battistoni, Giuseppe

    2005-01-01

    Monte Carlo transport codes like FLUKA are useful for many purposes, and one of those is the simulation of the effects of radiation traversing the human body. In particular, radiation has been used in cancer therapy for a long time, and recently this has been extended to include heavy ion particle beams. The advent of this particular type of therapy has led to the need for increased capabilities in the transport codes used to simulate the detailed nature of the treatment doses to the various tissues that are encountered. This capability is also of interest to NASA because of the nature of the radiation environment in space [1]. While in space, crew members' bodies are continually being traversed by virtually all forms of radiation. In assessing the risk that this exposure causes, heavy ions are of primary importance. These arise both from the primary external space radiation itself, as well as from fragments that result from interactions during the traversal of that radiation through any intervening material, including body tissue itself. Thus the capability to characterize accurately the details of the radiation field within a human body subjected to such external "beams" is of critical importance.

  3. Inter-comparison of MARS and FLUKA: Predictions on Energy Deposition in LHC IR Quadrupoles

    CERN Document Server

    Hoa, C; Cerutti, F; Ferrari, A

    2008-01-01

    Detailed modelling of the LHC insertion regions (IR) has earlier been performed to evaluate energy deposition in the IR superconducting magnets [1-4]. Proton-proton collisions at 14 TeV in the centre of mass lead to debris depositing energy in the IR components. To evaluate uncertainties in those simulations and gain further confidence in the tools and approaches used, inter-comparison calculations have been performed with the latest versions of the FLUKA (2006.3b) [5, 6] and MARS15 [7, 8] Monte Carlo codes. These two codes, used worldwide for multi-particle interaction and transport in accelerator, detector and shielding components, have been thoroughly benchmarked by the code authors and the user community (see, for example, the recent [9, 10]). In the study described below, better than 5% agreement was obtained for energy deposition calculated with these two codes - based on different independent physics models - for identical geometry and initial conditions of a simple model representing the IR5 and ...

  4. Fluka and thermo-mechanical studies for the CLIC main dump

    CERN Document Server

    Mereghetti, Alessio; Vlachoudis, Vasilis

    2011-01-01

    In order to best cope with the challenge of absorbing the multi-MW beam, a water beam dump at the end of the CLIC post-collision line has been proposed. The design of the dump for the Conceptual Design Report (CDR) was checked with a set of FLUKA Monte Carlo simulations estimating the peak and total power absorbed by the water and the vessel. Fluence spectra of escaping particles and activation rates of radio-nuclides were computed as well. Finally, the thermal transient behavior of the water bath was studied and a thermo-mechanical analysis of the preliminary design of the window was performed.

  5. The FLUKA study of the secondary particles fluence in the AD-Antiproton Decelerator target area

    CERN Document Server

    Calviani, M

    2014-01-01

    In this paper we present Monte Carlo FLUKA simulations [1, 2] carried out to investigate the secondary particles fluence emerging from the antiproton production target and their spatial distribution in the AD target area. The detailed quantitative analysis has been performed for different positions along the magnet dog-leg as well as after the main collimator. These results allow tuning the position of the new beam current transformers (BCT) in the target area, in order to have a precise pulse-by-pulse evaluation of the intensity of negative particles injected in the AD-ring before the deceleration phase.

  6. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy.

    Science.gov (United States)

    Botta, F; Mairani, A; Battistoni, G; Cremonesi, M; Di Dia, A; Fassò, A; Ferrari, A; Ferrari, M; Paganelli, G; Pedroli, G; Valente, M

    2011-07-01

    The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the parameter of choice. FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and deposited energy has been tallied in concentric shells. FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been done. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The
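    The tally geometry described in this record (energy deposited in concentric shells around a point isotropic source) converts to a scaled dose point kernel by normalizing each shell's energy to the emitted energy and to the shell width expressed in units of the CSDA range. A minimal sketch under that convention (the exact normalization used in the study is not given in the abstract):

    ```python
    def scaled_dpk(shell_edges_cm, shell_energy_MeV, E0_MeV, r_csda_cm):
        """Scaled dose point kernel from spherical-shell energy tallies
        around a point isotropic source.  For each shell [r_i, r_i+1] the
        kernel is the fraction of emitted energy E0 deposited there,
        divided by the shell width in units of the CSDA range:
        j(x) = (dE/E0) / (dr/R_CSDA), reported at the shell midpoint."""
        kernel = []
        for i, dE in enumerate(shell_energy_MeV):
            r_lo, r_hi = shell_edges_cm[i], shell_edges_cm[i + 1]
            x_mid = 0.5 * (r_lo + r_hi) / r_csda_cm     # scaled radius
            dx = (r_hi - r_lo) / r_csda_cm              # scaled shell width
            kernel.append((x_mid, (dE / E0_MeV) / dx))
        return kernel
    ```

    Comparing such kernels bin-by-bin between two codes yields the percentage differences the abstract quotes.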

  8. Calculation of electron and isotopes dose point kernels with fluka Monte Carlo code for dosimetry in nuclear medicine therapy

    Energy Technology Data Exchange (ETDEWEB)

    Botta, F.; Mairani, A.; Battistoni, G.; Cremonesi, M.; Di Dia, A.; Fasso, A.; Ferrari, A.; Ferrari, M.; Paganelli, G.; Pedroli, G.; Valente, M. [Medical Physics Department, European Institute of Oncology, Via Ripamonti 435, 20141 Milan (Italy); Istituto Nazionale di Fisica Nucleare (I.N.F.N.), Via Celoria 16, 20133 Milan (Italy); Medical Physics Department, European Institute of Oncology, Via Ripamonti 435, 20141 Milan (Italy); Jefferson Lab, 12000 Jefferson Avenue, Newport News, Virginia 23606 (United States); CERN, 1211 Geneva 23 (Switzerland); Medical Physics Department, European Institute of Oncology, Milan (Italy); Nuclear Medicine Department, European Institute of Oncology, Via Ripamonti 435, 20141 Milan (Italy); Medical Physics Department, European Institute of Oncology, Via Ripamonti 435, 20141 Milan (Italy); FaMAF, Universidad Nacional de Cordoba and CONICET, Cordoba, Argentina C.P. 5000 (Argentina)

    2011-07-15

    Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data; the dose point kernel (DPK), which quantifies the energy deposition all around an isotropic point source, is often the parameter of choice. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10–3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Isotropic point sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been made. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Results: Concerning monoenergetic electrons…
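The comparison metric described above (maximum and average percentage difference between two kernels, restricted to a fraction of the CSDA range) can be sketched in a few lines. The kernels and the radial grid below are invented toy curves, not the published FLUKA or PENELOPE data:

```python
import numpy as np

# Hedged sketch: compare two scaled dose point kernels (DPKs) as described in
# the abstract -- max and mean percentage difference within fraction * R_CSDA.
# The radial grid and both kernels are invented placeholders.
def dpk_percentage_differences(r, dpk_a, dpk_b, r_csda, fraction=0.8):
    """Max and mean of 100*|a-b|/b for radii within fraction * r_csda."""
    r = np.asarray(r, dtype=float)
    mask = r <= fraction * r_csda
    diff = 100.0 * np.abs(dpk_a[mask] - dpk_b[mask]) / dpk_b[mask]
    return diff.max(), diff.mean()

r = np.linspace(0.01, 1.0, 100)          # radius in units of r_csda
dpk_fluka = np.exp(-r) * (1 + 0.05 * r)  # toy "code A" kernel
dpk_pen = np.exp(-r)                     # toy "code B" kernel
dmax, dmean = dpk_percentage_differences(r, dpk_fluka, dpk_pen, r_csda=1.0)
```

With these toy curves the relative difference grows linearly with radius, so the maximum lands at the 0.8·RCSDA cut, mirroring how the published comparison tightens or relaxes as the radial window changes.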

  9. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy

    CERN Document Server

    Mairani, A; Valente, M; Battistoni, G; Botta, F; Pedroli, G; Ferrari, A; Cremonesi, M; Di Dia, A; Ferrari, M; Fasso, A

    2011-01-01

    Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by means of calculation of a representative parameter and comparison with reference data. The dose point kernel (DPK), quantifying the energy deposition all around an isotropic point source, is often the parameter of choice. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10-3 MeV) and for beta-emitting isotopes commonly used for therapy ((89)Sr, (90)Y, (131)I, (153)Sm, (177)Lu, (186)Re, and (188)Re). Point isotropic...

  10. Update On the Status of the FLUKA Monte Carlo Transport Code*

    Science.gov (United States)

    Ferrari, A.; Lorenzo-Sentis, M.; Roesler, S.; Smirnov, G.; Sommerer, F.; Theis, C.; Vlachoudis, V.; Carboni, M.; Mostacci, A.; Pelliccioni, M.

    2006-01-01

    The FLUKA Monte Carlo transport code is a well-known simulation tool in High Energy Physics. FLUKA is a dynamic tool in the sense that it is being continually updated and improved by the authors. We review the progress achieved since the last CHEP Conference on the physics models, some technical improvements to the code and some recent applications. From the point of view of the physics, improvements have been made with the extension of PEANUT to higher energies for p, n, pi, pbar/nbar and for nbars down to the lowest energies, the addition of the online capability to evolve radioactive products and get subsequent dose rates, and the upgrading of the treatment of EM interactions with the elimination of the need to separately prepare preprocessed files. A new coherent photon scattering model, an updated treatment of the photo-electric effect, an improved pair production model, and new photon cross sections from the LLNL Cullen database have been implemented. In the field of nucleus-nucleus interactions, the electromagnetic dissociation of heavy ions has been added along with the extension of the interaction models for some nuclide pairs to energies below 100 MeV/A using the BME approach, as well as the development of an improved QMD model for intermediate energies. Both DPMJET 2.53 and 3 remain available along with rQMD 2.4 for heavy ion interactions above 100 MeV/A. Technical improvements include the ability to use parentheses in setting up the combinatorial geometry, the introduction of pre-processor directives in the input stream, a new random number generator with full 64 bit randomness, and new routines for mathematical special functions (adapted from SLATEC). Finally, work is progressing on the deployment of a user-friendly GUI input interface as well as a CAD-like geometry creation and visualization tool. On the application front, FLUKA has been used to extensively evaluate the potential space radiation effects on astronauts for future deep space missions, the activation…

  11. Application of FLUKA to simulate the beam loss monitoring system for the booster of the Shanghai synchrotron radiation facility

    Institute of Scientific and Technical Information of China (English)

    任丽; 邱睿; 曾鸣; 李君利; 邵贝贝

    2011-01-01

    The Shanghai Synchrotron Radiation Facility (SSRF) is by far the largest scientific facility in China, with wide applications in the scientific community and in industry. The beam loss monitoring (BLM) system of the SSRF booster plays an important role in protecting the normal operation of the first booster in China. Problems in the BLM system design were analyzed using the FLUKA code to simulate the showers induced in the vacuum chamber wall by lost beam electrons of different kinetic energies. The energy deposition of the secondary particles in the detector was compared, showing that the shower electrons deposit the most energy in the detector, which confirms the feasibility of the design. The consistency of the detector response for beam losses at different electron energies was also discussed.

  12. ANS Based Submarine Simulation

    Science.gov (United States)

    1994-08-01

    computer based simulation program supplied by Dr. John Ware at Computer Sciences Corporation (CSC). There are two reasons to use simulated data instead... ANS (Artificial Neural System) capable of modeling submarine performance based on full scale data generated using a computer based simulation program... The Optimized Entropy algorithm enables the solution of difficult problems on a desktop computer within an acceptable time frame. Objective for…

  13. FLUKA predictions of the absorbed dose in the HCAL Endcap scintillators using a Run1 (2012) CMS FLUKA model

    CERN Document Server

    CMS Collaboration

    2016-01-01

    Estimates of absorbed dose in the HCAL Endcap (HE) region as predicted by the FLUKA Monte Carlo code. Dose is calculated in an R-phi-Z grid overlaying the HE region, with a resolution of 1 cm in R, 1 mm in Z, and a single 360 degree bin in phi. This allows calculation of the absorbed dose within a single 4 mm thick scintillator layer without including other regions or materials. This note shows estimates of the cumulative dose in scintillator layers 1 and 7 during the 2012 run.
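The scoring scheme described in this note (an R-phi-Z grid with 1 cm bins in R, 1 mm bins in Z, and phi collapsed to a single bin) amounts to simple histogramming of energy depositions. A minimal sketch, with invented grid extents and sample hits rather than the actual CMS geometry:

```python
import numpy as np

# Illustrative sketch (not the CMS tool): accumulate energy depositions on an
# R-phi-Z grid with 1 cm bins in R, 1 mm (0.1 cm) bins in Z and one phi bin.
# Grid extents and the sample hits below are invented.
R_MAX_CM, Z_MIN_CM, Z_MAX_CM = 300.0, 350.0, 560.0
n_r = round(R_MAX_CM / 1.0)               # 1 cm resolution in R
n_z = round((Z_MAX_CM - Z_MIN_CM) / 0.1)  # 1 mm resolution in Z
grid = np.zeros((n_r, n_z))               # phi collapsed to one bin

def score(r_cm, z_cm, edep):
    """Add an energy deposition to the bin containing (r, z)."""
    i = int(r_cm // 1.0)
    j = int((z_cm - Z_MIN_CM) // 0.1)
    if 0 <= i < n_r and 0 <= j < n_z:
        grid[i, j] += edep

score(150.2, 400.05, 2.5)   # hypothetical hit, arbitrary energy units
score(150.7, 400.08, 1.5)   # lands in the same 1 cm x 1 mm bin
```

Summing such a grid over the bins spanned by one 4 mm scintillator layer is exactly the "single layer without other materials" bookkeeping the note describes.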

  14. Applications of FLUKA Monte Carlo code for nuclear and accelerator physics

    CERN Document Server

    Battistoni, Giuseppe; Brugger, Markus; Campanella, Mauro; Carboni, Massimo; Empl, Anton; Fasso, Alberto; Gadioli, Ettore; Cerutti, Francesco; Ferrari, Alfredo; Ferrari, Anna; Lantz, Matthias; Mairani, Andrea; Margiotta, M; Morone, Christina; Muraro, Silvia; Parodi, Katerina; Patera, Vincenzo; Pelliccioni, Maurizio; Pinsky, Lawrence; Ranft, Johannes; Roesler, Stefan; Rollet, Sofia; Sala, Paola R; Santana, Mario; Sarchiapone, Lucia; Sioli, Maximiliano; Smirnov, George; Sommerer, Florian; Theis, Christian; Trovati, Stefania; Villari, R; Vincke, Heinz; Vincke, Helmut; Vlachoudis, Vasilis; Vollaire, Joachim; Zapp, Neil

    2011-01-01

    FLUKA is a general purpose Monte Carlo code capable of handling all radiation components from thermal energies (for neutrons) or 1 keV (for all other particles) to cosmic ray energies and can be applied in many different fields. Presently the code is maintained on Linux. The validity of the physical models implemented in FLUKA has been benchmarked against a variety of experimental data over a wide energy range, from accelerator data to cosmic ray showers in the Earth atmosphere. FLUKA is widely used for studies related both to basic research and to applications in particle accelerators, radiation protection and dosimetry, including the specific issue of radiation damage in space missions, radiobiology (including radiotherapy) and cosmic ray calculations. After a short description of the main features that make FLUKA valuable for these topics, the present paper summarizes some of the recent applications of the FLUKA Monte Carlo code in nuclear as well as high energy physics. In particular it addresses such top...

  15. The FLUKA code for space applications Recent developments

    CERN Document Server

    Andersen, V; Battistoni, G; Campanella, M; Carboni, M; Cerutti, F; Empl, A; Fassò, A; Ferrari, A; Gadioli, E; Garzelli, M V; Lee, K; Ottolenghi, A; Pelliccioni, M; Pinsky, L S; Ranft, J; Roesler, S; Sala, P R; Wilson, T L

    2004-01-01

    The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the earth atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results h...

  16. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

    Energy Technology Data Exchange (ETDEWEB)

    Ronningen, Reginald Martin [Michigan State University; Remec, Igor [Oak Ridge National Laboratory; Heilbronn, Lawrence H. [University of Tennessee-Knoxville

    2013-06-07

    Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and the uncertainties of the simulations is essential, as this potentially impacts the safe, reliable and cost-effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the report of the 2003 RIA R&D Workshop.
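A benchmark-quality comparison of this kind is usually summarised by the bias and spread of calculated-to-experimental ratios. A minimal sketch with invented placeholder numbers, not results from any of the cited codes:

```python
import math

# Hedged sketch of a code-vs-experiment benchmark summary: the mean calc/exp
# ratio estimates the bias, its sample standard deviation the spread. All
# numbers below are invented placeholders.
def benchmark_stats(calc, exp):
    """Mean calc/exp ratio and its sample standard deviation."""
    ratios = [c / e for c, e in zip(calc, exp)]
    n = len(ratios)
    mean = sum(ratios) / n
    var = sum((x - mean) ** 2 for x in ratios) / (n - 1)
    return mean, math.sqrt(var)

calc = [1.05, 0.98, 1.10, 0.95]   # simulated yields (arbitrary units)
exp = [1.00, 1.00, 1.00, 1.00]    # matching measurements
mean_ratio, spread = benchmark_stats(calc, exp)
```

A mean ratio near 1 with small spread is what "quantitative knowledge of biases and uncertainties" boils down to for each code and observable.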

  17. Benchmark of the FLUKA model of crystal channeling against the UA9-H8 experiment

    Science.gov (United States)

    Schoofs, P.; Cerutti, F.; Ferrari, A.; Smirnov, G.

    2015-07-01

    Channeling in bent crystals is increasingly considered as an option for the collimation of high-energy particle beams. The installation of crystals in the LHC has taken place during this past year and aims at demonstrating the feasibility of crystal collimation and a possible cleaning efficiency improvement. The performance of CERN collimation insertions is evaluated with the Monte Carlo code FLUKA, which is capable of simulating energy deposition in collimators as well as beam loss monitor signals. A new model of crystal channeling was developed specifically so that similar simulations can be conducted in the case of crystal-assisted collimation. In this paper, the most recent results of this model are brought forward in the framework of a joint activity inside the UA9 collaboration to benchmark the different simulation tools available. The performance of crystal STF 45, produced at INFN Ferrara, was measured at the H8 beamline at CERN in 2010 and serves as the basis for the comparison. Distributions of deflected particles are shown to be in very good agreement with experimental data. Calculated dechanneling lengths and crystal performance in the transition region between the amorphous regime and volume reflection are also close to the measured ones.

  18. FLUKA Monte Carlo for Basic Dosimetric Studies of Dual Energy Medical Linear Accelerator

    Directory of Open Access Journals (Sweden)

    K. Abdul Haneefa

    2014-01-01

    A general purpose Monte Carlo code for the simulation of particle transport is used to study basic dosimetric parameters, namely percentage depth dose and dose profiles, which are compared with experimental measurements from a commercial dual energy medical linear accelerator. The Varian Clinac iX medical linear accelerator with dual energy photon beams (6 and 15 MV) is simulated using FLUKA. FLAIR is used to visualize and edit the geometry. Experimental measurements are taken at a 100 cm source-to-surface distance (SSD) in a 50 × 50 × 50 cm3 PTW water phantom using a 0.12 cc cylindrical ionization chamber. Percentage depth dose for standard square field sizes and dose profiles at various depths are studied in detail. The analysis was carried out using ROOT (a data analysis framework developed at CERN). Simulation results show good agreement in percentage depth dose and beam profiles with the experimental measurements for the Varian Clinac iX dual energy medical linear accelerator.
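The percentage-depth-dose analysis mentioned above reduces to normalising a depth-dose curve to its maximum so that simulation and measurement can be overlaid. A minimal sketch with a synthetic curve, not Clinac data:

```python
import numpy as np

# Hedged sketch of a percentage-depth-dose (PDD) computation: normalise the
# dose along the beam axis to its maximum and report the depth of maximum.
# The depth-dose samples below are synthetic, not measured or simulated data.
def percentage_depth_dose(depth_cm, dose):
    """Return (PDD in %, depth of dose maximum)."""
    dose = np.asarray(dose, dtype=float)
    pdd = 100.0 * dose / dose.max()
    d_max = depth_cm[int(np.argmax(dose))]
    return pdd, d_max

depths = np.arange(0.0, 10.0, 0.5)  # depth in water, cm
# toy curve: linear build-up to 1.5 cm, exponential fall-off beyond
dose = np.where(depths < 1.5, depths / 1.5, np.exp(-0.05 * (depths - 1.5)))
pdd, d_max = percentage_depth_dose(depths, dose)
```

Comparing two such normalised curves point by point is how PDD agreement between FLUKA and the ionization-chamber scan would typically be quantified.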

  19. An integral test of FLUKA nuclear models with 160 MeV proton beams in multi-layer Faraday cups

    CERN Document Server

    Rinaldi, I; Parodi, K; Ferrari, A; Sala, P; Mairani, A

    2011-01-01

    Monte Carlo (MC) codes are useful tools to simulate the complex processes of proton beam interactions with matter. In proton therapy, nuclear reactions influence the dose distribution. Therefore, the validation of nuclear models adopted in MC codes is a critical requisite for their use in this field. A simple integral test can be performed using a multi-layer Faraday cup (MLFC). This method allows separation of the nuclear and atomic interaction processes, which are responsible for secondary particle emission and the finite primary proton range, respectively. In this work, the propagation of 160 MeV protons stopping in two MLFCs made of polyethylene and copper has been simulated by the FLUKA MC code. The calculations have been performed with and without secondary electron emission and transport, as well as charge sharing in the dielectric layers. Previous results with other codes neglected those two effects. The impact of this approximation has been investigated and found to be relevant only in the proximity ...
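The separation the MLFC provides can be illustrated with the atomic-interaction part alone: primaries that stop in a layer deposit their charge there, so the charge-versus-depth profile peaks sharply at the proton range. The Gaussian range-straggling model and all numbers below are illustrative assumptions, not the 160 MeV values from the paper:

```python
import math

# Toy model of an MLFC charge profile: each proton deposits its charge in the
# layer where it stops; stopping depths follow a Gaussian around the mean
# range. Range, straggling and layer thickness are invented illustration-only
# values, not the polyethylene/copper stacks of the cited work.
def mlfc_charge_profile(n_protons, range_cm, sigma_cm, layer_cm, n_layers):
    def cdf(z):  # standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    charge = []
    for i in range(n_layers):
        lo, hi = i * layer_cm, (i + 1) * layer_cm
        frac = cdf((hi - range_cm) / sigma_cm) - cdf((lo - range_cm) / sigma_cm)
        charge.append(n_protons * frac)   # protons stopping in layer i
    return charge

profile = mlfc_charge_profile(1e6, 15.1, 0.15, 0.5, 40)
peak_layer = max(range(40), key=lambda i: profile[i])
```

Secondaries from nuclear reactions would add a low, broad pedestal upstream of this peak, which is exactly why the MLFC can disentangle the two processes.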

  20. Benchmarking nuclear models of FLUKA and GEANT4 for carbon ion therapy

    Energy Technology Data Exchange (ETDEWEB)

    Boehlen, T T; Cerutti, F; Dosanjh, M; Ferrari, A [European Organization for Nuclear Research CERN, CH-1211, Geneva 23 (Switzerland); Gudowska, I [Medical Radiation Physics, Karolinska Institutet and Stockholm University, Box 260 S-171 76 Stockholm (Sweden); Mairani, A [INFN Milan, Via Celoria 16, 20133 Milan (Italy); Quesada, J M, E-mail: Till.Tobias.Boehlen@cern.c [Departamento de Fisica Atomica, Molecular y Nuclear, Universidad de Sevilla (Spain)

    2010-10-07

    As carbon ions, at therapeutic energies, penetrate tissue, they undergo inelastic nuclear reactions and give rise to significant yields of secondary fragment fluences. Therefore, an accurate prediction of these fluences resulting from the primary carbon interactions is necessary in the patient's body in order to precisely simulate the spatial dose distribution and the resulting biological effect. In this paper, the performance of nuclear fragmentation models of the Monte Carlo transport codes, FLUKA and GEANT4, in tissue-like media and for an energy regime relevant for therapeutic carbon ions is investigated. The ability of these Monte Carlo codes to reproduce experimental data of charge-changing cross sections and integral and differential yields of secondary charged fragments is evaluated. For the fragment yields, the main focus is on the consideration of experimental approximations and uncertainties such as the energy measurement by time-of-flight. For GEANT4, the hadronic models G4BinaryLightIonReaction and G4QMD are benchmarked together with some recently enhanced de-excitation models. For non-differential quantities, discrepancies of some tens of percent are found for both codes. For differential quantities, even larger deviations are found. Implications of these findings for the therapeutic use of carbon ions are discussed.

  1. Concrete shielding of neutron radiations of plasma focus and dose examination by FLUKA

    Science.gov (United States)

    Nemati, M. J.; Amrollahi, R.; Habibi, M.

    2013-07-01

    The plasma focus (PF) is among those devices which are used in plasma investigations, but it produces dangerous radiation after each shot, creating a hazardous area for its operators; therefore, operators should stay as far as possible from the area where the plasma focus is placed. In this paper the FLUKA Monte Carlo code has been used to calculate the radiation produced by the 4 kJ Amirkabir plasma focus device through different concrete shielding concepts with various thicknesses (square, labyrinth and cave concepts). The neutron yield of the Amirkabir plasma focus at varying deuterium pressure (3-9 torr) and two charging voltages (11.5 and 13.5 kV) is (2.25 ± 0.2) × 10⁸ neutrons/shot and (2.88 ± 0.29) × 10⁸ neutrons/shot of 2.45 MeV neutrons, respectively. The most effective shield for the plasma focus device among these geometries is the labyrinth concept on four sides and the top with 20 cm of concrete.
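As a back-of-envelope companion to the quoted yields, an isotropic point source of Y neutrons per shot gives an unshielded fluence Y/(4πd²) at distance d. The yields are the ones quoted in the abstract; the 1 m distance is an arbitrary example:

```python
import math

# Back-of-envelope check, not the FLUKA result: isotropic emission of Y
# neutrons per shot gives a fluence Y / (4 pi d^2) at distance d. Yields are
# the abstract's values; the 1 m evaluation distance is an arbitrary choice.
def fluence_per_shot(yield_per_shot, distance_cm):
    """Neutrons per cm^2 per shot at the given distance (isotropic source)."""
    return yield_per_shot / (4.0 * math.pi * distance_cm ** 2)

phi_low = fluence_per_shot(2.25e8, 100.0)   # 11.5 kV operating point, at 1 m
phi_high = fluence_per_shot(2.88e8, 100.0)  # 13.5 kV operating point, at 1 m
```

Such unshielded fluences are the source term that the concrete geometries in the study are then designed to attenuate.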

  2. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    Energy Technology Data Exchange (ETDEWEB)

    Böhlen, T.T.; Cerutti, F.; Chin, M.P.W. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Fassò, A. [ELI Beamlines, Harfa Office Park Ceskomoravská 2420/15a, 190 93 Prague 9 (Czech Republic); Ferrari, A., E-mail: alfredo.ferrari@cern.ch [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Ortega, P.G. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland); Mairani, A. [Unità di Fisica Medica, Fondazione CNAO, I-27100 Pavia (Italy); Sala, P.R. [Istituto Nazionale di Fisica Nucleare, Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy); Smirnov, G.; Vlachoudis, V. [European Laboratory for Particle Physics (CERN), CH-1211 Geneva 23 (Switzerland)

    2014-06-15

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  3. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    CERN Document Server

    Böhlen, T T; Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Mairani, A; Sala, P R; Smirnov, G; Vlachoudis, V

    2014-01-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  4. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    Science.gov (United States)

    Böhlen, T. T.; Cerutti, F.; Chin, M. P. W.; Fassò, A.; Ferrari, A.; Ortega, P. G.; Mairani, A.; Sala, P. R.; Smirnov, G.; Vlachoudis, V.

    2014-06-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  5. Technical Description of the implementation of IR7 section at LHC with the FLUKA transport code.

    CERN Document Server

    Brugger, M; Ferrari, A; Magistris, M; Santana-Leitner, M; Tsoulou, A; Vlachoudis, V; CERN. Geneva. AB Department

    2006-01-01

    This document contains the technical description of the LHC IR7 FLUKA implementation. It has been written as a handbook to analyze, understand or modify the heat deposition Monte Carlo calculations performed for a wide variety of objects in the IR7 section of the LHC accelerator, under construction at CERN. The work includes references to the prototyping schemes and the implementation of a complex set-up for FLUKA, which deals with lists of objects and properties defined in the Twiss parameters through the use of the LATTICE concept and of a broad collection of user-written subroutines.

  6. Epistemology of knowledge based simulation

    Energy Technology Data Exchange (ETDEWEB)

    Reddy, R.

    1987-04-01

    Combining artificial intelligence concepts with traditional simulation methodologies yields a powerful design support tool known as knowledge based simulation. This approach turns a descriptive simulation tool into a prescriptive tool, one which recommends specific goals. Much work in the area of general goal processing and explanation of recommendations remains to be done.

  7. Simulation-based surgical education.

    Science.gov (United States)

    Evgeniou, Evgenios; Loizou, Peter

    2013-09-01

    The reduction in time for training at the workplace has created a challenge for the traditional apprenticeship model of training. Simulation offers the opportunity for repeated practice in a safe and controlled environment, focusing on trainees and tailored to their needs. Recent technological advances have led to the development of various simulators, which have already been introduced in surgical training. The complexity and fidelity of the available simulators vary; therefore, depending on our resources, we should select the appropriate simulator for the task or skill we want to teach. Educational theory informs us about the importance of context in professional learning. Simulation should therefore recreate the clinical environment and its complexity. Contemporary approaches to simulation have introduced novel ideas for teaching teamwork, communication skills and professionalism. In order for simulation-based training to be successful, simulators have to be validated appropriately and integrated in a training curriculum. Within a surgical curriculum, trainees should have protected time for simulation-based training, under appropriate supervision. Simulation-based surgical education should allow the appropriate practice of technical skills without ignoring the clinical context and must strike an adequate balance between the simulation environment and simulators.

  8. The Fluka Linebuilder and Element Database: Tools for Building Complex Models of Accelerators Beam Lines

    CERN Document Server

    Mereghetti, A; Cerutti, F; Versaci, R; Vlachoudis, V

    2012-01-01

    Extended FLUKA models of accelerator beam lines can be extremely complex: cumbersome to manipulate, poorly versatile and prone to mismatched positioning. We developed a framework capable of creating the FLUKA model of an arbitrary portion of a given accelerator, starting from the optics configuration and a few other pieces of information provided by the user. The framework includes a builder (LineBuilder), an element database and a series of configuration and analysis scripts. The LineBuilder is a Python program aimed at dynamically assembling complex FLUKA models of accelerator beam lines: positions, magnetic fields and scorings are automatically set up, and geometry details such as apertures of collimators, tilting and misalignment of elements, beam pipes and tunnel geometries can be entered at the user's will. The element database (FEDB) is a collection of detailed FLUKA geometry models of machine elements. This framework has been widely used for recent LHC and SPS beam-machine interaction studies at CERN, and led to a dra...
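The core LineBuilder idea, walking an optics-style element list and assigning each element a cumulative longitudinal position, can be caricatured in a few lines. This is a toy illustration, not the actual CERN tool; the element names and lengths are invented:

```python
from dataclasses import dataclass

# Toy illustration of the LineBuilder concept (not the real tool): place a
# sequence of beam-line elements end to end so a geometry model could be
# emitted downstream. Names, lengths and kinds are invented.
@dataclass
class Element:
    name: str
    length_m: float
    kind: str

def place_elements(elements, start_s=0.0):
    """Return (name, s_start, s_end) tuples with cumulative positions."""
    placed, s = [], start_s
    for e in elements:
        placed.append((e.name, s, s + e.length_m))
        s += e.length_m
    return placed

line = [Element("MQ.1", 3.1, "quadrupole"),
        Element("DRIFT.1", 1.5, "drift"),
        Element("TCP.1", 1.0, "collimator")]
layout = place_elements(line)
```

In the real framework, each placed element would then pull its detailed geometry from the element database (FEDB) and receive its field and scoring cards automatically.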

  9. Applications of the lahet simulation code to relativistic heavy ion detectors

    Energy Technology Data Exchange (ETDEWEB)

    Waters, L.; Gavron, A. [Los Alamos National Lab., NM (United States)

    1991-12-31

    The Los Alamos High Energy Transport (LAHET) simulation code has been applied to test beam data from the lead/scintillator Participant Calorimeter of BNL AGS experiment E814. The LAHET code treats hadronic interactions with the LANL version of the Oak Ridge code HETC. LAHET has now been expanded to handle hadrons with kinetic energies greater than 5 GeV with the FLUKA code, while HETC is used exclusively below 2.0 GeV. FLUKA is phased in linearly between 2.0 and 5.0 GeV. Transport of electrons and photons is done with EGS4, and an interface to the Los Alamos HMCNP3B library based code is provided to analyze neutrons with kinetic energies less than 20 MeV. Excellent agreement is found between the test data and simulation, and results for 2.46 GeV/c protons and pions are illustrated in this article.

  11. Radiation Protection Studies for Medical Particle Accelerators using Fluka Monte Carlo Code.

    Science.gov (United States)

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2017-04-01

    Radiation protection (RP) in the use of medical cyclotrons involves many aspects, both in routine use and in the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment- and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; and the activation of the ambient air, in particular the production of 41Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron ambient dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility.

  12. Design and spectrum calculation of 4H-SiC thermal neutron detectors using FLUKA and TCAD

    Science.gov (United States)

    Huang, Haili; Tang, Xiaoyan; Guo, Hui; Zhang, Yimen; Zhang, Yimeng; Zhang, Yuming

    2016-10-01

    SiC is a promising material for neutron detection in a harsh environment due to its wide band gap, high displacement threshold energy and high thermal conductivity. To increase the detection efficiency of SiC, a converter such as 6LiF or 10B is introduced. In this paper, pulse-height spectra of a PIN diode with a 6LiF conversion layer exposed to thermal neutrons (0.026 eV) are calculated using TCAD and Monte Carlo simulations. First, the conversion efficiency of a thermal neutron with respect to the thickness of 6LiF was calculated by using a FLUKA code, and a maximal efficiency of approximately 5% was achieved. Next, the energy distributions of both 3H and α induced by the 6LiF reaction according to different ranges of emission angle are analyzed. Subsequently, transient pulses generated by the bombardment of single 3H or α-particles are calculated. Finally, pulse height spectra are obtained with a detector efficiency of 4.53%. Comparisons of the simulated result with the experimental data are also presented, and the calculated spectrum shows an acceptable similarity to the experimental data. This work would be useful for radiation-sensing applications, especially for SiC detector design.
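
    The thickness dependence of the converter efficiency discussed above can be bounded with a one-line capture-probability model. This is our own back-of-the-envelope sketch, not the FLUKA calculation from the paper: it gives only the probability that a thermal neutron reacts in the 6LiF layer, and the true detection efficiency is lower because the 3H and alpha products must also escape into the SiC diode.

```python
import math

# Thermal-neutron capture probability in a 6LiF layer: P(t) = 1 - exp(-Sigma * t).
SIGMA_6LI = 940e-24    # 6Li(n,t)4He thermal cross section, cm^2 (~940 b)
RHO_6LIF = 2.64        # 6LiF density, g/cm^3
MOLAR_MASS = 25.0      # g/mol for 6LiF (6 + 19)
N_A = 6.022e23

n_6li = RHO_6LIF * N_A / MOLAR_MASS  # 6Li atoms per cm^3 (one per formula unit)
sigma_macro = n_6li * SIGMA_6LI      # macroscopic cross section, 1/cm

def capture_probability(thickness_um):
    """Probability that a normally incident thermal neutron reacts in the layer."""
    t_cm = thickness_um * 1e-4
    return 1.0 - math.exp(-sigma_macro * t_cm)
```

    For a layer a few micrometres thick this gives a few per cent, the same order of magnitude as the ~5% maximum efficiency quoted in the abstract.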

  13. Simulation Package based on Placet

    CERN Document Server

    D'Amico, T E; Leros, Nicolas; Schulte, Daniel

    2001-01-01

    The program PLACET is used to simulate transverse and longitudinal beam effects in the main linac, the drive-beam accelerator and the drive-beam decelerators of CLIC, as well as in the linac of CTF3. It provides different models of accelerating and decelerating structures, linear optics and thin multipoles. Several methods of beam-based alignment, including emittance tuning bumps and feedback, and different failure modes can be simulated. An interface to the beam-beam simulation code GUINEA-PIG exists. Currently, interfaces to MAD and TRANSPORT are under development and an extension to transfer lines and bunch compressors is also being made. In the future, the simulations will need to be performed by many users, which requires a simplified user interface. The paper describes the status of PLACET and plans for the future.

  14. Production of secondary particles and nuclei in cosmic rays collisions with the interstellar gas using the FLUKA code

    CERN Document Server

    Mazziotta, M N; Ferrari, A; Gaggero, D; Loparco, F; Sala, P R

    2016-01-01

    The measured fluxes of secondary particles produced by the interactions of Cosmic Rays (CRs) with the astronomical environment play a crucial role in understanding the physics of CR transport. In this work we present a comprehensive calculation of the secondary hadron, lepton, gamma-ray and neutrino yields produced by the inelastic interactions between several species of stable or long-lived cosmic rays projectiles (p, D, T, 3He, 4He, 6Li, 7Li, 9Be, 10Be, 10B, 11B, 12C, 13C, 14C, 14N, 15N, 16O, 17O, 18O, 20Ne, 24Mg and 28Si) and different target gas nuclei (p, 4He, 12C, 14N, 16O, 20Ne, 24Mg, 28Si and 40Ar). The yields are calculated using FLUKA, a simulation package designed to compute the energy distributions of secondary products with large accuracy in a wide energy range. The present results provide, for the first time, a complete and self-consistent set of all the relevant inclusive cross sections regarding the whole spectrum of secondary products in nuclear collisions. We cover, for the projectiles, a ki...

  15. The FLUKA Monte Carlo code coupled with the NIRS approach for clinical dose calculations in carbon ion therapy

    Science.gov (United States)

    Magro, G.; Dahle, T. J.; Molinelli, S.; Ciocca, M.; Fossati, P.; Ferrari, A.; Inaniwa, T.; Matsufuji, N.; Ytre-Hauge, K. S.; Mairani, A.

    2017-05-01

    Particle therapy facilities often require Monte Carlo (MC) simulations to overcome intrinsic limitations of analytical treatment planning systems (TPS) related to the description of the mixed radiation field and beam interaction with tissue inhomogeneities. Some of these uncertainties may affect the computation of effective dose distributions; therefore, particle therapy dedicated MC codes should provide both absorbed and biological doses. Two biophysical models are currently applied clinically in particle therapy: the local effect model (LEM) and the microdosimetric kinetic model (MKM). In this paper, we describe the coupling of the NIRS (National Institute for Radiological Sciences, Japan) clinical dose to the FLUKA MC code. We moved from the implementation of the model itself to its application in clinical cases, according to the NIRS approach, where a scaling factor is introduced to rescale the (carbon-equivalent) biological dose to a clinical dose level. A high level of agreement was found with published data by exploring a range of values for the MKM input parameters, while some differences were registered in forward recalculations of NIRS patient plans, mainly attributable to differences with the analytical TPS dose engine (taken as reference) in describing the mixed radiation field (lateral spread and fragmentation). We presented a tool which is being used at the Italian National Center for Oncological Hadrontherapy to support the comparison study between the NIRS clinical dose level and the LEM dose specification.

  16. Inversion based on computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
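
    The adjoint-differentiation idea described above can be sketched in a few lines. This is our own toy example, not the authors' code: an explicit 1D diffusion simulation with a single scalar parameter D, whose gradient with respect to a least-squares objective is obtained by one backward (adjoint) sweep instead of one extra simulation per parameter.

```python
import numpy as np

def lap(u):
    # Discrete 1D Laplacian with fixed (Dirichlet) boundary values
    out = np.zeros_like(u)
    out[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]
    return out

def forward(D, u0, steps):
    """Explicit diffusion: u_{n+1} = u_n + D * lap(u_n); returns all states."""
    us = [u0]
    u = u0
    for _ in range(steps):
        u = u + D * lap(u)
        us.append(u)
    return us

def objective_and_gradient(D, u0, data, steps):
    """J = ||u_N - data||^2 and dJ/dD via the adjoint (reverse) sweep."""
    us = forward(D, u0, steps)
    r = us[-1] - data
    J = float(r @ r)
    lam = 2.0 * r          # adjoint state at the final time
    g = 0.0
    for n in range(steps - 1, -1, -1):
        g += float(lam @ lap(us[n]))   # accumulate dJ/dD contribution
        lam = lam + D * lap(lam)       # propagate the adjoint backwards
    return J, g
```

    The backward sweep costs about as much as one forward run, independent of the number of parameters, which is the property the abstract exploits for large-scale simulations.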

  17. Polarized positrons for the ILC - update on simulations

    CERN Document Server

    Staufenbiel, F

    2012-01-01

    To achieve the extremely high luminosity for colliding electron-positron beams at the future International Linear Collider (ILC), an undulator-based source is foreseen, with a helical undulator about 230 meters long and a thin titanium-alloy target rim rotating at a tangential velocity of about 100 meters per second. The very high density of heat deposited in the target has to be analyzed carefully. The energy deposited by the photon beam in the target has been calculated with FLUKA. The resulting stress in the target material after one bunch train has been simulated in ANSYS.

  18. Fluka Raytracer

    OpenAIRE

    SIÑUELA PASTOR, DAVID

    2013-01-01

    The project described in this document was developed as the work for a technical student position at CERN in Geneva, Switzerland. CERN is the European Organization for Nuclear Research and is one of the leading physics laboratories in the world. The project was developed in the section EN-STI-EET (Engineering; Sources, Targets and Interactions; Emerging Energy Technologies), which, in collaboration with the INFN institute in Italy, is in charge of developing an...

  19. The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy

    CERN Document Server

    Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F

    2010-01-01

    Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH) Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron-Emission-Tomography/Computed-Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...

  20. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  2. Simulation-based medical teaching and learning

    Directory of Open Access Journals (Sweden)

    Abdulmohsen H Al-Elq

    2010-01-01

    One of the most important steps in curriculum development is the introduction of simulation-based medical teaching and learning. Simulation is a generic term that refers to an artificial representation of a real-world process used to achieve educational goals through experiential learning. Simulation-based medical education is defined as any educational activity that utilizes simulation aides to replicate clinical scenarios. Although medical simulation is relatively new, simulation has been used for a long time in other high-risk professions such as aviation. Medical simulation allows the acquisition of clinical skills through deliberate practice rather than an apprentice style of learning. Simulation tools serve as an alternative to real patients. A trainee can make mistakes and learn from them without the fear of harming the patient. There are different types and classifications of simulators, and their cost varies according to the degree of their resemblance to reality, or 'fidelity'. Simulation-based learning is expensive. However, it is cost-effective if utilized properly. Medical simulation has been found to enhance clinical competence at the undergraduate and postgraduate levels. It has also been found to have many advantages that can improve patient safety and reduce health care costs through the improvement of the medical provider's competencies. The objective of this narrative review article is to highlight the importance of simulation as a new teaching method in undergraduate and postgraduate education.

  3. An interface for GEANT4 simulation using ROOT geometry navigation

    CERN Document Server

    Gheata, A

    2008-01-01

    The ROOT geometry modeller (TGeo) offers powerful tools for detector geometry description. The package provides several functionalities like: navigation, geometry checking, enhanced visualization, geometry editing GUI and many others, using ROOT I/O. A new interface module g4root was recently developed to take advantage of ROOT geometry navigation optimizations in the context of GEANT4 simulation. The interface can be used either by native GEANT4-based simulation applications or in the more general context of the Virtual Monte Carlo (VMC) framework developed by ALICE offline and ROOT teams. The latter allows running GEANT3, GEANT4 and FLUKA simulations without changing either the geometry description or the user code. The interface was tested and stressed in the context of ALICE simulation framework. A description of the interface, its usage as well as recent results in terms of reliability and performance will be presented. Some benchmarks will be compared for ROOT-TGeo or GEANT4 based navigation.

  4. Modelica-based TCP simulation

    Science.gov (United States)

    Velieva, T. R.; Eferina, E. G.; Korolkova, A. V.; Kulyabov, D. S.; Sevastianov, L. A.

    2017-01-01

    For the study and verification of our mathematical model of telecommunication systems, a discrete simulation model and a continuous analytical model were developed. However, for various reasons, these implementations are not entirely satisfactory, and a more adequate simulation model is needed, possibly using a different modeling paradigm. To model the TCP source, a hybrid (continuous-discrete) approach is proposed. For the computer implementation of the model, the physical modeling language Modelica is used. The hybrid approach allows us to take into account the transitions between different states in the continuous model of the TCP protocol. This approach yielded a simple simulation model of the TCP source. The model has great potential for expansion; it is possible to implement different types of TCP.
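
    The hybrid continuous-discrete behaviour described above can be illustrated with a minimal sawtooth model. This is our own construction, not the authors' Modelica model: the congestion window grows continuously between loss events (additive increase) and is halved at discrete loss instants (multiplicative decrease); all parameter values are assumed.

```python
# Toy hybrid TCP source: continuous window growth dW/dt = 1/RTT between
# discrete loss events, with W -> W/2 at each loss.
RTT = 0.1       # round-trip time, s (assumed)
W_LOSS = 20.0   # window size at which a loss is triggered (assumed)
DT = 0.001      # integration step, s

def simulate(t_end, w0=1.0):
    """Return (times, windows) for the sawtooth window trajectory."""
    t, w = 0.0, w0
    times, windows = [t], [w]
    while t < t_end:
        w += DT / RTT        # continuous part: additive increase
        if w >= W_LOSS:      # discrete event: packet loss detected
            w /= 2.0         # multiplicative decrease
        t += DT
        times.append(t)
        windows.append(w)
    return times, windows
```

    After an initial ramp-up, the window oscillates between W_LOSS/2 and W_LOSS, the classic TCP sawtooth that a hybrid formalism such as Modelica captures natively with its event semantics.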

  5. Simulation-based training for colonoscopy

    DEFF Research Database (Denmark)

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj;

    2015-01-01

    simulation-based modalities provided reliable and valid assessments of competence in colonoscopy and credible pass/fail standards were established for both the tests. We propose to use these standards in simulation-based training programs before proceeding to supervised training on patients.

  6. Simulation-based training for colonoscopy

    DEFF Research Database (Denmark)

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj

    2015-01-01

    The aim of this study was to create simulation-based tests with credible pass/fail standards for 2 different fidelities of colonoscopy models. Only competent practitioners should perform colonoscopy. Reliable and valid simulation-based tests could be used to establish basic competency in colonosc...

  7. Using FLUKA to Study Concrete Square Shield Performance in Attenuation of Neutron Radiation Produced by APF Plasma Focus Neutron Source

    Science.gov (United States)

    Nemati, M. J.; Habibi, M.; Amrollahi, R.

    2013-04-01

    In 2010, representatives from the Nuclear Engineering and Physics Department of Amirkabir University of Technology (AUT) requested the development of a project with the objective of determining the performance of a concrete shield for their plasma focus neutron source. The project team in the laboratory of the Nuclear Engineering and Physics Department selected several shield shapes whose performance was to be studied with a Monte Carlo code. In the present work, the capability of the Monte Carlo code FLUKA is explored to model the APF plasma focus and to investigate the neutron fluence on the square concrete shield in each region of the problem. The physical models embedded in FLUKA are described, as well as examples of benchmarking against future experimental data. As a result of this study, a suitable thickness of concrete for shielding the APF will be determined.
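
    A first-order feel for the concrete thickness in question comes from the removal-cross-section approximation. The numbers below are assumptions for illustration, not results from the paper: Sigma_R ~ 0.09 1/cm is a typical value quoted for fast fission-spectrum neutrons in ordinary concrete, and a plasma-focus DD source (2.45 MeV neutrons) would need its own value from a code such as FLUKA.

```python
import math

# Removal-cross-section approximation: phi(x) = phi0 * exp(-Sigma_R * x).
SIGMA_R = 0.09  # assumed macroscopic removal cross section, 1/cm

def attenuation(thickness_cm):
    """Fraction of the uncollided fast-neutron fluence transmitted."""
    return math.exp(-SIGMA_R * thickness_cm)

def thickness_for(factor):
    """Concrete thickness (cm) reducing the fluence by the given factor (> 1)."""
    return math.log(factor) / SIGMA_R
```

    With this assumed Sigma_R, a factor-1000 reduction needs roughly 77 cm of ordinary concrete; a transport calculation like the one in the abstract refines this by accounting for scattered and thermalised neutrons.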

  8. Simulation Platform: a cloud-based online simulation environment.

    Science.gov (United States)

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-09-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling the neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but the databases should provide a chance for users to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handle multiple neural models that require multiple software.

  9. Reconsidering fidelity in simulation-based training.

    Science.gov (United States)

    Hamstra, Stanley J; Brydges, Ryan; Hatala, Rose; Zendejas, Benjamin; Cook, David A

    2014-03-01

    In simulation-based health professions education, the concept of simulator fidelity is usually understood as the degree to which a simulator looks, feels, and acts like a human patient. Although this can be a useful guide in designing simulators, this definition emphasizes technological advances and physical resemblance over principles of educational effectiveness. In fact, several empirical studies have shown that the degree of fidelity appears to be independent of educational effectiveness. The authors confronted these issues while conducting a recent systematic review of simulation-based health professions education, and in this Perspective they use their experience in conducting that review to examine key concepts and assumptions surrounding the topic of fidelity in simulation. Several concepts typically associated with fidelity are more useful in explaining educational effectiveness, such as transfer of learning, learner engagement, and suspension of disbelief. Given that these concepts more directly influence properties of the learning experience, the authors make the following recommendations: (1) abandon the term fidelity in simulation-based health professions education and replace it with terms reflecting the underlying primary concepts of physical resemblance and functional task alignment; (2) make a shift away from the current emphasis on physical resemblance to a focus on functional correspondence between the simulator and the applied context; and (3) focus on methods to enhance educational effectiveness using principles of transfer of learning, learner engagement, and suspension of disbelief. These recommendations clarify underlying concepts for researchers in simulation-based health professions education and will help advance this burgeoning field.

  10. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  11. Selected organ dose conversion coefficients for external photons calculated using ICRP adult voxel phantoms and Monte Carlo code FLUKA.

    Science.gov (United States)

    Patni, H K; Nadar, M Y; Akar, D K; Bhati, S; Sarkar, P K

    2011-11-01

    The adult reference male and female computational voxel phantoms recommended by ICRP are adapted into the Monte Carlo transport code FLUKA. The FLUKA code is then utilised for computation of dose conversion coefficients (DCCs) expressed in absorbed dose per air kerma free-in-air for colon, lungs, stomach wall, breast, gonads, urinary bladder, oesophagus, liver and thyroid due to a broad parallel beam of mono-energetic photons impinging in anterior-posterior and posterior-anterior directions in the energy range of 15 keV-10 MeV. The computed DCCs of colon, lungs, stomach wall and breast are found to be in good agreement with the results published in ICRP publication 110. The present work thus validates the use of FLUKA code in computation of organ DCCs for photons using ICRP adult voxel phantoms. Further, the DCCs for gonads, urinary bladder, oesophagus, liver and thyroid are evaluated and compared with results published in ICRP 74 in the above-mentioned energy range and geometries. Significant differences in DCCs are observed for breast, testis and thyroid above 1 MeV, and for most of the organs at energies below 60 keV in comparison with the results published in ICRP 74. The DCCs of the female voxel phantom were found to be higher than those of the male phantom for almost all organs in both geometries.
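
    DCC tables such as those compared above are published at discrete photon energies, and intermediate energies are commonly obtained by log-log interpolation. The sketch below shows that practice with a made-up table; the values are NOT taken from ICRP 110, ICRP 74 or the paper.

```python
import math

# Hypothetical DCC table for illustration only:
TABLE = [  # (photon energy in MeV, DCC in Gy per Gy air kerma)
    (0.05, 0.40),
    (0.10, 0.75),
    (0.50, 1.00),
    (1.00, 1.05),
]

def dcc(energy_mev):
    """Log-log interpolation of the DCC within the tabulated energy range."""
    energies = [e for e, _ in TABLE]
    if not energies[0] <= energy_mev <= energies[-1]:
        raise ValueError("energy outside tabulated range")
    for (e1, d1), (e2, d2) in zip(TABLE, TABLE[1:]):
        if e1 <= energy_mev <= e2:
            f = (math.log(energy_mev) - math.log(e1)) / (math.log(e2) - math.log(e1))
            return math.exp(math.log(d1) + f * (math.log(d2) - math.log(d1)))
```

    Log-log interpolation is preferred over linear because both cross sections and organ doses vary roughly as power laws over the 15 keV-10 MeV range covered by such tables.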

  12. Base Camp Design Simulation Training

    Science.gov (United States)

    2011-07-01

    The Army needs officers and noncommissioned officers with requisite base camp competencies. The Army’s Field Manual (FM) 3-34.400 defines a Base Camp...reason, we designed a 600-man base camp on VBS2™ from an AutoCAD diagram found on the Theater Construction Management System (version 3.2). Known

  13. Production of secondary particles and nuclei in cosmic rays collisions with the interstellar gas using the FLUKA code

    Science.gov (United States)

    Mazziotta, M. N.; Cerutti, F.; Ferrari, A.; Gaggero, D.; Loparco, F.; Sala, P. R.

    2016-08-01

    The measured fluxes of secondary particles produced by the interactions of Cosmic Rays (CRs) with the astronomical environment play a crucial role in understanding the physics of CR transport. In this work we present a comprehensive calculation of the secondary hadron, lepton, gamma-ray and neutrino yields produced by the inelastic interactions between several species of stable or long-lived cosmic rays projectiles (p, D, T, 3He, 4He, 6Li, 7Li, 9Be, 10Be, 10B, 11B, 12C, 13C, 14C, 14N, 15N, 16O, 17O, 18O, 20Ne, 24Mg and 28Si) and different target gas nuclei (p, 4He, 12C, 14N, 16O, 20Ne, 24Mg, 28Si and 40Ar). The yields are calculated using FLUKA, a simulation package designed to compute the energy distributions of secondary products with large accuracy in a wide energy range. The present results provide, for the first time, a complete and self-consistent set of all the relevant inclusive cross sections regarding the whole spectrum of secondary products in nuclear collisions. We cover, for the projectiles, a kinetic energy range extending from 0.1 GeV/n up to 100 TeV/n in the lab frame. In order to show the importance of our results for multi-messenger studies about the physics of CR propagation, we evaluate the propagated spectra of Galactic secondary nuclei, leptons, and gamma rays produced by the interactions of CRs with the interstellar gas, exploiting the numerical codes DRAGON and GammaSky. We show that, adopting our cross section database, we are able to provide a good fit of a complete sample of CR observables, including: leptonic and hadronic spectra measured at Earth, the local interstellar spectra measured by Voyager, and the gamma-ray emissivities from Fermi-LAT collaboration. We also show a set of gamma-ray and neutrino full-sky maps and spectra.
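
    The way inclusive cross sections like those tabulated in the paper enter a propagation code can be sketched as a discretized convolution of the projectile flux with the gas density and the differential production cross section, q(E_s) = sum_i 4*pi*n_gas*Phi(E_i)*(dsigma/dE_s)(E_i -> E_s)*dE_i. Every number below is hypothetical; only the structure of the calculation reflects the text.

```python
import numpy as np

n_gas = 1.0                   # interstellar H density, atoms/cm^3 (assumed)
E = np.logspace(0, 3, 40)     # projectile kinetic energy grid, GeV
dE = np.gradient(E)           # energy bin widths
phi = 1.8 * E**-2.7           # toy CR proton flux, /cm^2/s/sr/GeV

E_s = np.logspace(-1, 2, 30)  # secondary energy grid, GeV

# Hypothetical differential cross section dsigma/dE_s (cm^2/GeV): secondaries
# carry ~10% of the projectile energy with a broad logarithmic spread.
dsig = np.zeros((E.size, E_s.size))
for i, Ei in enumerate(E):
    z = np.log(E_s / (0.1 * Ei))
    dsig[i] = 30e-27 * np.exp(-0.5 * (z / 0.8) ** 2) / E_s

# Secondary source term, secondaries /cm^3/s/GeV
q = 4.0 * np.pi * n_gas * (phi * dE) @ dsig
```

    In a real run, codes like DRAGON replace the toy `dsig` array with the FLUKA-derived inclusive cross-section tables described in the abstract, one matrix per projectile-target-secondary combination.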

  14. Simulation-based training for colonoscopy

    DEFF Research Database (Denmark)

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj;

    2015-01-01

    The aim of this study was to create simulation-based tests with credible pass/fail standards for 2 different fidelities of colonoscopy models.Only competent practitioners should perform colonoscopy. Reliable and valid simulation-based tests could be used to establish basic competency in colonosco...... reliable and valid assessments of competence in colonoscopy and credible pass/fail standards were established for both the tests. We propose to use these standards in simulation-based training programs before proceeding to supervised training on patients....

  15. New OTRA-Based Generalized Impedance Simulator

    OpenAIRE

    Ashish Gupta; Raj Senani; Bhaskar, D. R.; Singh, A. K.

    2013-01-01

    The operational transresistance amplifier (OTRA) has attracted considerable attention in the recent literature in several applications such as impedance simulation, universal biquad filter realization, and the realization of sinusoidal oscillators and multivibrators. However, to the best of the authors' knowledge, no OTRA-based generalized impedance simulator circuit has been reported so far. The purpose of this paper is to present such a circuit.

  16. Benchmark of the SixTrack-Fluka Active Coupling Against the SPS Scrapers Burst Test

    CERN Multimedia

    Mereghetti, A; Cerutti, F

    2014-01-01

    The SPS scrapers are a key ingredient for the clean injection into the LHC: they cut off halo particles quite close to the beam core (e.g. 3.5 sigma) just before extraction, to minimise the risk for quenches. The improved beam parameters as envisaged by the LHC Injectors Upgrade (LIU) Project required a revision of the present system, to assess its suitability and robustness. In particular, a burst (i.e. endurance) test of the scraper blades has been carried out, with the whole bunch train being scraped at the centre (worst working conditions). In order to take into account the effect of betatron and longitudinal beam dynamics on energy deposition patterns, and nuclear and Coulomb scattering in the absorbing medium onto loss patterns, the SixTrack and Fluka codes have been coupled, profiting from the best of the refined physical models they respectively embed. The coupling envisages an active exchange of tracked particles between the two codes at each turn, and an on-line aperture check in SixTrack, in order ...

  17. Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes

    Science.gov (United States)

    Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela

    2017-09-01

    The activity of residual nuclides dictates the radiation fields in periodic inspections/repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (i.e., medical, industrial, research) and nuclear reactors. Therefore, the correct prediction of the material activation allows for a more accurate planning of the activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison between residual total specific activity versus a set of cooling time instants (from zero up to 10 years after irradiation) as obtained by two analytical (FISPACT and ANITA) and two Monte Carlo (FLUKA and PHITS) codes, making use of their default nuclear data libraries. A set of 40 irradiating scenarios is considered, i.e. neutron and proton particles of different energies, ranging from zero to many hundreds MeV, impinging on pure elements or materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.
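
    The benchmarked quantity, total residual specific activity versus cooling time, is a sum of exponential decays over the nuclide inventory, A(t) = sum_k A_k(0) * exp(-ln2 * t / T_half_k). The sketch below uses the physical half-lives of a few nuclides typical of activated steel, but the initial activities are hypothetical, not taken from the benchmark.

```python
import math

# Hypothetical inventory loosely typical of activated steel:
INVENTORY = {              # nuclide: (A(0) in Bq/g, half-life in s)
    "Mn-54": (5.0e3, 312.2 * 86400),
    "Co-58": (8.0e3, 70.86 * 86400),
    "Co-60": (1.2e3, 5.271 * 365.25 * 86400),
    "Fe-55": (2.0e4, 2.744 * 365.25 * 86400),
}

def total_activity(t_cool_s):
    """Total residual specific activity (Bq/g) after t_cool_s of cooling."""
    return sum(a0 * math.exp(-math.log(2) * t_cool_s / t_half)
               for a0, t_half in INVENTORY.values())
```

    Scanning this sum over cooling times from zero to ten years reproduces the shape of the curves the four codes are compared on: short-lived nuclides dominate at shutdown, long-lived ones dominate the decommissioning tail.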

  18. Lattice-Boltzmann-based Simulations of Diffusiophoresis

    Science.gov (United States)

    Castigliego, Joshua; Kreft Pearce, Jennifer

    We present results from a lattice-Boltzmann-based Brownian Dynamics simulation on diffusiophoresis and the separation of particles within the system. A gradient in viscosity that simulates a concentration gradient in a dissolved polymer allows us to separate various types of particles by their deformability. As seen in previous experiments, simulated particles that have a higher deformability react differently to the polymer matrix than those with a lower deformability. Therefore, the particles can be separated from each other. This simulation, in particular, was intended to model an oceanic system where the particles of interest were zooplankton, phytoplankton and microplastics. The separation of plankton from the microplastics was achieved.

  19. A Carbonaceous Chondrite Based Simulant of Phobos

    Science.gov (United States)

    Rickman, Douglas L.; Patel, Manish; Pearson, V.; Wilson, S.; Edmunson, J.

    2016-01-01

    In support of an ESA-funded concept study considering a sample return mission, a simulant of the Martian moon Phobos was needed. There are no samples of the Phobos regolith; therefore none of the four characteristics normally used to design a simulant is explicitly known for Phobos. Because of this, specifications for a Phobos simulant were based on spectroscopy, other remote measurements, and judgment. A composition based on the Tagish Lake meteorite was assumed. The requirement that sterility be achieved, especially given the required organic content, was unusual and problematic. The final design mixed JSC-1A, antigorite, pseudo-agglutinates and gilsonite. Sterility was achieved by irradiation at a commercial facility.

  20. Simulation-based certification for cataract surgery

    DEFF Research Database (Denmark)

    Thomsen, Ann Sofia Skou; Kiilgaard, Jens Folke; Kjaerbo, Hadi

    2015-01-01

    PURPOSE: To evaluate the EyeSi(™) simulator in regard to assessing competence in cataract surgery. The primary objective was to explore all simulator metrics to establish a proficiency-based test with solid evidence. The secondary objective was to evaluate whether the skill assessment was specific to cataract surgery. METHODS: We included 26 ophthalmic trainees (no cataract surgery experience), 11 experienced cataract surgeons (>4000 cataract procedures) and five vitreoretinal surgeons. All subjects completed 13 different modules twice. Simulator metrics were used for the assessments. RESULTS: Total...

  1. Scenario-based table top simulations

    DEFF Research Database (Denmark)

    Broberg, Ole; Edwards, Kasper; Nielsen, J.

    2012-01-01

    This study developed and tested a scenario-based table top simulation method in a user-driven innovation setting. A team of researchers worked together with a user group of five medical staff members from the existing clinic. Table top simulations of a new clinic were carried out in a simple model including patient scenarios, LEGO figures, shoeboxes, and cardboard. The results indicated that table top simulation is a simple, cheap and powerful tool to generate and test innovative conceptual solutions in the early stages of a design process.

  2. CORBA-Based Discrete Event Simulation System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The CORBA technique is an integration of the object-oriented conception and distributed computing techniques. It can make applications within distributed heterogeneous environments reusable, portable and interoperable. The architecture of CORBA-based discrete event simulation systems is presented, and the interface of distributed simulation objects (DSO) is defined in this paper after the DSO are identified and the synchronization mechanism among DSO is discussed.

  3. Agent-based simulation of animal behaviour

    OpenAIRE

    Jonker, C.M.; Treur, J.

    1998-01-01

    In this paper it is shown how animal behaviour can be simulated in an agent-based manner. Different models are shown for different types of behaviour, varying from purely reactive behaviour to pro-active, social and adaptive behaviour. The compositional development method for multi-agent systems DESIRE and its software environment supports the conceptual and detailed design, and execution of these models. Experiments reported in the literature on animal behaviour have been simulated for a num...

  4. Simulation-based training for thoracoscopy

    DEFF Research Database (Denmark)

    Bjurström, Johanna Margareta; Konge, Lars; Lehnert, Per;

    2013-01-01

    An increasing proportion of thoracic procedures are performed using video-assisted thoracic surgery. This minimally invasive technique places special demands on the surgeons. Using simulation-based training on artificial models or animals has been proposed to overcome the initial part of the learning curve. This study aimed to investigate the effect of simulation-based training and to compare self-guided and educator-guided training.

  5. Simulation-based training for thoracoscopic lobectomy

    DEFF Research Database (Denmark)

    Jensen, Katrine; Ringsted, Charlotte; Hansen, Henrik Jessen;

    2014-01-01

    BACKGROUND: Video-assisted thoracic surgery is gradually replacing conventional open thoracotomy as the method of choice for the treatment of early-stage non-small cell lung cancers, and thoracic surgical trainees must learn and master this technique. Simulation-based training could help trainees overcome the first part of the learning curve, but no virtual-reality simulators for thoracoscopy are commercially available. This study aimed to investigate whether training on a laparoscopic simulator enables trainees to perform a thoracoscopic lobectomy. METHODS: Twenty-eight surgical residents were randomized to either virtual-reality training on a nephrectomy module or traditional black-box simulator training. After a retention period they performed a thoracoscopic lobectomy on a porcine model and their performance was scored using a previously validated assessment tool. RESULTS: The groups did...

  6. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack (including operating system and application software) can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed in the Modelica programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.
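    The core idea of this report — exercising control software against an inexpensive virtual plant model in simulation time rather than against hardware — can be caricatured in a few lines of discrete-event code. The thermostat controller, the toy thermal plant, and every numeric threshold below are invented for illustration and have nothing to do with the actual ADEVS/QEMU/Modelica setup the abstract describes.

```python
import heapq

def run_virtual_test(controller, setpoint=22.0, horizon=100):
    """Drive a toy thermal plant with a controller callback in
    simulation time, returning the (time, temperature) trace."""
    temp, trace, events = 30.0, [], [(0, "sample")]
    while events:
        t, kind = heapq.heappop(events)          # next scheduled event
        if t > horizon:
            break
        if kind == "sample":
            cooling_on = controller(temp, setpoint)
            temp += -0.5 if cooling_on else 0.2  # crude plant dynamics per step
            trace.append((t, round(temp, 2)))
            heapq.heappush(events, (t + 1, "sample"))
    return trace

# "Software under test": a bang-bang thermostat
thermostat = lambda temp, sp: temp > sp
trace = run_virtual_test(thermostat)
final_temp = trace[-1][1]
```

    Because the loop advances an event clock instead of waiting on real hardware, thousands of such virtual test runs (varying setpoints, plant parameters, fault injections) cost only CPU time — the property that makes model-based testing attractive.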

  7. Agent-based simulation of animal behaviour

    NARCIS (Netherlands)

    C.M. Jonker (Catholijn); J. Treur

    1998-01-01

    In this paper it is shown how animal behaviour can be simulated in an agent-based manner. Different models are shown for different types of behaviour, varying from purely reactive behaviour to pro-active, social and adaptive behaviour. The compositional development method for

  8. 2D PIM Simulation Based on COMSOL

    DEFF Research Database (Denmark)

    Wang, Xinbo; Cui, Wanzhao; Wang, Jingyu;

    2011-01-01

    Passive intermodulation (PIM) is a problematic type of nonlinear distortion encountered in many communication systems. To analyze the PIM distortion resulting from material nonlinearity, a 2D PIM simulation method based on COMSOL is proposed in this paper. As an example, a rectangular wavegui...

  9. Simulation and case-based learning

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Guralnick, David

    2008-01-01

    This paper has its origin in the authors' reflection on years of practical experiences combined with literature readings in our preparation for a workshop on learn-by-doing simulation and case-based learning to be held at the ICELW 2008 conference (the International Conference on E...

  10. Accelerated GPU based SPECT Monte Carlo simulations

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency.

  11. Accelerated GPU based SPECT Monte Carlo simulations.

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational

  12. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has proved a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method, which combines the merits of traditional methods such as IDEF0 and Petri nets. A four-layer-one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers, such as hybrid production control modeling and human resource dispatch modeling, is expounded. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  13. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation

    DEFF Research Database (Denmark)

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki

    2017-01-01

    BACKGROUND: Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities...... DISCUSSION: Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting for simulations does not seem to influence individual and team learning. Department-based local simulation, such as simulation in-house and especially in situ simulation, leads to gains in organisational learning. The overall objectives of simulation-based education and factors...

  14. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation

    DEFF Research Database (Denmark)

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki

    2017-01-01

    BACKGROUND: Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities are called in-house training. In-house training facilities can be part of hospital departments and resemble to some extent simulation centres but often have less technical equipment. In situ simulation, introduced over the past decade, mainly comprises team-based activities and occurs in patient care...... Choice of setting for simulations does not seem to influence individual and team learning, but department-based local simulation, such as in-house simulation and especially in situ simulation, leads to gains in organisational learning. The overall objectives of simulation-based education and factors...

  15. Simulation-Based Abdominal Ultrasound Training

    DEFF Research Database (Denmark)

    Østergaard, Mikkel; Ewertsen, C; Konge, L;

    2016-01-01

    PURPOSE: The aim is to provide a complete overview of the different simulation-based training options for abdominal ultrasound and to explore the evidence of their effect. MATERIALS AND METHODS: This systematic review was performed according to the PRISMA guidelines, and Medline, Embase, Web of Science, and the Cochrane Library were searched. Articles were divided into three categories based on study design (randomized controlled trials, before-and-after studies and descriptive studies) and assessed for level of evidence using the Oxford Centre for Evidence Based Medicine (OCEBM) system...

  16. A web-based virtual lighting simulator

    Energy Technology Data Exchange (ETDEWEB)

    Papamichael, Konstantinos; Lai, Judy; Fuller, Daniel; Tariq, Tara

    2002-05-06

    This paper is about a web-based "virtual lighting simulator," which is intended to allow architects and lighting designers to quickly assess the effect of key parameters on daylighting and lighting performance in various space types. The virtual lighting simulator consists of a web-based interface that allows navigation through a large database of images and data, which were generated through parametric lighting simulations. In its current form, the virtual lighting simulator has two main modules, one for daylighting and one for electric lighting. The daylighting module includes images and data for a small office space, varying most key daylighting parameters, such as window size and orientation, glazing type, surface reflectance, sky conditions, time of the year, etc. The electric lighting module includes images and data for five space types (classroom, small office, large open office, warehouse and small retail), varying key lighting parameters, such as the electric lighting system, surface reflectance, dimming/switching, etc. The computed images include perspectives and plans and are displayed in various formats to support qualitative as well as quantitative assessment. The quantitative information is in the form of iso-contour lines superimposed on the images, as well as false-color images and statistical information on work-plane illuminance. The qualitative information includes images that are adjusted to account for the sensitivity and adaptation of the human eye. The paper also includes a section on the major technical issues and their resolution.

  17. Simulation-based design for infrastructure system simulation

    NARCIS (Netherlands)

    Fumarola, M.; Huang, Y.; Tekinay, C.; Seck, M.D.

    2010-01-01

    Simulation models are often used to analyze the behavior and performance of infrastructure systems. The use of simulation models in multi-actor design processes is restricted to the analysis phase after conceptual designs have been completed. To use simulation models throughout the design process, s

  18. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation

    NARCIS (Netherlands)

    Sorensen, J.L.; Ostergaard, D.; Leblanc, V.; Ottesen, B.; Konge, L.; Dieckmann, P.; Vleuten, C. van der

    2017-01-01

    BACKGROUND: Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities

  19. A PDE-based partial discharge simulator

    Science.gov (United States)

    Villa, Andrea; Barbieri, Luca; Gondola, Marco; Leon-Garzon, Andres R.; Malgesini, Roberto

    2017-09-01

    Partial discharges are the main ageing and failure mechanism of solid insulating materials subjected to alternating current stresses. From a simulation point of view, this phenomenon has almost always been tackled using semi-empirical schemes. In this work, a fully physically-based model, built on a set of conservation partial differential equations, is introduced. A numerical algorithm, specifically designed to solve this particular problem, is developed and its validation is discussed considering some experimental data acquired in a simple geometry containing an isolated void.

  20. Collaborative virtual experience based on reconfigurable simulation

    Science.gov (United States)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong

    2006-10-01

    Virtual Reality simulation enables an immersive 3D experience of a Virtual Environment. A simulation-based Virtual Environment can be used to map real-world phenomena onto virtual experience. With a reconfigurable simulation, users can reconfigure the parameters of the involved objects and see the different effects of the different configurations. This concept is suitable for classroom learning of the laws of physics. This research studies the Virtual Reality simulation of Newtonian physics on rigid-body objects. With network support, collaborative interaction is enabled so that people from different places can interact with the same set of objects in an immersive Collaborative Virtual Environment. The taxonomy of interaction at different levels of collaboration is described as: distinct objects and same object, the latter divided into same object - sequentially, same object - concurrently - same attribute, and same object - concurrently - distinct attributes. Two case studies examine user interaction: destroying and creating a set of arranged rigid bodies. In Virtual Domino, users can observe the laws of physics while applying force to the domino blocks in order to destroy the arrangements. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse using existing building blocks, under gravity effects.

  1. Collaborative Simulation Run-time Management Environment Based on HLA

    Institute of Scientific and Technical Information of China (English)

    王江云; 柴旭东; 王行仁

    2002-01-01

    The Collaborative Simulation Run-time Management Environment based on HLA (CSRME) mainly addresses simulation problems in the system design of complex distributed simulations. CSRME can integrate all simulation tools and simulation applications that comply with the well-documented interface standards defined by CSRME. CSRME supports both the interoperability of different simulations and the integration of simulation tools, and provides simulation run-time management, simulation time management and simulation data management. Finally, a distributed command training system is analyzed and realized to validate the theories of CSRME.

  2. Radiation simulations of the CMS detector

    Science.gov (United States)

    Stoddard, Graham J.

    This thesis presents results of recent radiation simulations for the Compact Muon Solenoid detector at the Large Hadron Collider at CERN, performed using the Monte Carlo simulation package FLUKA. High-statistics simulations with a fine granularity in the detector were carried out using the Condor batch system at the Fermilab LHC Physics Center. In addition, an existing web tool for accessing and displaying simulation data was upgraded. The FLUKA data and previously generated MARS Monte Carlo data can be plotted using 1-D or 2-D plotting functionalities along R and Z, the transverse distance from the beamline and the distance along the beamline, respectively. Comparisons between the data sets have been carried out: the effect of particle transport thresholds in both packages is explored, zero-field and full-field configurations of the CMS solenoid are compared, a model of non-ionizing energy losses is examined, and sensitive areas of interest within the simulation are identified.

  3. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  4. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    The results are described of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can we make the model behave like the observed system? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest-precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.
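    The envisionment generator's use of interval mathematics — bounding an uncertain quantitative model so that qualitative states can be read off — can be sketched with ordinary interval arithmetic plus a sign abstraction. This is a generic illustration invented here, not a reconstruction of Feyock's implementation; the example equation and parameter ranges are made up.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # interval addition: endpoints add componentwise
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # interval multiplication: extremes among the four endpoint products
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def sign(self):
        """Qualitative abstraction of the interval: '-', '0', '+', or '?'."""
        if self.lo > 0:
            return "+"
        if self.hi < 0:
            return "-"
        if self.lo == self.hi == 0:
            return "0"
        return "?"

# Uncertain gain k in [0.9, 1.1] and state x in [2, 3] (hypothetical values)
k, x = Interval(0.9, 1.1), Interval(2.0, 3.0)
dxdt = k * x + Interval(-1.0, -1.0)   # dx/dt = k*x - 1, evaluated on intervals
```

    Evaluating the model once over intervals decides the qualitative direction of change for a whole family of parameter values at a time, which is how an envisionment can enumerate qualitative states without running many point simulations.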

  5. SIMULATION OF SUBGRADE EMBANKMENT ON WEAK BASE

    Directory of Open Access Journals (Sweden)

    V. D. Petrenko

    2015-08-01

    Full Text Available Purpose. The stability of the subgrade on a weak base is considered, and the use of the jet-grouting method is proposed. The work investigates how a weak base affects the overall deformation of the subgrade, and identifies and optimizes subgrade parameters through numerical simulation. Methodology. Theoretical studies of the stress-strain state of the base and subgrade embankment were conducted by modelling in the LIRA software package. Findings. After the necessary calculations, settlement fields, the boundaries of the compressed thickness, and the Pasternak and Winkler bed coefficients are obtained. Diagrams of vertical stress can be constructed at any point of load application, and the software also allows comparative assessment of settlements and track tilts on natural and consolidated bases. Originality. For weak soils the most appropriate model is a nonlinear base model with both elastic and limit-equilibrium zones, a mixed problem of the theory of elasticity and plasticity. Practical value. When the load on a weak base increases, for example due to construction of a second track, a higher embankment or increased axle loads from new rolling stock, the processes of settlement and consolidation may resume. Therefore, one feasible and promising option for the design and reconstruction of embankments on weak bases is strengthening the base by jet grouting. With the expansion of railway infrastructure and the increasing speed and weight of rolling stock, the stability of the subgrade on weak bases must be ensured; the LIRA software package allows all the necessary calculations for selecting a proper way of strengthening weak bases.

  6. Simulation Based Earthquake Forecasting with RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Dieterich, J. H.; Richards-Dinger, K. B.

    2016-12-01

    We are developing a physics-based forecasting model for earthquake ruptures in California. We employ the 3D boundary element code RSQSim to generate synthetic catalogs with millions of events that span up to a million years. The simulations incorporate rate-state fault constitutive properties in complex, fully interacting fault systems. The Unified California Earthquake Rupture Forecast Version 3 (UCERF3) model and data sets are used for calibration of the catalogs and specification of fault geometry. Fault slip rates match the UCERF3 geologic slip rates and catalogs are tuned such that earthquake recurrence matches the UCERF3 model. Utilizing the Blue Waters Supercomputer, we produce a suite of million-year catalogs to investigate the epistemic uncertainty in the physical parameters used in the simulations. In particular, values of the rate- and state-friction parameters a and b, the initial shear and normal stress, as well as the earthquake slip speed, are varied over several simulations. In addition to testing multiple models with homogeneous values of the physical parameters, the parameters a, b, and the normal stress are varied with depth as well as in heterogeneous patterns across the faults. Cross validation of UCERF3 and RSQSim is performed within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM) to determine the effect of the uncertainties in physical parameters observed in the field and measured in the lab on the uncertainties in probabilistic forecasting. We are particularly interested in the short-term hazards of multi-event sequences due to complex faulting and multi-fault ruptures.

  7. Sea battle-field simulation based on Vega

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jing; CHEN Jie; GUO Mao-zu

    2009-01-01

    To study battlefield simulation methods based on Vega, a virtual battlefield simulating an imaginary combat at sea was designed. The simulation framework for the sea battlefield included helicopter simulation, fire simulation, collision detection and detonation, and simulation of a dynamic sea surface. The method to build the simulation environments and the actions within them is discussed, and simulation experiments were conducted. The results indicate that the sea battlefield simulated with Vega is feasible and helpful for force and battlefield training.

  8. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation.

    Science.gov (United States)

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki; Ottesen, Bent; Konge, Lars; Dieckmann, Peter; Van der Vleuten, Cees

    2017-01-21

    Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities are called in-house training. In-house training facilities can be part of hospital departments and resemble to some extent simulation centres but often have less technical equipment. In situ simulation, introduced over the past decade, mainly comprises team-based activities and occurs in patient care units with healthcare professionals in their own working environment. Thus, this intentional blend of simulation and real working environments means that in situ simulation brings simulation to the real working environment and provides training where people work. In situ simulation can be either announced or unannounced, the latter also known as a drill. This article presents and discusses the design of SBME and the advantages and disadvantages of the different simulation settings, such as training in simulation centres, in-house simulations in hospital departments, and announced or unannounced in situ simulations. Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting does not seem to influence individual or team learning. However, hospital department-based simulations, such as in-house simulation and in situ simulation, lead to a gain in organisational learning. To our knowledge no studies have compared announced and unannounced in situ simulation. The literature suggests some improved organisational learning from unannounced in situ simulation; however, unannounced in situ simulation was also found to be challenging to plan and conduct, and more stressful among participants. The importance of

  9. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  11. Simulation based virtual learning environment in medical genetics counseling

    DEFF Research Database (Denmark)

    Makransky, Guido; Bonde, Mads T.; Wulff, Julie S. G.

    2016-01-01

    BACKGROUND: Simulation based learning environments are designed to improve the quality of medical education by allowing students to interact with patients, diagnostic laboratory procedures, and patient data in a virtual environment. However, few studies have evaluated whether simulation based lea...

  12. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
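
The CPM forward/backward pass that the abstract critiques can be sketched in a few lines. The four-task graph, durations, and dependencies below are invented for illustration; real tools add calendars, resources, and lags on top of this core.

```python
# Minimal critical-path (CPM) computation over a toy task graph.
# Task names, durations and dependencies are illustrative only.
def critical_path(durations, preds):
    # Forward pass: earliest start/finish (tasks listed in topological order).
    es, ef = {}, {}
    for t in durations:
        es[t] = max((ef[p] for p in preds[t]), default=0)
        ef[t] = es[t] + durations[t]
    makespan = max(ef.values())
    # Backward pass: latest start/finish; zero-slack tasks are critical.
    lf, ls = {}, {}
    for t in reversed(list(durations)):
        succs = [s for s in durations if t in preds[s]]
        lf[t] = min((ls[s] for s in succs), default=makespan)
        ls[t] = lf[t] - durations[t]
    critical = [t for t in durations if es[t] == ls[t]]
    return makespan, critical

durations = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
print(critical_path(durations, preds))  # (8, ['A', 'C', 'D'])
```

Note how task duration is a fixed input here, which is exactly the first flaw the abstract identifies; a resource-based simulator would instead derive durations from assigned resources and productivity.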

  13. Response of a BGO detector to photon and neutron sources simulations and measurements

    CERN Document Server

    Vincke, H H; Fabjan, Christian Wolfgang; Otto, T

    2002-01-01

    In this paper Monte Carlo simulations (FLUKA) and measurements of the response of a BGO detector are reported. For the measurements, three low-energy photon emitters (60Co, 54Mn, 137Cs) were used to irradiate the BGO from various distances and angles. The neutron response was measured with an Am-Be neutron source. Simulations of the experimental irradiations were carried out. Our study can also be considered as a benchmark for FLUKA in terms of its reliability to predict the detector response of a BGO scintillator.

  14. Airway management in a bronchoscopic simulator based setting

    DEFF Research Database (Denmark)

    Graeser, Karin; Konge, Lars; Kristensen, Michael S

    2014-01-01

    BACKGROUND: Several simulation-based possibilities for training flexible optical intubation have been developed, ranging from non-anatomical phantoms to high-fidelity virtual reality simulators. These teaching devices might also be used to assess the competence of trainees before allowing them....... The anaesthetists in our study agreed completely that simulation-based training was useful regardless of the fidelity of the simulator. Local, practical issues such as cost and portability should decide available simulation modalities in each teaching hospital....

  15. Interfacing MCNPX and McStas for simulation of neutron transport

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Lauritzen, Bent; Nonbøl, Erik

    2013-01-01

    Simulations of target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX[1] or FLUKA[2, 3] whereas simulations of neutron transport from the moderator and the instrument response are performed by neutron ray tracing codes such as Mc...

  16. Cloud GPU-based simulations for SQUAREMR

    Science.gov (United States)

    Kantasis, George; Xanthis, Christos G.; Haris, Kostas; Heiberg, Einar; Aletras, Anthony H.

    2017-01-01

    Quantitative Magnetic Resonance Imaging (MRI) is a research tool, used more and more in clinical practice, as it provides objective information with respect to the tissues being imaged. Pixel-wise T1 quantification (T1 mapping) of the myocardium is one such application with diagnostic significance. A number of mapping sequences have been developed for myocardial T1 mapping with a wide range in terms of measurement accuracy and precision. Furthermore, measurement results obtained with these pulse sequences are affected by errors introduced by the particular acquisition parameters used. SQUAREMR is a new method which has the potential of improving the accuracy of these mapping sequences through the use of massively parallel simulations on Graphical Processing Units (GPUs) by taking into account different acquisition parameter sets. This method has been shown to be effective in myocardial T1 mapping; however, execution times may exceed 30 min which is prohibitively long for clinical applications. The purpose of this study was to accelerate the construction of SQUAREMR's multi-parametric database to more clinically acceptable levels. The aim of this study was to develop a cloud-based cluster in order to distribute the computational load to several GPU-enabled nodes and accelerate SQUAREMR. This would accommodate high demands for computational resources without the need for major upfront equipment investment. Moreover, the parameter space explored by the simulations was optimized in order to reduce the computational load without compromising the T1 estimates compared to a non-optimized parameter space approach. A cloud-based cluster with 16 nodes resulted in a speedup of up to 13.5 times compared to a single-node execution. Finally, the optimized parameter set approach allowed for an execution time of 28 s using the 16-node cluster, without compromising the T1 estimates by more than 10 ms. 
The developed cloud-based cluster and optimization of the parameter set reduced

  17. Simulation based engineering in solid mechanics

    CERN Document Server

    Rao, J S

    2017-01-01

    This book begins with a brief historical perspective of the advent of rotating machinery in 20th century Solid Mechanics and the development of the discipline of the Strength of Materials. High Performance Computing (HPC) and Simulation Based Engineering Science (SBES) have gradually replaced the conventional approach in Design, bringing science directly into engineering without approximations. A recap of the required mathematical principles is given. The science of deformation, strain and stress at a point under the application of external traction loads is next presented. Only one-dimensional structures classified as Bars (axial loads), Rods (twisting loads) and Beams (bending loads) are considered in this book. The principal stresses and strains and the von Mises stress and strain that are used in the design of structures are next presented. A Lagrangian solution was used to derive the governing differential equations consistent with the assumed deformation field, and solutions for deformations, strains and stresses were obtai...

  18. Cycle-Based Algorithm Used to Accelerate VHDL Simulation

    Institute of Scientific and Technical Information of China (English)

    杨勋; 刘明业

    2000-01-01

    Cycle-based algorithm has very high performance for the simulation of synchronous designs, but it is confined to synchronous design and is not as accurate as the event-driven algorithm. In this paper, a revised cycle-based algorithm is proposed and implemented in a VHDL simulator. An event-driven simulation engine and a cycle-based simulation engine have been embedded in the same simulation environment and can be applied to asynchronous and synchronous designs respectively. Thus the simulation performance is improved without losing the flexibility and accuracy of the event-driven algorithm.
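
The core idea of cycle-based simulation is that all signals are re-evaluated exactly once per clock edge in a fixed order, instead of scheduling individual signal-change events. A minimal sketch, using an invented two-flip-flop circuit (a toggle bit feeding an XOR), assuming purely synchronous logic:

```python
# Toy cycle-based simulation: one evaluation sweep per clock cycle.
# The circuit (toggle flip-flop + XOR) is invented for illustration.
def cycle_sim(cycles):
    q0, q1 = 0, 0                # flip-flop states at reset
    trace = []
    for _ in range(cycles):
        d0 = 1 - q0              # combinational logic: toggle
        d1 = q0 ^ q1             # combinational logic: XOR feedback
        q0, q1 = d0, d1          # single synchronous update per cycle
        trace.append((q0, q1))
    return trace

print(cycle_sim(4))  # [(1, 0), (0, 1), (1, 1), (0, 0)]
```

An event-driven simulator would instead keep a time-ordered event queue and only re-evaluate gates whose inputs changed, which is slower for dense synchronous activity but handles asynchronous timing correctly.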

  19. A rainfall simulator based on multifractal generator

    Science.gov (United States)

    Akrour, Nawal; mallet, Cecile; barthes, Laurent; chazottes, Aymeric

    2015-04-01

    illustrating the simulator's capabilities will be provided. They show that the simulated two-dimensional fields have statistical properties coherent with the observed ones at different spatial scales (1, 4, 16 km2), in terms of cumulative rain rate distribution but also of power spectrum and structure function, indicating that scale features are well represented by the model. Keywords: precipitation, multifractal modeling, variogram, structure function, scale invariance, rain intermittency Akrour, N., Aymeric; C., Verrier, S., Barthes, L., Mallet, C.: 2013. Calibrating synthetic multifractal times series with observed data. International Precipitation Conference (IPC 11), Wageningen, The Netherlands http://www.wageningenur.nl/upload_mm/7/5/e/a72f004a-8e66-445c-bb0b-f489ed0ff0d4_Abstract%20book_TotaalLR-SEC.pdf Akrour, N., Aymeric; C., Verrier, S., Mallet, C., Barthes, L.: 2014: Simulation of yearly rainfall time series at micro-scale resolution with actual properties: intermittency, scale invariance, rainfall distribution, submitted to Water Resources Research (under revision) Schertzer, D., S. Lovejoy, 1987: Physically based rain and cloud modeling by anisotropic, multiplicative turbulent cascades. J. Geophys. Res. 92, 9692-9714 Schleiss, M., S. Chamoun, and A. Berne (2014), Stochastic simulation of intermittent rainfall using the concept of dry drift, Water Resources Research, 50 (3), 2329-2349

  20. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  1. Keystream Generator Based On Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Ayad A. Abdulsalam

    2011-01-01

    Full Text Available Advances in the design of keystream generators using heuristic techniques are reported. A simulated annealing algorithm for generating random keystreams with large complexity is presented. The simulated annealing technique is adapted to meet these requirements. The definitions for some cryptographic properties are generalized, providing a measure suitable for use as an objective function in a simulated annealing algorithm, seeking keystreams that satisfy both correlation immunity and large linear complexity. Results are presented demonstrating the effectiveness of the method.
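
The search loop the abstract describes can be sketched as simulated annealing over binary keystreams. The objective below (balance plus bit transitions) is a deliberately simple stand-in for the paper's correlation-immunity and linear-complexity measures, and all parameters are illustrative; it is not cryptographically meaningful.

```python
import math
import random

# Stand-in objective: prefer balanced keystreams with many 0/1 transitions.
def fitness(bits):
    balance = -abs(sum(bits) - len(bits) // 2)
    transitions = sum(a != b for a, b in zip(bits, bits[1:]))
    return balance + transitions

def anneal(n=64, steps=2000, temp=2.0, cooling=0.995, seed=1):
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    f = fitness(state)
    best, best_f = state[:], f
    for _ in range(steps):
        i = rng.randrange(n)
        state[i] ^= 1                    # propose: flip one bit
        nf = fitness(state)
        # Accept improvements always, worsenings with Boltzmann probability.
        if nf >= f or rng.random() < math.exp((nf - f) / temp):
            f = nf
            if f > best_f:
                best, best_f = state[:], f
        else:
            state[i] ^= 1                # reject: revert the flip
        temp *= cooling                  # geometric cooling schedule
    return best, best_f

keystream, score = anneal()
print(score)
```

Swapping in the paper's actual measures would only change `fitness`; the acceptance rule and cooling schedule are the generic annealing machinery.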

  2. Simulation-Based System Design Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The research objective is to develop, test, and implement effective and efficient simulation techniques for modeling, evaluating, and optimizing systems in order to...

  3. Relaxation Based Electrical Simulation for VLSI Circuits

    Directory of Open Access Journals (Sweden)

    S. Rajkumar

    2012-06-01

    Full Text Available Electrical circuit simulation was one of the first CAD tools developed for IC design. The conventional circuit simulators like SPICE and ASTAP were designed initially for the cost effective analysis of circuits containing a few hundred transistors or less. A number of approaches have been used to improve the performance of conventional circuit simulators for the analysis of large circuits. Thereafter, relaxation methods were proposed to provide more accurate waveforms than standard circuit simulators, with up to two orders of magnitude speed improvement for large circuits. In this paper we highlight recently used waveform and point relaxation techniques for the simulation of VLSI circuits. We also propose a simple parallelization technique and experimentally demonstrate that we can solve digital circuits with tens of millions of transistors in a few hours.
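
Point relaxation solves the nodal equations G·v = I by sweeping node by node (Gauss-Seidel) instead of factoring the full matrix as SPICE does. A minimal sketch on an invented three-node resistive ladder (1 ohm resistors, 1 A injected at node 0):

```python
# Point (Gauss-Seidel) relaxation for nodal analysis of a resistive network.
# The 3-node ladder and its conductance matrix are invented for illustration.
def gauss_seidel(G, I, iters=100):
    """Iteratively solve G @ v = I, sweeping nodes in order each pass."""
    n = len(I)
    v = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            coupling = sum(G[i][j] * v[j] for j in range(n) if j != i)
            v[i] = (I[i] - coupling) / G[i][i]   # update node i in place
    return v

G = [[ 2.0, -1.0,  0.0],      # tridiagonal conductance matrix of the ladder
     [-1.0,  2.0, -1.0],
     [ 0.0, -1.0,  2.0]]
I = [1.0, 0.0, 0.0]
v = gauss_seidel(G, I)
print([round(x, 3) for x in v])  # [0.75, 0.5, 0.25]
```

Waveform relaxation extends the same idea to the time domain, iterating whole node voltage waveforms rather than single values, which is what makes the per-subcircuit parallelization mentioned in the abstract natural.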

  4. The FLUKA Monte Carlo code coupled with the local effect model for biological calculations in carbon ion therapy

    CERN Document Server

    Mairani, A; Kraemer, M; Sommerer, F; Parodi, K; Scholz, M; Cerutti, F; Ferrari, A; Fasso, A

    2010-01-01

    Clinical Monte Carlo (MC) calculations for carbon ion therapy have to provide absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum fur Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superimposition of two opposed C-12 ion fields as applied in therapeutic irradiations. The obtained results are compared with the available experimental data of CHO (Chinese hamster ovary) cell survival and the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-d...

  5. A Monte Carlo transport code study of the space radiation environment using FLUKA and ROOT

    CERN Document Server

    Wilson, T; Carminati, F; Brun, R; Ferrari, A; Sala, P; Empl, A; MacGibbon, J

    2001-01-01

    We report on the progress of a current study aimed at developing a state-of-the-art Monte-Carlo computer simulation of the space radiation environment using advanced computer software techniques recently available at CERN, the European Laboratory for Particle Physics in Geneva, Switzerland. By taking the next-generation computer software appearing at CERN and adapting it to known problems in the implementation of space exploration strategies, this research is identifying changes necessary to bring these two advanced technologies together. The radiation transport tool being developed is tailored to the problem of taking measured space radiation fluxes impinging on the geometry of any particular spacecraft or planetary habitat and simulating the evolution of that flux through an accurate model of the spacecraft material. The simulation uses the latest known results in low-energy and high-energy physics. The output is a prediction of the detailed nature of the radiation environment experienced in space as well a...

  6. Conceptual Modelling for Simulation-Based Serious Gaming

    NARCIS (Netherlands)

    van der Zee, D.J.; Holkenborg, B.; Johansson, B; Jain, S; MontoyaTorres, J; Hugan, J; Yucesan, E

    2010-01-01

    In recent years several simulation-based serious games have been developed for mastering new business concepts in operations management. This indicates the high potential of simulation use for pedagogical purposes. Unfortunately, this potential is hardly reflected in simulation methodology. We consi

  7. Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes

    CERN Document Server

    Krainer, Alexander Michael

    2015-01-01

    This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate the radionuclide production in high energy fields. The decay of these nuclides, and therefore the resulting radiation field, can however only be simulated in the same geometry. This work provides the tool to simulate the decay of the produced nuclides in other geometries. With that, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $Cs^{137}$ and $Co^{60}$. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...
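
The two sampling steps described (uniform decay positions in a volume, plus isotope choice from a 50/50 mixture) can be sketched as below. The cylindrical geometry and all dimensions are assumed for illustration, and the coupling to FLUKA itself is omitted; note the square root on the radial draw, which keeps the point density uniform over the cross-section.

```python
import math
import random

# Sample decay events uniformly inside a cylinder, choosing the decaying
# isotope from a 50/50 Cs-137 / Co-60 mixture. Geometry is assumed.
def sample_source(n, radius=1.0, height=2.0, seed=42):
    rng = random.Random(seed)
    events = []
    for _ in range(n):
        r = radius * math.sqrt(rng.random())   # sqrt => uniform areal density
        phi = 2.0 * math.pi * rng.random()
        z = height * rng.random()
        isotope = "Cs137" if rng.random() < 0.5 else "Co60"
        events.append((r * math.cos(phi), r * math.sin(phi), z, isotope))
    return events

events = sample_source(10000)
frac_cs = sum(e[3] == "Cs137" for e in events) / len(events)
print(round(frac_cs, 2))
```

A histogram test like the one mentioned in the abstract would bin the sampled coordinates and compare bin counts against the expected uniform density.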

  8. Mosquito population dynamics from cellular automata-based simulation

    Science.gov (United States)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow the existing model for modeling the mosquito life cycle. Moreover, we use a cellular automata-based model for simulating dispersal of the vector. In the simulation, each individual vector is able to move to other grid cells based on a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulate the model to evaluate its correctness. Based on the simulations, we can conclude that our model is correct. However, our model needs to be improved to find realistic parameters to match real data.
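
The dispersal stage described above amounts to a random walk on a grid. A minimal sketch in that spirit, with the grid size, population count, and reflecting borders all invented for illustration (the demography stage and per-cell immunity factor are omitted):

```python
import random

# One cellular-automata dispersal step: every mosquito moves to a
# neighbouring cell (or stays) via a random walk. Setup is invented.
def disperse(grid, seed=0):
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)]
    for i in range(rows):
        for j in range(cols):
            for _ in range(grid[i][j]):          # each individual moves
                di, dj = rng.choice(moves)
                ni = min(max(i + di, 0), rows - 1)   # clamp at borders
                nj = min(max(j + dj, 0), cols - 1)
                nxt[ni][nj] += 1
    return nxt

grid = [[0, 0, 0], [0, 100, 0], [0, 0, 0]]
after = disperse(grid)
print(after, sum(map(sum, after)))  # total population stays 100
```

Iterating `disperse` spreads the initial point population outward; the population conservation check is the kind of correctness evaluation the abstract refers to.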

  9. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    Science.gov (United States)

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.

  10. Traffic and Driving Simulator Based on Architecture of Interactive Motion

    Directory of Open Access Journals (Sweden)

    Alexander Paz

    2015-01-01

    Full Text Available This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.

  11. Observer-based Satellite Attitude Control and Simulation Researches

    Institute of Scientific and Technical Information of China (English)

    王子才; 马克茂

    2002-01-01

    The observer design method is applied to the realization of a satellite attitude control law based on a simplified control model. An exact mathematical model of the satellite attitude control system is also constructed, together with the observer-based control law, to conduct simulation research. The simulation results justify the effectiveness and feasibility of the observer-based control method.

  12. Assessment of Clinical Competence: Written and Computer-Based Simulations.

    Science.gov (United States)

    Swanson, David B.; And Others

    1987-01-01

    Literature concerning the validity and reliability of both written and computer-based simulations in assessing clinical competence in the health professions is reviewed, and suggestions are given for the improvement of the psychometric qualities of simulation-based tests. (MSE)

  13. Budget Time: A Gender-Based Negotiation Simulation

    Science.gov (United States)

    Barkacs, Linda L.; Barkacs, Craig B.

    2017-01-01

    This article presents a gender-based negotiation simulation designed to make participants aware of gender-based stereotypes and their effect on negotiation outcomes. In this simulation, the current research on gender issues is animated via three role sheets: (a) Vice president (VP), (b) advantaged department head, and (c) disadvantaged department…

  14. Performance optimization of web-based medical simulation.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2013-01-01

    This paper presents a technique for performance optimization of multimodal interactive web-based medical simulation. A web-based simulation framework is promising for easy access and wide dissemination of medical simulation. However, the real-time performance of the simulation highly depends on hardware capability on the client side. Providing consistent simulation on different hardware is critical for reliable medical simulation. This paper proposes a non-linear mixed integer programming model to optimize the performance of visualization and physics computation while considering hardware capability and application specific constraints. The optimization model identifies and parameterizes the rendering and computing capabilities of the client hardware using an exploratory proxy code. The parameters are utilized to determine the optimized simulation conditions including texture sizes, mesh sizes and canvas resolution. The test results show that the optimization model not only achieves the desired frame rate but also resolves visual artifacts due to low performance hardware.

  15. Agent-based Simulation of the Maritime Domain

    Directory of Open Access Journals (Sweden)

    O. Vaněk

    2010-01-01

    Full Text Available In this paper, a multi-agent based simulation platform is introduced that focuses on legitimate and illegitimate aspects of maritime traffic, mainly on intercontinental transport through piracy afflicted areas. The extensible architecture presented here comprises several modules controlling the simulation and the life-cycle of the agents, analyzing the simulation output and visualizing the entire simulated domain. The simulation control module is initialized by various configuration scenarios to simulate various real-world situations, such as a pirate ambush, coordinated transit through a transport corridor, or coastal fishing and local traffic. The environmental model provides a rich set of inputs for agents that use the geo-spatial data and the vessel operational characteristics for their reasoning. The agent behavior model based on finite state machines together with planning algorithms allows complex expression of agent behavior, so the resulting simulation output can serve as a substitution for real world data from the maritime domain.

  16. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  17. Issues of Simulation-Based Route Assignment

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, K.; Rickert, M.

    1999-07-20

    The authors use an iterative re-planning scheme with simulation feedback to generate a self-consistent route-set for a given street network and origin-destination matrix. The iteration process is defined by three parameters. They found that these parameters influence the speed of the relaxation, but not necessarily its final state.
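
The relaxation loop can be illustrated on a two-route toy network: each iteration, travelers are re-planned toward the currently faster route, with the step size shrinking over iterations (a method-of-successive-averages update). The network, cost functions, and demand below are invented for illustration, not taken from the paper.

```python
# Iterative route assignment with feedback on an invented 2-route network.
# Link travel time grows linearly with load; equilibrium split is 750/250.
def relax_assignment(total=1000.0, iters=100):
    on_a = total                              # start: everyone on route A
    for k in range(1, iters + 1):
        t_a = 10 + 0.01 * on_a                # congested travel times
        t_b = 15 + 0.01 * (total - on_a)
        target = total if t_a < t_b else 0.0  # all-or-nothing re-plan
        on_a += (target - on_a) / k           # successive-averages step
    return on_a

final = relax_assignment()
print(round(final))  # oscillates toward the equilibrium split near 750
```

The shrinking 1/k step size plays the role of the relaxation parameters the abstract mentions: a larger step relaxes faster but oscillates more, while the fixed point itself is a property of the network, not of the iteration.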

  18. Agent-based simulation of animal behaviour

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    2001-01-01

    In the biological literature on animal behaviour, in addition to real experiments and field studies, also simulation experiments are a useful source of progress. Often specific mathematical modelling techniques are adopted and directly implemented in a programming language. Modelling more complex ag

  19. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2011-01-01

    In this work, we describe an automated method for directing the control of a high resolution gaseous fluid simulation based on the results of a lower resolution preview simulation. Small variations in accuracy between low and high resolution grids can lead to divergent simulations, which is problematic for those wanting to achieve a desired behavior. Our goal is to provide a simple method for ensuring that the high resolution simulation matches key properties from the lower resolution simulation. We first let a user specify a fast, coarse simulation that will be used for guidance. Our automated method samples the data to be matched at various positions and scales in the simulation, or allows the user to identify key portions of the simulation to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems, and can ensure consistency of not only the velocity field, but also advected scalar values. Because the final simulation is naturally similar to the preview simulation, only minor controlling adjustments are needed, allowing a simpler control method than that used in prior keyframing approaches. Copyright © 2011 by the Association for Computing Machinery, Inc.

  20. Simulation of Screening Process Based on MATLAB/Simulink

    Institute of Scientific and Technical Information of China (English)

    YANG Ying-jie; DENG Hui-yong; LI Xia

    2006-01-01

    Screening is an important process in the mineral industry. In this paper, a study has been made to simulate the screening process based on the high-performance MATLAB/Simulink software, with an example of simulating the sieving process of a vibrating screen. A simulation model of the sieving process with a vibrating screen (SMSPVS) was proposed, using correlative mathematical models and Simulink blocks. The results show that the simulation data were very close to the actual data. The minimum errors of the size distribution of oversize and undersize are 0.65% and 0.20%, respectively. The sieving process can be accurately simulated by the SMSPVS.

  1. Simulation Based Optimization for World Line Card Production System

    Directory of Open Access Journals (Sweden)

    Sinan APAK

    2012-07-01

    Full Text Available Simulation-based decision support systems are among the most commonly used tools to examine complex production systems. The simulation approach provides process modules which can be adjusted with certain parameters, using data relatively easily obtainable from the production process. A World Line Card production system simulation is developed to evaluate the optimality of the existing production line, using a discrete event simulation model with a variety of alternative proposals. The current production system is analysed by a simulation model emphasizing the bottlenecks and the poorly utilized production line. Our analysis identified some improvements and efficient solutions for the existing system.

  2. GIS-Based Simulation of Engineering Construction Schedule

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jing; ZHONG Denghua; HU Chengshun

    2005-01-01

    Because of its complexity, engineering construction schedule design is limited by various factors. Simulation-based engineering construction scheduling takes a critical path method (CPM) network as its frame and calls a complex cyclic operation network (CYCLONE) simulation model, prepared in advance, for its simulation nodes. The CYCLONE simulation model takes charge of simulating the stochastic duration of activities and the efficiency of resources, while the CPM model performs project scheduling. This combination avoids the shortcomings of both models. Furthermore, geographic information system (GIS) techniques are utilized to visualize the construction processes, which are otherwise difficult to understand from static results. Application in a practical project verifies the feasibility and advantages of the technique.

  3. Analysis for Radiation and Shielding Dose in Plasma Focus Neutron Source Using FLUKA

    Science.gov (United States)

    Nemati, M. J.; Amrollahi, R.; Habibi, M.

    2012-06-01

    Monte Carlo simulations have been performed for the attenuation of neutron radiation produced at plasma focus (PF) devices through various shielding designs. At the test site the device will be fired with deuterium and tritium (D-T) fusion, resulting in a yield of about 10^13 fusion neutrons of 14 MeV. This poses a radiological hazard to scientists and personnel operating the device. The goal of this paper was to evaluate various shielding options under consideration for the PF operating with D-T fusion. Shields of varying neutron-shielding effectiveness were investigated using concrete, polyethylene, paraffin and borated materials. The most effective shield, a labyrinth structure, allowed almost 1,176 shots per year while keeping personnel under 20 mSv of dose. The most expensive shield, a square shield with 100 cm of concrete on the walls and borated paraffin along with borated polyethylene added outside the concrete, allowed almost 15,000 shots per year.
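The dose bookkeeping behind the "shots per year" figures can be sketched with a simple exponential-attenuation model (an illustrative assumption only; the attenuation length, unshielded dose per shot, and dose limit below are invented, and a real shielding study uses Monte Carlo transport as in the paper):

```python
import math

def annual_shots(dose_limit_mSv, dose_per_shot_unshielded_mSv,
                 thickness_cm, attenuation_length_cm):
    """Shots per year allowed under a dose limit, assuming simple
    exponential attenuation of the 14 MeV neutron dose in the shield.
    All parameter values used with this sketch are hypothetical."""
    dose_per_shot = dose_per_shot_unshielded_mSv * math.exp(
        -thickness_cm / attenuation_length_cm)
    return int(dose_limit_mSv / dose_per_shot)
```

A thicker wall attenuates the per-shot dose exponentially, so the allowed number of shots grows exponentially with thickness, which is why the 100 cm concrete option admits an order of magnitude more shots than thinner shields.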

  4. Motion control simulation based on VR for humanoid robot

    Science.gov (United States)

    He, Huaiqing; Tang, Haoxuan

    2004-03-01

    This paper describes motion control simulation based on VR for a humanoid robot, aiming at walking and running. To ensure that the motion rhythm of the humanoid robot conforms to the motion laws of humans, the body geometrical model based on a skeleton and its kinematics models based on graphs of time sequences are presented first. Then a control algorithm based on the Jacobian matrix is proposed to generate periodical walking and running. Finally, computer simulation experiments demonstrate the feasibility of the models and the algorithm. The simulation system developed allows us to interactively regulate the motion direction and velocity of the humanoid robot.

  5. Research of Stamp Forming Simulation Based on Finite Element Method

    Institute of Scientific and Technical Information of China (English)

    SU Xaio-ping; XU Lian

    2008-01-01

    We point out that the finite element method offers a great functional improvement for analyzing the stamp forming process of an automobile panel. Using finite element theory and the simulation method of sheet stamp forming, the element model of sheet forming is built based on the software HyperMesh, and the product's sheet forming process is simulated and analyzed based on the software Dynaform. A series of simulation results are obtained. It is clear that the simulation results form the theoretical basis for the product's die design and are useful for selecting process parameters.

  6. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    1994-01-01

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantics and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.
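The cycle-time idea can be illustrated with a toy timing simulation of a marked graph (a sketch under stated assumptions, not the paper's O(b²m) algorithm: the edge encoding, fixed-point relaxation, and slope estimator below are all our own illustrative choices):

```python
from collections import defaultdict

def estimate_cycle_time(edges, n_iter=200):
    """Estimate the cycle time of a live marked graph by simulating
    transition firing times; edges are (src, dst, delay, initial_tokens)."""
    in_edges = defaultdict(list)
    nodes = set()
    for u, v, d, m in edges:
        in_edges[v].append((u, d, m))
        nodes.update((u, v))
    # T[v][k]: time of the k-th firing of transition v
    T = {v: [0.0] * (n_iter + 1) for v in nodes}
    for k in range(1, n_iter + 1):
        # relax until stable so zero-token (same-iteration) edges settle
        for _ in range(len(nodes)):
            for v in nodes:
                t = 0.0
                for u, d, m in in_edges[v]:
                    j = k - m  # this firing consumes u's (k - m)-th token
                    t = max(t, (T[u][j] if j >= 1 else 0.0) + d)
                T[v][k] = t
    v = next(iter(nodes))
    half = n_iter // 2
    # asymptotic slope of the firing times approximates the cycle time
    return (T[v][n_iter] - T[v][half]) / (n_iter - half)
```

For a two-transition loop with total delay 5 and one token on the loop, the estimate converges to the exact cycle time 5 (max over cycles of total delay divided by total tokens).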

  7. Air Pollution Simulation based on different seasons

    Science.gov (United States)

    Muhaimin

    2017-01-01

    Simulation of the distribution of pollutants (SOx and NOx) emitted by Cirebon power plant activities has been carried out. Gaussian models and scenarios are used to predict the concentrations of pollutant gases. The purposes of this study were to determine the distribution of the flue gas from the power plant activity and the differences in pollutant gas concentrations between the wet and dry seasons. The results showed that the concentrations of pollutant gases in the dry season were higher than in the wet season. The difference in pollutant concentration is due to differences in wind speed, gas flow rate, and the temperature of the gas flowing out of the chimney. The maximum concentration of pollutant gases in the wet season is 30.14 µg/m3 for SOx and 26.35 µg/m3 for NOx, while in the dry season it is 42.38 µg/m3 for SOx and 34.78 µg/m3 for NOx.
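A single-receptor version of such a Gaussian plume model can be sketched as follows (a minimal sketch: the dispersion parameters sigma_y and sigma_z are passed in directly rather than derived from stability classes, and none of the values are taken from the study):

```python
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (e.g. g/m^3 if Q
    is in g/s): Q emission rate, u wind speed, y crosswind offset,
    z receptor height, H effective stack height."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # ground reflection
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The seasonal contrast reported in the abstract enters this formula through u and the sigmas: for instance, doubling the wind speed exactly halves the predicted concentration at every receptor.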

  8. Physically based simulation of Buddha Glory

    Institute of Scientific and Technical Information of China (English)

    LIU Shiguang; WANG Zhangye; GONG Zheng; WANG Changbo; PENG Qunsheng

    2006-01-01

    This paper proposes a novel method for the photorealistic simulation of Buddha Glory, a natural phenomenon of great visual beauty which can be observed in the area of famous Buddhist holy lands in China. This phenomenon is mainly caused by the back-scattering of water droplets in fog or cloud. To simulate the glory, we first calculate the spectral scattering intensity of the glory rings using Mie scattering theory. We then present a new shadow model to determine the deformed shape of the head and body of the "Buddha" within the glory ring. The effect of atmospheric parameters, such as the solar elevation angle and the density and size of water droplets in fog or cloud, on the shape and color of Buddha Glory is also calculated. For rendering, we adopt the method of path scattering integrals to generate the whole attenuated scene of Buddha Glory. Compared with photographs of real Buddha Glory displays, our synthetic results are quite satisfactory.

  9. Simulation-based training in echocardiography.

    Science.gov (United States)

    Biswas, Monodeep; Patel, Rajendrakumar; German, Charles; Kharod, Anant; Mohamed, Ahmed; Dod, Harvinder S; Kapoor, Poonam Malhotra; Nanda, Navin C

    2016-10-01

    The knowledge gained from echocardiography is paramount for the clinician in diagnosing, interpreting, and treating various forms of disease. While cardiologists traditionally have undergone training in this imaging modality during their fellowship, many other specialties are beginning to show interest as well, including intensive care, anesthesia, and primary care trainees, in both transesophageal and transthoracic echocardiography. Advances in technology have led to the development of simulation programs accessible to trainees to help gain proficiency in the nuances of obtaining quality images, in a low stress, pressure free environment, often with a functioning ultrasound probe and mannequin that can mimic many of the pathologies seen in living patients. Although there are various training simulation programs each with their own benefits and drawbacks, it is clear that these programs are a powerful tool in educating the trainee and likely will lead to improved patient outcomes.

  10. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    Science.gov (United States)

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation exhibits a variety of fascinating phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and image features are extracted as the system feedback. The unknown model parameters are then obtained by comparing the image features of the simulated image with those of the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and a branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulated patterns are sensitive to the model parameters. Moreover, the simulation framework can be extended to other types of biological pattern formation. PMID:28225811
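The feedback loop described above can be sketched with a Gray-Scott reaction-diffusion system standing in for the biological model (an assumption: the paper does not specify Gray-Scott, and the image feature, gain, and all parameter values below are invented for illustration):

```python
import numpy as np

def gray_scott(F, k, steps=2000, n=64, Du=0.16, Dv=0.08, seed=0):
    """Run a small Gray-Scott reaction-diffusion simulation with
    periodic boundaries and return the final v field."""
    rng = np.random.default_rng(seed)
    u = np.ones((n, n)); v = np.zeros((n, n))
    c = n // 2
    v[c-5:c+5, c-5:c+5] = 0.25 + 0.05 * rng.random((10, 10))  # seed patch
    u[c-5:c+5, c-5:c+5] = 0.5
    lap = lambda a: (np.roll(a, 1, 0) + np.roll(a, -1, 0)
                     + np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)
    for _ in range(steps):
        uvv = u * v * v
        u += Du * lap(u) - uvv + F * (1 - u)
        v += Dv * lap(v) + uvv - (F + k) * v
    return v

def tune_F(target_cover, F=0.030, k=0.062, iters=5, gain=0.05):
    """Visual-feedback loop (sketch): nudge the feed rate F so the
    pattern's coverage (fraction of pixels with v > 0.1, our stand-in
    image feature) approaches target_cover."""
    for _ in range(iters):
        cover = float((gray_scott(F, k) > 0.1).mean())
        F += gain * (target_cover - cover)  # proportional feedback on the feature
    return F, cover
```

The structure mirrors the framework in the abstract: simulate, extract an image feature, compare with the target, and adjust the unknown parameter, iterating until the feature matches.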

  11. Do Army Helicopter Training Simulators Need Motion Bases?

    OpenAIRE

    McCauley, Michael E.

    2006-01-01

    United States Army Research Institute for the Behavioral and Social Sciences This report reviews the arguments and the evidence regarding the need for simulator motion bases in training helicopter pilots. It discusses flight simulators, perceptual fidelity, history of motion bases, disturbance versus maneuver motion, human motion sensation, and reviews the empirical evidence for the training effectiveness of motion bases. The section on training effectiveness reviews research f...

  12. Haptic Feedback for the GPU-based Surgical Simulator

    DEFF Research Database (Denmark)

    Sørensen, Thomas Sangild; Mosegaard, Jesper

    2006-01-01

    The GPU has proven to be a powerful processor for computing spring-mass based surgical simulations. It has not previously been shown, however, how to effectively implement haptic interaction with a simulation running entirely on the GPU. This paper describes a method to calculate haptic feedback with limited performance cost. It allows easy balancing of the GPU workload between calculations of simulation, visualisation, and the haptic feedback.

  13. Do Army Helicopter Training Simulators Need Motion Bases

    Science.gov (United States)

    2006-02-01

    ...the Air Force Advanced Simulator for Pilot Training (ASPT). Pilots were trained in the simulator, with and without 6 DoF motion, in basic contact ... approach and landing, and aerobatic tasks. No information is available about the quality of the motion provided by the ASPT motion base. The authors ... in the T-37 aircraft. Hagin (1976) used the ASPT to expose Air Force pilot trainees to simulator training with and without 6 DoF motion. Flight tasks ...

  14. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  15. Simulation-based design using wavelets

    Science.gov (United States)

    Williams, John R.; Amaratunga, Kevin S.

    1994-03-01

    The design of large-scale systems requires methods of analysis which have the flexibility to provide a fast interactive simulation capability, while retaining the ability to provide high-order solution accuracy when required. This suggests that a hierarchical solution procedure is required that allows us to trade off accuracy for solution speed in a rational manner. In this paper, we examine the properties of the biorthogonal wavelets recently constructed by Dahlke and Weinreich and show how they can be used to implement a highly efficient multiscale solution procedure for solving a certain class of one-dimensional problems.

  16. CUDA-based real time surgery simulation.

    Science.gov (United States)

    Liu, Youquan; De, Suvranu

    2008-01-01

    In this paper we present a general software platform that enables real time surgery simulation on the newly available compute unified device architecture (CUDA) from NVIDIA. CUDA-enabled GPUs harness the power of 128 processors, which allows data-parallel computations. Compared to the previous GPGPU, it is significantly more flexible, with a C language interface. We report implementations of both collision detection and consequent deformation computation algorithms. Our test results indicate that CUDA enables a twenty times speedup for collision detection and about a fifteen times speedup for deformation computation on an Intel Core 2 Quad 2.66 GHz machine with a GeForce 8800 GTX.

  17. A Novel Software Simulator Model Based on Active Hybrid Architecture

    Directory of Open Access Journals (Sweden)

    Amr AbdElHamid

    2015-01-01

    Full Text Available Simulated training is an important issue for any type of mission, whether aerial, ground, sea, or even space. In this paper, a new flexible aerial simulator based on an active hybrid architecture is introduced. The simulator infrastructure is applicable to any type of training mission and research activity. This software-based simulator is tested on aerial missions to prove its applicability to time-critical systems. The proposed active hybrid architecture uses VB.NET and MATLAB in the same simulation loop. It exploits the remarkable computational power of MATLAB as a backbone aircraft model, and this mathematical model provides realistic dynamics to the trainee. Meanwhile, the Human-Machine Interface (HMI), mission planning, hardware interfacing, data logging, and MATLAB interfacing are developed using VB.NET. The proposed simulator is flexible enough to perform navigation and obstacle avoidance training missions. The active hybrid architecture is used during simulated training, and also through post-mission activities (like the generation of signal playback reports for evaluation purposes). The results show the ability of the proposed architecture to fulfill the aerial simulator demands and to provide a flexible infrastructure for different simulated mission requirements. Finally, a comparison with some existing simulators is presented.

  18. Nonstandard FDTD Simulation-Based Design of CROW Wavelength Splitters

    Directory of Open Access Journals (Sweden)

    Naoki Okada

    2011-01-01

    Full Text Available The finite-difference time-domain (FDTD) algorithm has been used in simulation-based designs of many optical devices, but it fails to reproduce high-Q whispering gallery modes (WGMs). On the other hand, the nonstandard (NS) FDTD algorithm can accurately compute WGMs and can be used to make simulation-based designs of WGM devices. Wavelength splitters using coupled resonator optical waveguides (CROWs) based on WGM couplings have recently attracted attention because they are potentially ultracompact. In this paper, we design a CROW wavelength splitter using NS FDTD simulations and demonstrate high interchannel extinction ratios of over 20 dB.
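For contrast with the NS-FDTD scheme discussed above, the standard 1-D FDTD update loop that it refines can be written in a few lines (a minimal sketch in normalized units with a Courant number of 0.5; grid size, step count, and source are illustrative assumptions):

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=400, src=50):
    """Minimal standard 1-D FDTD (Yee) leapfrog update with a soft
    sinusoidal source; returns the final E-field distribution."""
    ez = np.zeros(n_cells)
    hy = np.zeros(n_cells)
    for t in range(n_steps):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])      # half-step H update
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])       # half-step E update
        ez[src] += np.sin(2 * np.pi * t / 30.0)  # soft source injection
    return ez
```

The NS-FDTD algorithm keeps this leapfrog structure but replaces the finite-difference coefficients with nonstandard ones tuned to the wave equation, which is what lets it resolve high-Q whispering gallery modes that the standard coefficients smear out.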

  19. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  20. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for business process simulation models, reducing model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to other analytical model generation by separating the logic from the data.

  1. A simulation based engineering method to support HAZOP studies

    DEFF Research Database (Denmark)

    Enemark-Rasmussen, Rasmus; Cameron, David; Angelo, Per Bagge

    2012-01-01

    HAZOP is the most commonly used process hazard analysis tool in industry, a systematic yet tedious and time consuming method. The aim of this study is to explore the feasibility of process dynamic simulations to facilitate the HAZOP studies. We propose a simulation-based methodology to complement...

  2. Conceptual modeling for simulation-based serious gaming

    NARCIS (Netherlands)

    van der Zee, D.J.; Holkenborg, Bart; Robinson, Stewart

    2012-01-01

    In recent years many simulation-based serious games have been developed for supporting (future) managers in operations management decision making. They illustrate the high potential of using discrete event simulation for pedagogical purposes. Unfortunately, this potential does not seem to go together

  3. Simulation based virtual learning environment in medical genetics counseling

    DEFF Research Database (Denmark)

    Makransky, Guido; Bonde, Mads T.; Wulff, Julie S. G.

    2016-01-01

    that they would feel more confident counseling a patient after the simulation. CONCLUSIONS: The simulation based learning environment increased students' learning, intrinsic motivation, and self-efficacy (although the strength of these effects differed depending on their pre-test knowledge), and increased...

  4. Energy system simulation in performance-based building design

    NARCIS (Netherlands)

    Wilde, P.J.C.J. de; Augenbroe, G.; Voorden, M. van der

    2002-01-01

    This paper discusses the requirements and possible solutions for the use of building simulation tools as instrument to support performance-based building design decisions. Use of an existing simulation tool to support a specific building design decision (the selection of energy saving building compo

  5. Cavitation-based hydro-fracturing simulator

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jy-An John; Wang, Hong; Ren, Fei; Cox, Thomas S.

    2016-11-22

    An apparatus 300 for simulating a pulsed pressure induced cavitation technique (PPCT) from a pressurized working fluid (F) provides laboratory research and development for enhanced geothermal systems (EGS), oil, and gas wells. A pump 304 is configured to deliver a pressurized working fluid (F) to a control valve 306, which produces a pulsed pressure wave in a test chamber 308. The pulsed pressure wave parameters are defined by the pump 304 pressure and control valve 306 cycle rate. When a working fluid (F) and a rock specimen 312 are included in the apparatus, the pulsed pressure wave causes cavitation to occur at the surface of the specimen 312, thus initiating an extensive network of fracturing surfaces and micro fissures, which are examined by researchers.

  6. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. Then a numerical simulation of creep tests was conducted, and the results from the simulation were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulations of asphalt mastics.
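The analytical creep solution that such a simulation is checked against can be written down directly (a sketch; the symbols follow the usual four-parameter Burgers convention, not any notation from the paper):

```python
import math

def burgers_creep_strain(sigma0, t, E1, eta1, E2, eta2):
    """Analytical creep strain of a Burgers (four-parameter) model
    under constant stress sigma0: a Maxwell element (spring E1,
    dashpot eta1) in series with a Kelvin-Voigt element (E2, eta2)."""
    return sigma0 * (1.0 / E1                               # instantaneous elastic
                     + t / eta1                             # steady viscous flow
                     + (1.0 - math.exp(-E2 * t / eta2)) / E2)  # delayed elastic
```

At t = 0 the strain is the purely elastic sigma0/E1, and at long times the Kelvin-Voigt term saturates so the creep curve approaches a straight line of slope sigma0/eta1, which is the behavior a discrete element creep simulation should reproduce.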

  7. Template-Based Geometric Simulation of Flexible Frameworks

    Science.gov (United States)

    Wells, Stephen A.; Sartbaeva, Asel

    2012-01-01

    Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055

  8. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  9. Use of agent based simulation for traffic safety assessment

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2008-07-01

    Full Text Available This paper describes the development of an agent-based Computational Building Simulation (CBS) tool, termed KRONOS, that is being used to work on advanced research questions such as traffic safety assessment and user behaviour in buildings...

  10. 5KVA POWER INVERTER DESIGN AND SIMULATION BASED ...

    African Journals Online (AJOL)

    DR. AMINU

    A five (5) kVA power inverter was designed and simulated based on two topologies: Boost ... stages were plotted, and the results show a significant increase in the voltage and duty cycle. ... also provides the industries with effective methods to ...

  11. Knowledge-based simulation using object-oriented programming

    Science.gov (United States)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.

  12. Runoff Simulation of Shitoukoumen Reservoir Basin Based on SWAT Model

    Institute of Scientific and Technical Information of China (English)

    XIE; Miao; LI; Hong-yan; LIU; Tie-juan; RU; Shi-rong

    2012-01-01

    [Objective] The study aimed to simulate the runoff of the Shitoukoumen Reservoir basin using the SWAT model. [Method] Based on DEM elevation, land use type, soil type and hydrometeorological data, the SWAT model, a distributed hydrological model, was established to simulate the monthly runoff of the Shitoukoumen Reservoir basin, with the years 2006 and 2010 chosen as the calibration and validation periods respectively. [Result] The simulation results indicated that the SWAT model could be used to simulate the runoff of the Shitoukoumen Reservoir basin, and the simulation performance was good. However, the response of the model to local rainstorms was not obvious, so the actual runoff in June and July of 2010 was abnormally higher than the simulated value. [Conclusion] The research could provide theoretical references for the planning and management of water resources in the Shitoukoumen Reservoir basin in the future.
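Calibration and validation of monthly runoff simulations like this are commonly scored with the Nash-Sutcliffe efficiency (an assumption: the abstract does not name its goodness-of-fit criterion, and NSE is shown here only as a typical choice):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; values <= 0 mean the model predicts no
    better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot
```

A rainstorm month the model misses, like the June-July 2010 anomaly in the abstract, contributes a large squared error and pulls the efficiency down even when the rest of the year fits well.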

  13. CFOA-Based Lossless and Lossy Inductance Simulators

    Directory of Open Access Journals (Sweden)

    F. Kaçar

    2011-09-01

    Full Text Available An inductance simulator is a useful component in circuit synthesis theory, especially for analog signal processing applications such as filters, chaotic oscillator design, analog phase shifters and the cancellation of parasitic elements. In this study, four new inductance simulator topologies employing a single current feedback operational amplifier are presented. The presented topologies require few passive components. The first topology is intended for negative inductance simulation, the second topology is for lossy series inductance, the third one is for lossy parallel inductance and the fourth topology is for negative parallel (-R)(-L)(-C) simulation. The performance of the proposed CFOA-based inductance simulators is demonstrated on both a second-order low-pass filter and an inductance cancellation circuit. PSPICE simulations are given to verify the theoretical analysis.
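As a back-of-the-envelope check of what such a circuit provides, a gyrator-style simulator realizes an equivalent inductance of roughly L_eq = R1·R2·C (an assumed generic relation; the exact expression depends on the particular CFOA topology and is not taken from the paper):

```python
import math

def simulated_inductance(R1, R2, C):
    """Equivalent inductance of a gyrator-style simulator, using the
    assumed generic relation L_eq = R1 * R2 * C (topology-dependent)."""
    return R1 * R2 * C

def lc_resonance(L, C):
    """Resonant frequency of the LC section built from the simulated
    inductance, f0 = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))
```

With R1 = R2 = 1 kΩ and C = 10 nF this gives a 10 mH equivalent inductance, letting a second-order low-pass filter be built without a physical coil, which is the typical use case demonstrated in such papers.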

  14. Simulation of Metal Transfer in GMAW Based on FLUENT

    Institute of Scientific and Technical Information of China (English)

    Xueping DING; Huan LI; Lijun YANG; Ying GAO

    2013-01-01

    A new numerical approach is presented to simulate the dynamic process of metal transfer. The process of metal transfer in gas metal arc welding is simulated based on FLUENT. A two-dimensional axisymmetric numerical model is developed using the volume of fluid method, and the distributions of physical quantities in the droplet, including pressure, current density and electric potential, are investigated. To improve the accuracy of the simulated results and reduce the effect of the uncertain surface tension coefficient, the relationship between the welding current and the surface tension coefficient is modified by regression analysis. Meanwhile, to test the accuracy of the simulated results, welding experiments are performed and a high-speed photography system is used to record the real process of metal transfer. The results show that the simulated results are in reasonably good agreement with the experimental ones.

  15. Design for the simulation of space based information network

    Institute of Scientific and Technical Information of China (English)

    Zeng Bin; Li Zitang; Wang Wei

    2006-01-01

    Ongoing research is described that is focused upon modelling the space based information network and simulating its behaviour: the simulation of space based communications and networking project. Its objective is to demonstrate the feasibility of producing a tool that can provide a performance evaluation of various constellation access techniques and routing policies. The architecture and design of the simulation system are explored. The algorithms for data routing and instrument scheduling in this project are described. In addition, the key methodologies for simulating inter-satellite link features in data transmission are discussed. The performance of both the instrument scheduling algorithm and the routing schemes is evaluated and analyzed through extensive simulations under a typical scenario.

  16. An RTM based Distributed Simulation System for Guide Robot

    Directory of Open Access Journals (Sweden)

    Chen Peihua

    2013-10-01

    Full Text Available In order to enhance robot system integration and development for a guide robot, a distributed simulation system was developed in this study using RTM (Robot Technology Middleware), an open software platform for robot systems. An RT (robot technology) system consisting of an adapter, a controller and the robot, together with other CORBA objects, was developed to connect the graphical programming interface with a 3D simulator to set up an RTM-based distributed simulation system. The application of the distributed simulation system also confirms control of the real robot via the RT system. The proposed distributed simulation system based on RTM can significantly accelerate software component development as well as system integration for guide robots, which will lower the cost of developing new robot application systems.

  17. Nonstationary multiscale turbulence simulation based on local PCA.

    Science.gov (United States)

    Beghi, Alessandro; Cenedese, Angelo; Masiero, Andrea

    2014-09-01

    Turbulence simulation methods are of fundamental importance for evaluating the performance of control strategies for Adaptive Optics (AO) systems. In order to obtain a reliable evaluation of the performance, a statistically accurate turbulence simulation method has to be used. This work generalizes a previously proposed method for turbulence simulation based on a multiscale stochastic model. The main contributions of this work are as follows. First, a multiresolution local PCA representation is considered; in typical operating conditions, this PCA representation reduces the computational load of turbulence simulation by approximately a factor of 4 with respect to the previously proposed method. Second, thanks to a different low-resolution method based on a moving average model, the wind velocity can be in any direction (not necessarily that of the spatial axes). Finally, this paper extends the simulation procedure to generate, if needed, turbulence samples using a more general model than the frozen flow hypothesis.
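The core idea of drawing turbulence samples from a truncated PCA basis can be sketched in one dimension (a toy: the squared-exponential covariance below stands in for a real turbulence covariance such as von Karman, and every parameter is illustrative, not from the paper):

```python
import numpy as np

def pca_turbulence_samples(n=32, length_scale=8.0, n_modes=20,
                           n_samples=5, seed=1):
    """Draw random 1-D 'phase screen' samples from the top principal
    components of an assumed squared-exponential spatial covariance."""
    x = np.arange(n)
    cov = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / length_scale**2)
    vals, vecs = np.linalg.eigh(cov)                 # ascending eigenvalues
    vals = vals[::-1][:n_modes]                      # keep the top modes
    vecs = vecs[:, ::-1][:, :n_modes]
    rng = np.random.default_rng(seed)
    coeffs = rng.standard_normal((n_samples, n_modes)) * np.sqrt(
        np.clip(vals, 0.0, None))
    return coeffs @ vecs.T                           # (n_samples, n) screens
```

Because smooth covariances have rapidly decaying spectra, a small number of modes captures almost all of the variance, which is the same effect the paper's local PCA representation exploits to cut the computational load.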

  18. GPS SATELLITE SIMULATOR SIGNAL ESTIMATION BASED ON ANN

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Multi-channel Global Positioning System (GPS) satellite signal simulators are used to provide realistic test signals for GPS receivers and navigation systems. In this paper, the signals arriving at the antenna of a GPS receiver are analyzed from the viewpoint of simulator design. The focus is on estimation methods for signal parameters that are difficult to determine directly from existing empirical models due to various error factors. Based on the theory of Artificial Neural Networks (ANN), an approach is proposed to simulate signal propagation delay, carrier phase, power, and other parameters using an ANN. The architecture of the hardware-in-the-loop test system is given, and the ANN training and validation process is described. Experimental results demonstrate that the designed ANN can statistically simulate sample data with high fidelity. Therefore the computation of signal state based on this ANN can meet the design requirement, and can be directly applied to the development of multi-channel GPS satellite signal simulators.

  19. Simulator for beam-based LHC collimator alignment

    Science.gov (United States)

    Valentino, Gianluca; Aßmann, Ralph; Redaelli, Stefano; Sammut, Nicholas

    2014-02-01

    In the CERN Large Hadron Collider, collimators need to be set up to form a multistage hierarchy to ensure efficient multiturn cleaning of halo particles. Automatic algorithms were introduced during the first run to reduce the beam time required for beam-based setup, improve the alignment accuracy, and reduce the risk of human errors. Simulating the alignment procedure would allow for off-line tests of alignment policies and algorithms. A simulator was developed based on a diffusion beam model to generate the characteristic beam loss signal spike and decay produced when a collimator jaw touches the beam, which is observed in a beam loss monitor (BLM). Empirical models derived from the available measurement data are used to simulate the steady-state beam loss and crosstalk between multiple BLMs. The simulator design is presented, together with simulation results and comparison to measurement data.
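
    The characteristic BLM signal described above — a spike when the jaw touches the halo, decaying toward a steady-state loss level — can be sketched with a simple exponential model. The amplitudes and time constants below are invented for illustration; the paper derives its empirical models from measurement data.

```python
import math

# Toy beam-loss signal: background before the touch, then a spike that
# decays exponentially toward a steady-state loss level.
def blm_signal(t, t_touch, spike=100.0, tau=0.5, steady=2.0, background=0.1):
    """Beam loss rate at time t (arbitrary units) for a jaw touch at t_touch."""
    if t < t_touch:
        return background
    return steady + (spike - steady) * math.exp(-(t - t_touch) / tau)

samples = [blm_signal(0.1 * k, t_touch=1.0) for k in range(60)]
peak = max(samples)
print(round(peak, 1))          # spike amplitude right at the touch
print(round(samples[-1], 1))   # close to the steady-state level
```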

  20. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

    The paper describes a method of improving the performance of a filling line based on simulation. This study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
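
    The NPV and ROI figures mentioned above follow standard formulas. A minimal sketch, with invented cash flows and discount rate (the paper's actual figures include depreciation, CIT and inflation, which are omitted here):

```python
# Net present value and return on investment for an improvement scenario.
def npv(rate, cashflows):
    """NPV of cashflows, where cashflows[0] occurs at t = 0 (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def roi(gain, cost):
    """Simple return on investment: net gain relative to cost."""
    return (gain - cost) / cost

investment = 50_000.0                # hypothetical upfront cost of the upgrade
annual_savings = [20_000.0] * 4      # hypothetical extra profit, 4-year horizon
flows = [-investment] + annual_savings

print(round(npv(0.08, flows), 2))    # positive NPV at an 8% discount rate
print(round(roi(sum(annual_savings), investment), 2))
```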

  1. MAIA: a framework for developing agent-based social simulations

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dignum, Virginia; Bots, Pieter; Dijkema, Gerhard

    2013-01-01

    In this paper we introduce and motivate a conceptualization framework for agent-based social simulation, MAIA: Modelling Agent systems based on Institutional Analysis. The MAIA framework is based on Ostrom's Institutional Analysis and Development framework, and provides an extensive set of modelling

  2. Simulation-based learning: Just like the real thing

    Directory of Open Access Journals (Sweden)

    Lateef Fatimah

    2010-01-01

    Full Text Available Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals' knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice till one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors.

  3. Simulation-based learning: Just like the real thing.

    Science.gov (United States)

    Lateef, Fatimah

    2010-10-01

    Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals' knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice till one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors.

  4. Interfacing MCNPX and McStas for simulation of neutron transport

    OpenAIRE

    Klinkby, Esben Bryndt; Lauritzen, Bent; Nonbøl, Erik; Willendrup, Peter Kjær; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.

    2013-01-01

    Simulations of target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX[1] or FLUKA[2, 3] whereas simulations of neutron transport from the moderator and the instrument response are performed by neutron ray tracing codes such as McStas[4, 5, 6, 7]. The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limit...

  5. Investigating Output Accuracy for a Discrete Event Simulation Model and an Agent Based Simulation Model

    CERN Document Server

    Majid, Mazlina Abdul; Siebers, Peer-Olaf

    2010-01-01

    In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store's fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.

  6. Constraint-based soft tissue simulation for virtual surgical training.

    Science.gov (United States)

    Tang, Wen; Wan, Tao Ruan

    2014-11-01

    Most surgical simulators employ a linear elastic model to simulate soft tissue material properties due to its computational efficiency and simplicity. However, soft tissues often have elaborate nonlinear material characteristics. Most prominently, soft tissues are soft and compliant under small strains, but after initial deformations they are very resistant to further deformation even under large forces. Such material behaviour, referred to as nonlinear incompliance, is computationally expensive and numerically difficult to simulate. This paper presents a constraint-based finite-element algorithm to simulate nonlinear incompliant tissue materials efficiently for interactive simulation applications such as virtual surgery. Firstly, the proposed algorithm models the material stiffness behavior of soft tissues with a set of 3-D strain-limit constraints on deformation strain tensors. By enforcing a large number of geometric constraints to achieve the material stiffness, the algorithm reduces the task of solving stiff equations of motion with a general numerical solver to iteratively resolving a set of constraints with a nonlinear Gauss-Seidel iterative process. Secondly, as a Gauss-Seidel method processes constraints individually, in order to speed up the global convergence of the large constrained system, a multiresolution hierarchy structure is also used to accelerate the computation significantly, making interactive simulations possible at a high level of detail. Finally, this paper also presents a simple-to-build data acquisition system to validate simulation results with ex vivo tissue measurements. An interactive virtual reality-based simulation system is also demonstrated.
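
    The strain-limiting idea can be sketched in one dimension: instead of integrating stiff springs, each edge of a particle chain is constrained to stretch at most 10%, and the constraints are resolved one at a time with a nonlinear Gauss-Seidel sweep. The chain, rest length and stretch limit below are invented; the paper works with 3-D strain tensors on a finite-element mesh, not with this toy distance constraint.

```python
REST, LIMIT = 1.0, 1.10    # rest length and maximum allowed stretch ratio

def gauss_seidel_strain_limit(x, iters=50):
    """Project chain positions so no edge stretches beyond REST * LIMIT."""
    x = list(x)
    for _ in range(iters):
        for i in range(len(x) - 1):            # one constraint at a time
            d = x[i + 1] - x[i]
            if d > REST * LIMIT:               # constraint violated
                corr = (d - REST * LIMIT) / 2  # move both endpoints equally
                x[i] += corr
                x[i + 1] -= corr
    return x

# A chain stretched to 1.5 per edge relaxes until every edge is <= 1.1.
chain = [0.0, 1.5, 3.0, 4.5]
out = gauss_seidel_strain_limit(chain)
print([round(v, 3) for v in out])
print(max(b - a for a, b in zip(out, out[1:])))
```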

  7. Shielding evaluation of neutron generator hall by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Pujala, U.; Selvakumaran, T.S.; Baskaran, R.; Venkatraman, B. [Radiological Safety Division, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Thilagam, L.; Mohapatra, D.K., E-mail: swathythila2@yahoo.com [Safety Research Institute, Atomic Energy Regulatory Board, Kalpakkam (India)

    2017-04-01

    A shielded hall was constructed for accommodating a D-D, D-T or D-Be based pulsed neutron generator (NG) with a 4π yield of 10{sup 9} n/s. The neutron shield design of the facility was optimized using the NCRP-51 methodology such that the total dose rates outside the hall are well below the regulatory limit for the full occupancy criterion (1 μSv/h). However, the total dose rates at the roof top, the cooling room trench exit and the labyrinth exit were found to be above this limit for the optimized design. Hence, additional neutron shielding arrangements were proposed for the cooling room trench and labyrinth exits, and the roof top was made inaccessible. The present study is an attempt to evaluate the neutron and associated capture gamma transport through the bulk shields for the complete geometry and materials of the NG-Hall using the Monte Carlo (MC) codes MCNP and FLUKA. The neutron source terms of the D-D, D-T and D-Be reactions are considered in the simulations. The effect of the additional shielding has been demonstrated through simulations carried out with the additional shielding for the D-Be neutron source term. The results of the MC simulations using the two different codes are found to be consistent with each other for the neutron dose rate estimates; however, deviations of up to 28% are noted between the two codes at a few locations for the capture gamma dose rate estimates. Overall, the dose rates estimated by the MC simulations including the additional shields show that all locations surrounding the hall satisfy the full occupancy criterion for all three types of sources. Additionally, the dose rates due to direct transmission of primary neutrons estimated by FLUKA are compared with values calculated using the formula given in NCRP-51, which deviate from each other by up to 50%. The details of the MC simulations and the NCRP-51 methodology for the estimation of the primary neutron dose rate, along with the results, are presented in this paper. (author)

  8. Decision Manifold Approximation for Physics-Based Simulations

    Science.gov (United States)

    Wong, Jay Ming; Samareh, Jamshid A.

    2016-01-01

    With the recent surge of success in big-data driven deep learning problems, many of these frameworks focus on the notion of architecture design and utilizing massive databases. However, in some scenarios massive sets of data may be difficult, and in some cases infeasible, to acquire. In this paper we discuss a trajectory-based framework that quickly learns the underlying decision manifold of binary simulation classifications while judiciously selecting exploratory target states to minimize the number of required simulations. Furthermore, we draw particular attention to the simulation prediction application idealized to the case where failures in simulations can be predicted and avoided, providing machine intelligence to novice analysts. We demonstrate this framework in various forms of simulations and discuss its efficacy.

  9. Moral imagination in simulation-based communication skills training.

    Science.gov (United States)

    Chen, Ruth P

    2011-01-01

    Clinical simulation is used in nursing education and in other health professional programs to prepare students for future clinical practice. Simulation can be used to teach students communication skills and how to deliver bad news to patients and families. However, skilled communication in clinical practice requires students to move beyond simply learning superficial communication techniques and behaviors. This article presents an unexplored concept in the simulation literature: the exercise of moral imagination by the health professional student. Drawing from the works of Hume, Aristotle and Gadamer, a conceptualization of moral imagination is first provided. Next, this article argues that students must exercise moral imagination on two levels: towards the direct communication exchange before them; and to the representative nature of simulation encounters. Last, the limits of moral imagination in simulation-based education are discussed.

  10. Colour based sorting station with Matlab simulation

    Directory of Open Access Journals (Sweden)

    Constantin Victor

    2017-01-01

    Full Text Available The paper presents the design process and manufacturing elements of a colour-based sorting station. The system is comprised of a gravitational storage, which also contains the colour sensor. Parts are extracted using a linear pneumatic motor and are fed onto an electrically driven conveyor belt. Extraction of the parts is done at 4 points, using two pneumatic motors and a geared DC motor, while the 4th position is at the end of the belt. The mechanical parts of the system are manufactured using 3D printer technology, allowing for easy modification and adaption to the geometry of different parts. The paper shows all of the stages needed to design, optimize, test and implement the proposed solution. System optimization was performed using a graphical Matlab interface which also allows for sorting algorithm optimization.

  11. SIDH: A Game-Based Architecture for a Training Simulator

    Directory of Open Access Journals (Sweden)

    P. Backlund

    2009-01-01

    Full Text Available Game-based simulators, sometimes referred to as “lightweight” simulators, have benefits such as flexible technology and economic feasibility. In this article, we extend the notion of a game-based simulator by introducing multiple screen view and physical interaction. These features are expected to enhance immersion and fidelity. By utilizing these concepts we have constructed a training simulator for breathing apparatus entry. Game hardware and software have been used to produce the application. More important, the application itself is deliberately designed to be a game. Indeed, one important design goal is to create an entertaining and motivating experience combined with learning goals in order to create a serious game. The system has been evaluated in cooperation with the Swedish Rescue Services Agency to see which architectural features contribute to perceived fidelity. The modes of visualization and interaction as well as level design contribute to the usefulness of the system.

  12. Province Based Design and Simulation of Indonesian Education Grid Topology

    Directory of Open Access Journals (Sweden)

    Heru Suhartanto

    2012-01-01

    Full Text Available This paper discusses the design and simulation of an e-learning computer network topology, based on Grid computing technology, for Indonesian schools, called the Indonesian Education Grid (IndoEdu-Grid). Grid technology is proposed to solve infrastructure problems faced by the Indonesian ICT Network (Jardiknas). In a previous study, we designed topologies based on two scenarios: a region-based and an island-based topology. Each scenario was run in the simulator using two packet scheduling algorithms: FIFO (First In First Out) and SCFQ (Self-Clocked Fair Queuing). In this paper we propose a different scenario, based on provinces. The simulation treatments are the same as in the two previous scenarios. The simulation results showed that when using the FIFO algorithm, the province-based scenario has the best performance compared to the region-based and island-based ones. However, this scenario is not competitive with the others when using the SCFQ algorithm, which is due to higher packet lifetimes.
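
    The difference between the two schedulers is easy to see on a toy trace. SCFQ proper tracks virtual finish times; the sketch below uses plain round-robin as a stand-in for the fair-queuing behaviour (for equal-size, equal-weight packets the service order coincides), with invented packet arrivals.

```python
from collections import deque

# Two flows share one output link: FIFO serves packets in arrival order,
# while a fair-queuing style scheduler interleaves the flows.
def fifo(packets):
    q = deque(packets)                # (flow_id, packet_no) in arrival order
    return [q.popleft() for _ in range(len(packets))]

def fair(packets):
    flows = {}
    for p in packets:                 # split arrivals into per-flow queues
        flows.setdefault(p[0], deque()).append(p)
    order, out = deque(flows), []
    while order:                      # round-robin: one packet per flow per round
        f = order.popleft()
        out.append(flows[f].popleft())
        if flows[f]:
            order.append(f)
    return out

# Flow A bursts 3 packets before flow B's first packet arrives.
arrivals = [("A", 1), ("A", 2), ("A", 3), ("B", 1), ("B", 2)]
print(fifo(arrivals))    # B waits behind the whole A burst
print(fair(arrivals))    # B's first packet is served second
```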

  13. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple steps of scattering into a single-step process through random table querying, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast algorithm for the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. Also, we present a reconstruction approach to estimate the position of the fluorescent source, based on trial-and-error theory, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
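
    The general mechanism behind table querying can be sketched with an inverse-CDF lookup table: tabulate the cumulative distribution once, then draw samples by binary search instead of evaluating the physics per step. The exponential step-length distribution, grid and mean free path below are invented stand-ins, not the paper's tables.

```python
import bisect
import math
import random

def build_table(pdf, xs):
    """Tabulate the CDF of pdf over grid xs (trapezoid rule), normalized to 1."""
    cdf = [0.0]
    for a, b in zip(xs, xs[1:]):
        cdf.append(cdf[-1] + 0.5 * (pdf(a) + pdf(b)) * (b - a))
    total = cdf[-1]
    return [c / total for c in cdf]

def sample(xs, cdf, rng):
    """Draw one value: uniform variate, binary search of the tabulated CDF."""
    u = rng.random()
    i = bisect.bisect_left(cdf, u)
    return xs[min(i, len(xs) - 1)]

rng = random.Random(1)
mfp = 2.0                                      # mean free path, arbitrary units
xs = [i * 0.01 for i in range(2001)]           # grid from 0 to 20 (10 mean free paths)
cdf = build_table(lambda x: math.exp(-x / mfp) / mfp, xs)
samples = [sample(xs, cdf, rng) for _ in range(20000)]
print(round(sum(samples) / len(samples), 2))   # sample mean close to mfp
```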

  14. Microbial Growth Modeling and Simulation Based on Cellular Automata

    Directory of Open Access Journals (Sweden)

    Hong Men

    2013-07-01

    Full Text Available In order to simulate the micro-evolutionary process of microbial growth, [Methods] this study adopts a two-dimensional cellular automaton as the growth space. Based on the evolutionary mechanism of microbes and cell-cell interactions, we adopt the Moore neighborhood and define the transition rules, and on this basis construct the microbial growth model. [Results] The model can describe the relationships among cell growth, division and death, and can effectively reflect the spatial inhibition effect and the substrate limitation effect. [Conclusions] The simulation results show that the CA model is not only consistent with the classic microbial kinetic model, but is also able to simulate microbial growth and evolution.
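
    A toy version of such a CA can be written in a few lines: cells on a 2-D torus divide into an empty Moore neighbour while substrate lasts, and die when fully crowded. The grid size, division probability and substrate budget below are invented for illustration and are not the paper's transition rules.

```python
import random

N = 40
EMPTY, CELL = 0, 1

def moore(i, j):
    """The 8 Moore neighbours of (i, j) on an N x N torus."""
    return [((i + di) % N, (j + dj) % N)
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def step(grid, substrate, rng):
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] != CELL:
                continue
            empties = [(a, b) for a, b in moore(i, j) if grid[a][b] == EMPTY]
            if not empties:                      # spatial inhibition: crowded cells die
                new[i][j] = EMPTY
            elif substrate[i][j] > 0 and rng.random() < 0.5:
                a, b = rng.choice(empties)       # division into a free neighbour
                new[a][b] = CELL
                substrate[i][j] -= 1             # substrate limitation at the site
    return new

rng = random.Random(0)
grid = [[EMPTY] * N for _ in range(N)]
grid[N // 2][N // 2] = CELL                      # single founder cell
substrate = [[5] * N for _ in range(N)]
for _ in range(30):
    grid = step(grid, substrate, rng)
print(sum(map(sum, grid)))                       # colony size after 30 steps
```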

  15. A simulation based engineering method to support HAZOP studies

    DEFF Research Database (Denmark)

    Enemark-Rasmussen, Rasmus; Cameron, David; Angelo, Per Bagge

    2012-01-01

    HAZOP is the most commonly used process hazard analysis tool in industry, a systematic yet tedious and time consuming method. The aim of this study is to explore the feasibility of process dynamic simulations to facilitate the HAZOP studies. We propose a simulation-based methodology to complement...... the conventional HAZOP procedure. The method systematically generates failure scenarios by considering process equipment deviations with pre-defined failure modes. The effect of failure scenarios is then evaluated using dynamic simulations -in this study the K-Spice® software used. The consequences of each failure...... model as case study....

  16. Behaviour and Perception-based Pedestrian Evacuation Simulation

    CERN Document Server

    Kretz, Tobias; Muehlberger, Andreas

    2010-01-01

    This contribution reports on the research project SKRIBT and some of its results. An evacuation simulation based on VISSIM's pedestrian dynamics simulation was developed, that -- with high time resolution -- integrates results from studies on behavior in stress and crisis situations, results from CFD models for e.g. fire dynamics simulations, and considers visibility of signage and -- adding a psychological model -- its cognition. A crucial issue is the cognition of smoke or fire by the occupant and his / her resulting spontaneous or deliberate reaction to this episode.

  17. A Matlab—Based Simulation for Hybrid Electric Motorcycle

    Institute of Scientific and Technical Information of China (English)

    邵定国; 李永斌; 汪信尧; 江建中

    2003-01-01

    This paper presents a simulation and modeling package based on Matlab for a parallel hybrid electric motorcycle (HEM). The package consists of several detailed main models: internal combustion engine (ICE), motor, continuously variable transmission (CVT), battery, energy management system (EMS), etc. Each component is built as a library and can be connected together according to the parallel HEM's topology. Simulation results, such as ICE power demand, motor power demand, battery instantaneous state-of-charge (SOC) and pollution emissions, are given and discussed. Lastly, experimental data verify the simulation results.

  18. Tutorial on Agent-based Modeling and Simulation

    Science.gov (United States)

    2007-06-01

    [Abstract not extracted cleanly from this record; only metadata is recoverable. Authors: Michael J. North and Charles M. Macal, Center for Complex Adaptive Agent Systems Simulation (CAS2), Argonne National Laboratory, 9700 S. Cass Avenue. Cited works include Epstein JM, Axtell R. 1996. Growing Artificial Societies.]

  19. Microprocessor-based simulator of surface ECG signals

    Energy Technology Data Exchange (ETDEWEB)

    Martínez, A E; Rossi, E; Nicola Siri, L [Cátedra de Bioingeniería II, Facultad de Ingeniería, Universidad Nacional de Entre Ríos (FI-UNER), Ruta Provincial 11 Km. 10, Oro Verde (Dpto. Paraná), Entre Ríos (Argentina)]

    2007-11-15

    In this work, a simulator of surface-recorded electrocardiogram (ECG) signals is presented. The device, based on a microcontroller and commanded by a personal computer, produces an analog signal resembling actual ECGs, not only in time course and voltage levels but also in source impedance. The simulator is a useful tool for electrocardiograph calibration and monitoring; it can also be incorporated into educational tasks and used in clinical environments for early detection of faulty equipment behaviour.

  20. Reprint of: Simulation Platform: a cloud-based online simulation environment.

    Science.gov (United States)

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-11-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have solely been designed to archive model files, but databases should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software is pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages.

  1. A java based simulator with user interface to simulate ventilated patients

    Directory of Open Access Journals (Sweden)

    Stehle P.

    2015-09-01

    Full Text Available Mechanical ventilation is a life-saving intervention which, despite its use on a routine basis, poses the risk of inflicting further damage to the lung tissue if ventilator settings are chosen inappropriately. Medical decision support systems may help to prevent such injuries while providing the optimal settings to reach a defined clinical goal. In order to develop and verify decision support algorithms, a test bench simulating a patient's behaviour is needed. We propose a Java-based system that allows simulation of the respiratory mechanics, gas exchange and cardiovascular dynamics of a mechanically ventilated patient. The implemented models are allowed to interact and are interchangeable, enabling the simulation of various clinical scenarios. Model simulations run in real time and show physiologically plausible results.

  2. Biomolecular Simulation of Base Excision Repair and Protein Signaling

    Energy Technology Data Exchange (ETDEWEB)

    Straatsma, TP; McCammon, J A; Miller, John H; Smith, Paul E; Vorpagel, Erich R; Wong, Chung F; Zacharias, Martin W

    2006-03-03

    The goal of the Biomolecular Simulation of Base Excision Repair and Protein Signaling project is to enhance our understanding of the mechanism of human polymerase-β, one of the key enzymes in base excision repair (BER), and of the cell-signaling enzyme cyclic-AMP-dependent protein kinase. This work used molecular modeling and simulation studies to focus specifically on the:
    • dynamics of DNA and damaged DNA
    • dynamics and energetics of base flipping in DNA
    • mechanism and fidelity of nucleotide insertion by the BER enzyme human polymerase-β
    • mechanism and inhibitor design for cyclic-AMP-dependent protein kinase.
    Molecular dynamics simulations and electronic structure calculations have been performed using the computer resources at the Molecular Science Computing Facility at the Environmental Molecular Sciences Laboratory.

  3. Identifying content for simulation-based curricula in urology

    DEFF Research Database (Denmark)

    Nayahangan, Leizl Joy; Bølling Hansen, Rikke; Lindorff-Larsen, Karen

    2017-01-01

    OBJECTIVE: Simulation-based training is well recognized in the transforming field of urological surgery; however, integration into the curriculum is often unstructured. Development of simulation-based curricula should follow a stepwise approach starting with a needs assessment. This study aimed to identify technical procedures in urology that should be included in a simulation-based curriculum for residency training. MATERIALS AND METHODS: A national needs assessment was performed using the Delphi method involving 56 experts with significant roles in the education of urologists. Round 1 identified technical procedures that newly qualified urologists should perform. Round 2 included a survey using an established needs assessment formula to explore: the frequency of procedures; the number of physicians who should be able to perform the procedure; the risk and/or discomfort to patients when a procedure...

  4. Meshless thin-shell simulation based on global conformal parameterization.

    Science.gov (United States)

    Guo, Xiaohu; Li, Xin; Bao, Yunfan; Gu, Xianfeng; Qin, Hong

    2006-01-01

    This paper presents a new approach to the physically-based thin-shell simulation of point-sampled geometry via explicit, global conformal point-surface parameterization and meshless dynamics. The point-based global parameterization is founded upon the rigorous mathematics of Riemann surface theory and Hodge theory. The parameterization is globally conformal everywhere except for a minimum number of zero points. Within our parameterization framework, any well-sampled point surface is functionally equivalent to a manifold, enabling popular and powerful surface-based modeling and physically-based simulation tools to be readily adapted for point geometry processing and animation. In addition, we propose a meshless surface computational paradigm in which the partial differential equations (for dynamic physical simulation) can be applied and solved directly over point samples via Moving Least Squares (MLS) shape functions defined on the global parametric domain without explicit connectivity information. The global conformal parameterization provides a common domain to facilitate accurate meshless simulation and efficient discontinuity modeling for complex branching cracks. Through our experiments on thin-shell elastic deformation and fracture simulation, we demonstrate that our integrative method is very natural, and that it has great potential to further broaden the application scope of point-sampled geometry in graphics and relevant fields.
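
    The Moving Least Squares shape functions mentioned above can be illustrated in one dimension: around each evaluation point, a local linear polynomial is fitted with Gaussian weights, and a key property is that linear fields are reproduced exactly. The sample points and kernel width are invented; the paper's shape functions are defined over a 2-D global parametric domain, not this 1-D toy.

```python
import math

def mls_eval(x, pts, vals, h=0.5):
    """Locally weighted linear least-squares fit, evaluated at x."""
    w = [math.exp(-((x - p) / h) ** 2) for p in pts]   # Gaussian weights
    # Normal equations for the local model a + b*(p - x); a is the value at x.
    s0 = sum(w)
    s1 = sum(wi * (p - x) for wi, p in zip(w, pts))
    s2 = sum(wi * (p - x) ** 2 for wi, p in zip(w, pts))
    t0 = sum(wi * v for wi, v in zip(w, vals))
    t1 = sum(wi * v * (p - x) for wi, v, p in zip(w, vals, pts))
    det = s0 * s2 - s1 * s1
    return (t0 * s2 - t1 * s1) / det

pts = [0.0, 0.25, 0.5, 0.75, 1.0]
vals = [2.0 * p + 1.0 for p in pts]     # samples of the linear field 2x + 1
# Linear MLS reproduces linear fields exactly, even between sample points.
print(round(mls_eval(0.3, pts, vals), 6))
print(round(mls_eval(0.5, pts, vals), 6))
```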

  5. Cloth Simulation Based Motion Capture of Dressed Humans

    Science.gov (United States)

    Hasler, Nils; Rosenhahn, Bodo; Seidel, Hans-Peter

    Commonly, marker based as well as markerless motion capture systems assume that the tracked person is wearing tightly fitting clothes. Unfortunately, this restriction cannot be satisfied in many situations and most preexisting video data does not adhere to it either. In this work we propose a graphics based vision approach for tracking humans markerlessly without making this assumption. Instead, a physically based simulation of the clothing the tracked person is wearing is used to guide the tracking algorithm.

  6. Research of Simulation in Character Animation Based on Physics Engine

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-01-01

    Full Text Available Computer 3D character animation is essentially a product of computer graphics combined with robotics, physics, mathematics, and the arts, built on rapidly developing computer hardware and graphics algorithms. At present, mainstream character animation relies either on manual keyframe production or on frames captured with motion capture devices. 3D character animation is widely used not only in commercial areas such as film and animation production but also in virtual reality, computer-aided education, flight simulation, engineering simulation, military simulation, and other fields. In this paper, we study physics-based character animation to address problems such as poor real-time interaction, low reuse of motion data, and complex production. The paper examines kinematics, dynamics, and motion-data-driven production techniques; analyzes mainstream physics engines including ODE, PhysX, and Bullet; and studies collision detection algorithms such as OBB and AABB bounding volume hierarchies. Finally, a character animation based on ODE is implemented, simulating the motion and collision of a tricycle.
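
    The collision detection structures mentioned in this abstract (OBB and AABB bounding volume hierarchies) bottom out in a cheap box-overlap test. A minimal sketch of the AABB leaf test follows; the function and variable names are illustrative, not taken from the paper:

```python
# Axis-aligned bounding box (AABB) overlap test: two boxes intersect
# only if their intervals overlap along every axis. Boxes are given as
# (min_corner, max_corner) coordinate lists.
def aabb_overlap(a_min, a_max, b_min, b_max):
    """True if the two axis-aligned boxes intersect in every axis."""
    return all(lo1 <= hi2 and lo2 <= hi1
               for lo1, hi1, lo2, hi2 in zip(a_min, a_max, b_min, b_max))

hit = aabb_overlap([0, 0, 0], [1, 1, 1], [0.5, 0.5, 0.5], [2, 2, 2])
miss = aabb_overlap([0, 0, 0], [1, 1, 1], [2, 2, 2], [3, 3, 3])
```

    A hierarchy of such boxes lets a physics engine reject most non-colliding pairs cheaply before running exact per-primitive tests.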

  7. CONCEPTUAL MECHANICAL DESIGN METHOD BASED ON QUALITATIVE SIMULATION

    Institute of Scientific and Technical Information of China (English)

    Yang Jianming; Yang Ping; Zou Bailong; Yang Chengsan

    2005-01-01

    A model for the conceptual design of mechanical devices is studied based on qualitative simulation. In this model, the desired functions are expressed by state-transit diagrams (ST-diagrams) and the design space is represented by qualitative-state curves (QS-curves). The initial design idea, called the seed idea, is proposed by the designer and then abstracted into QS-curves. Qualitative simulation is implemented based on the QS-curves. By changing the motion of the acting parts, the connection of parts, and the motion of the driving part, new design ideas are generated. With this model, a series of new design ideas derived from the seed idea can be achieved.

  8. Finite element based simulation of dry sliding wear

    Science.gov (United States)

    Hegadekatte, V.; Huber, N.; Kraft, O.

    2005-01-01

    In order to predict wear and eventually the life-span of complex mechanical systems, several hundred thousand operating cycles have to be simulated. Therefore, a finite element (FE) post-processor is the optimum choice, considering the computational expense. A wear simulation approach based on Archard's wear law is implemented in an FE post-processor that works in association with a commercial FE package, ABAQUS, for solving the general deformable-deformable contact problem. Local wear is computed and then integrated over the sliding distance using the Euler integration scheme. The wear simulation tool works in a loop and performs a series of static FE-simulations with updated surface geometries to get a realistic contact pressure distribution on the contacting surfaces. It will be demonstrated that this efficient approach can simulate wear on both two-dimensional and three-dimensional surface topologies. The wear on both interacting surfaces is computed using the contact pressure distribution from a two-dimensional or three-dimensional simulation, depending on the case. After every wear step the geometry is re-meshed to correct the mesh deformed by wear, thus ensuring a fairly uniform mesh for further processing. The importance and suitability of such a wear simulation tool are discussed in this paper.
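
    As a rough illustration of the scheme this abstract describes, the sketch below accumulates Archard-law wear depth (dh = k · p · ds) with explicit Euler steps over the sliding distance. All names and numerical values are hypothetical; the real tool additionally re-solves the FE contact problem and re-meshes between wear steps:

```python
# Hypothetical sketch of Euler-integrated Archard wear, not the paper's code.
def wear_step(contact_pressure, wear_coeff, slide_increment):
    """Local wear depth accumulated over one sliding increment: k * p * ds."""
    return wear_coeff * contact_pressure * slide_increment

def simulate_wear(pressures, wear_coeff=1e-8, slide_increment=0.5, n_cycles=1000):
    """Euler-integrate wear depth at each surface node over many cycles."""
    depth = [0.0] * len(pressures)
    for _ in range(n_cycles):
        for i, p in enumerate(pressures):
            depth[i] += wear_step(p, wear_coeff, slide_increment)
        # In the real tool, the contact pressures would be re-computed by a
        # static FE solve on the updated geometry here; we keep them fixed.
    return depth

profile = simulate_wear([100.0, 80.0, 50.0])   # nodal contact pressures
```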

  9. Using Crowdsourced Geodata for Agent-Based Indoor Evacuation Simulations

    Directory of Open Access Journals (Sweden)

    Alexander Zipf

    2012-08-01

    Full Text Available Crowdsourced geodata has proven to be a rich and major data source for environmental simulations and analysis, as well as for the visualization of spatial phenomena. With the increasing size and complexity of public buildings, such as universities or hotels, there is also an increasing demand for information about indoor spaces. Trying to stimulate this growing demand, both researchers and Volunteered Geographic Information (VGI) communities envision extending established communities to indoor spaces. It has already been showcased that VGI from OpenStreetMap (OSM) can be utilized for different applications in Spatial Data Infrastructures (SDIs) as well as for simple shortest-path computations inside buildings. The research presented here tries to utilize crowdsourced indoor geodata for more complex indoor routing scenarios with multiple users. Essentially, it investigates whether, and to what extent, the available data can be utilized for performing indoor evacuation simulations with the simulation framework MATSim. That is, this paper investigates the suitability of crowdsourced indoor information from OSM (IndoorOSM) for evacuation simulations, as well as the applicability of MATSim for agent-based indoor evacuation simulations. The paper discusses the automatic generation of simulation-related data and provides experimental results for two different evacuation scenarios. Furthermore, limitations of the IndoorOSM data and the MATSim framework for indoor evacuation simulations are elaborated and discussed.

  10. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    Science.gov (United States)

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high-strength variants, exhibit limited formability at room temperature, and high-temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely the prediction of a material's forming limit under hot stamping conditions, and tool life prediction under multi-cycle loading conditions.

  11. A particle-based method for granular flow simulation

    KAUST Repository

    Chang, Yuanzhang

    2012-03-16

    We present a new particle-based method for granular flow simulation. In the method, a new elastic stress term, derived from a modified form of Hooke's law, is included in the momentum governing equation to handle the friction of granular materials. A viscosity force is also added to simulate dynamic friction, smoothing the velocity field and further maintaining simulation stability. Benefiting from the Lagrangian nature of the SPH method, large flow deformations can be handled easily and naturally. In addition, a signed distance field is employed to enforce the solid boundary condition. The experimental results show that the proposed method is effective and efficient for handling the flow of granular materials, and different kinds of granular behaviors can be simulated by adjusting just one parameter. © 2012 Science China Press and Springer-Verlag Berlin Heidelberg.
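
    The signed-distance-field boundary condition this abstract mentions can be sketched as follows (an illustrative construction, not the paper's code): a particle that has penetrated the solid (sdf < 0) is projected back to the surface along the distance gradient, and its inward normal velocity component is removed.

```python
# Toy SDF boundary enforcement for one particle in 2-D; names are invented.
def enforce_boundary(pos, vel, sdf, grad_sdf):
    d = sdf(pos)
    if d < 0.0:                      # particle is inside the solid
        n = grad_sdf(pos)            # outward unit normal (assumed normalized)
        pos = [p - d * c for p, c in zip(pos, n)]    # project to the surface
        vn = sum(v * c for v, c in zip(vel, n))      # normal velocity component
        if vn < 0.0:                 # moving further into the solid:
            vel = [v - vn * c for v, c in zip(vel, n)]   # remove that component
    return pos, vel

# Flat floor at y = 0: sdf(p) = p.y, outward normal = (0, 1).
floor_sdf = lambda p: p[1]
floor_grad = lambda p: [0.0, 1.0]
p, v = enforce_boundary([0.3, -0.2], [1.0, -2.0], floor_sdf, floor_grad)
```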

  12. Investigation of Diesel Engine Performance Based on Simulation

    Directory of Open Access Journals (Sweden)

    Semin

    2008-01-01

    Full Text Available Modeling and simulation of a single-cylinder four-stroke direct-injection diesel engine requires advanced analysis and development tools to evaluate engine performance. The simulation and model development in this research use the commercial GT-SUITE 6.2 software. A one-dimensional model of the single-cylinder four-stroke direct-injection diesel engine is developed, and the combustion performance in the engine cylinder is analyzed. The model covers the full engine cycle, consisting of intake, compression, power, and exhaust. With this model, the effect of engine speed (rpm) on diesel engine performance can be examined through simulation. The performance trends of the diesel engine model, based on the theoretical and computational model, are shown graphically in the paper.

  13. Strain System for the Motion Base Shuttle Mission Simulator

    Science.gov (United States)

    Huber, David C.; Van Vossen, Karl G.; Kunkel, Glenn W.; Wells, Larry W.

    2010-01-01

    The Motion Base Shuttle Mission Simulator (MBSMS) Strain System is an innovative engineering tool used to monitor the stresses applied to the MBSMS motion platform tilt pivot frames during motion simulations in real time. The Strain System comprises hardware and software produced by several different companies. The system utilizes a series of strain gages, accelerometers, orientation sensor, rotational meter, scanners, computer, and software packages working in unison. By monitoring and recording the inputs applied to the simulator, data can be analyzed if weld cracks or other problems are found during routine simulator inspections. This will help engineers diagnose problems as well as aid in repair solutions for both current as well as potential problems.

  14. Modeling and simulation of LHC beam-based collimator setup

    CERN Document Server

    Valentino, G; Assmann, R W; Burkart, F; Redaelli, S; Rossi, A; Lari, L

    2012-01-01

    In the 2011 Large Hadron Collider run, collimators were aligned for proton and heavy ion beams using a semiautomatic setup algorithm. The algorithm provided a reduction in the beam time required for setup, an elimination of beam dumps during setup and better reproducibility with respect to manual alignment. A collimator setup simulator was developed based on a Gaussian model of the beam distribution as well as a parametric model of the beam losses. A time-varying beam loss signal can be simulated for a given collimator movement into the beam. The simulation results and comparison to measurement data obtained during collimator setups and dedicated fills for beam halo scraping are presented. The simulator will then be used to develop a fully automatic collimator alignment algorithm.
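
    The Gaussian beam-distribution model lends itself to a compact sketch: as a collimator jaw steps inward, the loss spike at each step is proportional to the beam fraction newly intercepted between the old and new jaw positions. This is a toy illustration with invented parameters, not the CERN simulator; jaw positions are in units of the transverse beam sigma:

```python
import math

def beam_fraction_outside(x_sigma):
    """Fraction of a 1-D Gaussian beam beyond x_sigma on one side."""
    return 0.5 * math.erfc(x_sigma / math.sqrt(2.0))

def scan_losses(start=6.0, stop=3.0, step=0.5):
    """Loss signal per step as the jaw moves inward from start to stop."""
    losses, pos = [], start
    while pos - step >= stop - 1e-9:
        new_pos = pos - step
        # Newly scraped beam fraction between the old and new jaw positions.
        losses.append(beam_fraction_outside(new_pos) - beam_fraction_outside(pos))
        pos = new_pos
    return losses

spikes = scan_losses()   # loss spikes grow as the jaw approaches the core
```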

  15. Agent-Based Crowd Simulation of Daily Goods Traditional Markets

    Directory of Open Access Journals (Sweden)

    Purba D. Kusuma

    2016-10-01

    Full Text Available In a traditional market, buyers not only move from one place to another but also interact with traders to purchase their products. When a buyer interacts with a trader, he blocks some space in the corridor. Besides, while buyers are walking, they may be attracted by non-preferred traders, even though they have preferred traders. These situations have not been covered in most existing crowd simulation models, which mainly focus on crowd members' movement and hence cannot be directly applied to traditional market environments. This research emphasizes a crowd model that includes simplified movement and unplanned purchasing models. The model has been developed based on the intelligent agent concept, where each agent represents a buyer. Two traditional markets in Yogyakarta, Indonesia, namely Gedongkuning and Ngasem, are used for simulation in this research. The simulation shows that some places are visited more frequently than others. Overall, the simulation results match the situation found in the real world.

  16. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  17. Special focus on simulation: educational strategies in the NICU: simulation-based learning: it's not just for NRP.

    Science.gov (United States)

    Pilcher, Jobeth; Goodall, Heather; Jensen, Cynthia; Huwe, Valerie; Jewell, Cordelia; Reynolds, Regina; Karlsen, Kris A

    2012-01-01

    Simulations are experiential learning opportunities during which participants can learn new information, as well as have the opportunity to apply previous knowledge. While hands-on learning has been incorporated into NRP and similar training for some time, simulation-based learning is increasingly being utilized in new and varied situations. This article begins with a general overview of simulation, along with a brief review of the historical background of mannequins and simulation. This is followed by several mini-articles describing how the authors have applied simulation-based activities to promote learning. The article concludes with a look at the potential future of simulation-based education.

  18. Simulation of large-scale rule-based models

    Energy Technology Data Exchange (ETDEWEB)

    Hlavacek, William S [Los Alamos National Laboratory; Monnie, Michael I [Los Alamos National Laboratory; Colvin, Joshua [NON LANL; Faseder, James [NON LANL

    2008-01-01

    Interactions of molecules, such as signaling proteins, with multiple binding sites and/or multiple sites of post-translational covalent modification can be modeled using reaction rules. Rules comprehensively, but implicitly, define the individual chemical species and reactions that molecular interactions can potentially generate. Although rules can be automatically processed to define a biochemical reaction network, the network implied by a set of rules is often too large to generate completely or to simulate using conventional procedures. To address this problem, we present DYNSTOC, a general-purpose tool for simulating rule-based models. DYNSTOC implements a null-event algorithm for simulating chemical reactions in a homogeneous reaction compartment. The simulation method does not require that a reaction network be specified explicitly in advance, but rather takes advantage of the availability of the reaction rules in a rule-based specification of a network to determine if a randomly selected set of molecular components participates in a reaction during a time step. DYNSTOC reads reaction rules written in the BioNetGen language, which is useful for modeling protein-protein interactions involved in signal transduction. The method of DYNSTOC is closely related to that of STOCHSIM. DYNSTOC differs from STOCHSIM by allowing for model specification in terms of BNGL, which extends the range of protein complexes that can be considered in a model. DYNSTOC enables the simulation of rule-based models that cannot be simulated by conventional methods. We demonstrate the ability of DYNSTOC to simulate models accounting for multisite phosphorylation and multivalent binding processes that are characterized by large numbers of reactions. DYNSTOC is free for non-commercial use. The C source code, supporting documentation and example input files are available at .
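
    The null-event idea can be illustrated with a deliberately tiny example (not DYNSTOC itself, and with invented rates): at each step a random molecule is selected, and its rule fires with probability proportional to the rule's rate relative to a maximum rate; otherwise the step is a null event and nothing changes.

```python
import random

def null_event_sim(n_a=1000, k=0.3, p_max=1.0, steps=20000, seed=1):
    """Null-event stochastic simulation of the single rule A -> B."""
    random.seed(seed)
    state = ['A'] * n_a                  # every molecule starts in state A
    for _ in range(steps):
        i = random.randrange(len(state)) # pick a random molecule
        if state[i] == 'A' and random.random() < k / p_max:
            state[i] = 'B'               # rule fires: A -> B
        # else: null event, the state is left unchanged
    return state.count('B')

converted = null_event_sim()   # most molecules convert over 20000 steps
```

    The point of the null event is that no reaction network needs to be enumerated in advance: the rule is only evaluated against the randomly selected molecules at each step.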

  19. EIC detector simulations in FairRoot framework

    Science.gov (United States)

    Kiselev, Alexander; eRHIC task force Team

    2013-10-01

    The long-term RHIC facility upgrade plan foresees the addition of a high-energy electron beam to the existing hadron accelerator complex thus converting RHIC into an Electron-Ion Collider (eRHIC). A dedicated EIC detector, designed to efficiently register and identify deep inelastic electron scattering (DIS) processes in a wide range of center-of-mass energies is one of the key elements of this upgrade. Detailed Monte-Carlo studies are needed to optimize EIC detector components and to fine tune their design. The simulation package foreseen for this purpose (EicRoot) is based on the FairRoot framework developed and maintained at the GSI. A feature of this framework is its level of flexibility, allowing one to switch easily between different geometry (ROOT, GEANT) and transport (GEANT3, GEANT4, FLUKA) models. Apart from providing a convenient simulation environment the framework includes basic tools for visualization and allows for easy sharing of event reconstruction codes between higher level experiment-specific applications. The description of the main EicRoot features and first simulation results will be the main focus of the talk.

  20. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  1. Engineering-Based Thermal CFD Simulations on Massive Parallel Systems

    Directory of Open Access Journals (Sweden)

    Jérôme Frisch

    2015-05-01

    Full Text Available The development of parallel Computational Fluid Dynamics (CFD codes is a challenging task that entails efficient parallelization concepts and strategies in order to achieve good scalability values when running those codes on modern supercomputers with several thousands to millions of cores. In this paper, we present a hierarchical data structure for massive parallel computations that supports the coupling of a Navier–Stokes-based fluid flow code with the Boussinesq approximation in order to address complex thermal scenarios for energy-related assessments. The newly designed data structure is specifically designed with the idea of interactive data exploration and visualization during runtime of the simulation code; a major shortcoming of traditional high-performance computing (HPC simulation codes. We further show and discuss speed-up values obtained on one of Germany’s top-ranked supercomputers with up to 140,000 processes and present simulation results for different engineering-based thermal problems.

  2. Engineering-Based Thermal CFD Simulations on Massive Parallel Systems

    KAUST Repository

    Frisch, Jérôme

    2015-05-22

    The development of parallel Computational Fluid Dynamics (CFD) codes is a challenging task that entails efficient parallelization concepts and strategies in order to achieve good scalability values when running those codes on modern supercomputers with several thousands to millions of cores. In this paper, we present a hierarchical data structure for massive parallel computations that supports the coupling of a Navier–Stokes-based fluid flow code with the Boussinesq approximation in order to address complex thermal scenarios for energy-related assessments. The newly designed data structure is specifically designed with the idea of interactive data exploration and visualization during runtime of the simulation code; a major shortcoming of traditional high-performance computing (HPC) simulation codes. We further show and discuss speed-up values obtained on one of Germany’s top-ranked supercomputers with up to 140,000 processes and present simulation results for different engineering-based thermal problems.
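
    In the Boussinesq approximation used in both records above, density variations are neglected everywhere except in the buoyancy term, which for a fluid parcel at temperature T contributes a vertical acceleration g · β · (T − T_ref). A one-function sketch with illustrative coefficient values (not taken from the paper):

```python
def boussinesq_buoyancy(temperature, t_ref=293.15, beta=3.4e-3, g=9.81):
    """Vertical buoyancy acceleration (m/s^2) for a parcel at temperature T.

    beta is the thermal expansion coefficient (1/K); positive result means
    an upward force on a parcel warmer than the reference temperature.
    """
    return g * beta * (temperature - t_ref)

a_up = boussinesq_buoyancy(303.15)   # parcel 10 K warmer than reference
```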

  3. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

    Science.gov (United States)

    Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)

    2003-01-01

    Advanced high-temperature Ceramic Matrix Composites (CMC) hold enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost- and time-prohibitive. Computational simulation then becomes very useful to limit the experimental effort and reduce the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples and their use in the design/analysis of certain structural components, is the subject of this presentation.

  4. Measurements and simulation of induced activity at the CERN-EU high- energy reference field facility

    CERN Document Server

    Brugger, M; Mitaroff, W A; Roesler, S

    2003-01-01

    Samples of aluminum, copper, stainless steel, iron, boron nitride, carbon composite and water were irradiated by the stray radiation field produced by interactions of high-energy hadrons in a copper target. The specific activity induced in the samples was measured by gamma spectrometry. In addition, the isotope production in the samples was calculated with detailed Monte-Carlo simulations using the FLUKA code. Results of the simulation are in reasonable agreement with the measured activities. 7 Refs.

  5. Simulation-Based Evaluation of Learning Sequences for Instructional Technologies

    Science.gov (United States)

    McEneaney, John E.

    2016-01-01

    Instructional technologies critically depend on systematic design, and learning hierarchies are a commonly advocated tool for designing instructional sequences. But hierarchies routinely allow numerous sequences and choosing an optimal sequence remains an unsolved problem. This study explores a simulation-based approach to modeling learning…

  6. Web-Based Modelling and Collaborative Simulation of Declarative Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Marquard, Morten; Shahzad, Muhammad

    2015-01-01

    two years we have taken this adoption of DCR Graphs to the next level and decided to treat the notation as a product of its own by developing a stand-alone web-based collaborative portal for the modelling and simulation of declarative workflows. The purpose of the portal is to facilitate end...

  7. Toward Developing Authentic Leadership: Team-Based Simulations

    Science.gov (United States)

    Shapira-Lishchinsky, Orly

    2014-01-01

    Although there is a consensus that authentic leadership should be an essential component in educational leadership, no study to date has ever tried to find whether team-based simulations may promote authentic leadership. The purpose of this study was to identify whether principal trainees can develop authentic leadership through ethical decision…

  8. Self-directed simulation-based training of emergency cricothyroidotomy

    DEFF Research Database (Denmark)

    Melchiors, Jacob; Todsen, Tobias; Mørkeberg Nilsson, Philip

    2016-01-01

    The emergency cricothyroidotomy (EC) is a critical procedure. The high cost of failures increases the demand for evidence-based training methods. The aim of this study was to present and evaluate self-directed video-guided simulation training. Novice doctors were given an individual 1-h simulatio...

  9. Simulation-Based Constructivist Approach for Education Leaders

    Science.gov (United States)

    Shapira-Lishchinsky, Orly

    2015-01-01

    The purpose of this study was to reflect the leadership strategies that may arise using a constructivist approach based on organizational learning. This approach involved the use of simulations that focused on ethical tensions in school principals' daily experiences, and the development of codes of ethical conduct to reduce these tensions. The…

  10. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  11. Spreadsheet-Based Program for Simulating Atomic Emission Spectra

    Science.gov (United States)

    Flannigan, David J.

    2014-01-01

    A simple Excel spreadsheet-based program for simulating atomic emission spectra from the properties of neutral atoms (e.g., energies and statistical weights of the electronic states, electronic partition functions, transition probabilities, etc.) is described. The contents of the spreadsheet (i.e., input parameters, formulas for calculating…
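
    The kind of calculation such a spreadsheet performs can be sketched from the Boltzmann relation I ∝ g · A · exp(−E_upper/kT) / Z, with Z the electronic partition function. In this hedged sketch, the level data are invented (not real atomic constants) and Z is built from the listed levels only:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def line_intensities(levels, temperature):
    """levels: list of (g, A, E_upper_eV) tuples; returns normalized intensities.

    g: statistical weight, A: transition probability (1/s), E: upper-level energy.
    """
    kt = K_B_EV * temperature
    z = sum(g * math.exp(-e / kt) for g, _, e in levels)    # partition function
    raw = [g * a * math.exp(-e / kt) / z for g, a, e in levels]
    total = sum(raw)
    return [r / total for r in raw]                          # normalize to 1

# Two hypothetical levels: (weight, transition probability, energy in eV).
rel = line_intensities([(2, 1e8, 2.0), (4, 5e7, 3.0)], temperature=5000.0)
```

    At 5000 K the lower-energy, higher-A transition dominates; raising the temperature shifts intensity toward the higher-lying level, which is the trend such a spreadsheet lets students explore.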

  12. Approach channels: risk- and simulation-based design

    NARCIS (Netherlands)

    Nuygen, M.Q.

    2008-01-01

    The aim of this study is to develop new interpretation methods and overall risk assessment models for navigation aspects in association with waterway design. The methods, which are based on the results achieved from ship maneuvering simulation and numerical models, address two ship accident scenario

  13. Porting a Java-based Brain Simulation Software to C++

    CERN Document Server

    CERN. Geneva

    2015-01-01

    A currently available software solution to simulate neural development is Cx3D. However, this software is Java-based, and not ideal for high performance computing. This talk presents our step-by-step porting approach, that uses SWIG as a tool to interface C++ code from Java.

  14. Virtual Reality: Toward Fundamental Improvements in Simulation-Based Training.

    Science.gov (United States)

    Thurman, Richard A.; Mattoon, Joseph S.

    1994-01-01

    Considers the role and effectiveness of virtual reality in simulation-based training. The theoretical and practical implications of verity, integration, and natural versus artificial interface are discussed; a three-dimensional classification scheme for virtual reality is described; and the relationship between virtual reality and other…

  15. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  16. Students' Emotions in Simulation-Based Medical Education

    Science.gov (United States)

    Keskitalo, Tuulikki; Ruokamo, Heli

    2017-01-01

    Medical education is emotionally charged for many reasons, especially the fact that simulation-based learning is designed to generate emotional experiences. However, there are very few studies that concentrate on learning and emotions, despite widespread interest in the topic, especially within healthcare education. The aim of this research is to…

  17. Three-dimensional Microstructure Simulation Model of Cement Based Materials,

    NARCIS (Netherlands)

    Ye, G.; Van Breugel, K.

    2003-01-01

    This paper describes a computer-based numerical model for the simulation of the development of microstructure during cement hydration. Special emphasis is on the algorithm for characterizing the pores. This includes the porosity and the pore size distribution and the topological properties of the po

  18. Numerical simulation of base flow with hot base bleed for two jet models

    OpenAIRE

    Wen-jie Yu; Yong-gang Yu; Bin Ni

    2014-01-01

    In order to improve the benefits of base bleed in base flow field, the base flow with hot base bleed for two jet models is studied. Two-dimensional axisymmetric Navier–Stokes equations are computed by using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated with the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an...

  19. Fast spot-based multiscale simulations of granular drainage

    Energy Technology Data Exchange (ETDEWEB)

    Rycroft, Chris H.; Wong, Yee Lok; Bazant, Martin Z.

    2009-05-22

    We develop a multiscale simulation method for dense granular drainage, based on the recently proposed spot model, where the particle packing flows by local collective displacements in response to diffusing "spots" of interstitial free volume. By comparing with discrete-element method (DEM) simulations of 55,000 spheres in a rectangular silo, we show that the spot simulation is able to approximately capture many features of drainage, such as packing statistics, particle mixing, and flow profiles. The spot simulation runs two to three orders of magnitude faster than DEM, making it an appropriate method for real-time control or optimization. We demonstrate extensions for modeling particle heaping and avalanching at the free surface, and for simulating the boundary layers of slower flow near walls. We show that the spot simulations are robust and flexible, by demonstrating that they can be used in both event-driven and fixed-timestep approaches, and showing that the elastic relaxation step used in the model can be applied much less frequently and still create good results.
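
    A toy version of the spot mechanism (invented geometry and parameters, not the paper's implementation): a spot of free volume random-walks upward from the orifice, and every particle within its radius is displaced slightly downward, opposite to the spot's motion.

```python
import random

def drain_one_spot(particles, spot_radius=1.5, spot_step=1.0,
                   influence=0.1, height=20.0, seed=7):
    """Propagate one free-volume spot upward; particles near it settle down."""
    random.seed(seed)
    spot_x, spot_y = 0.0, 0.0           # spot enters at the orifice
    while spot_y < height:
        spot_x += random.uniform(-0.5, 0.5)   # small lateral diffusion
        spot_y += spot_step                   # net upward drift
        for p in particles:                   # particles in range move down a bit
            if (p[0] - spot_x) ** 2 + (p[1] - spot_y) ** 2 < spot_radius ** 2:
                p[1] -= influence
    return particles

pack = [[0.0, float(y)] for y in range(1, 11)]   # a vertical column of particles
drained = drain_one_spot(pack)
```

    Because each spot moves many particles a small amount, the packing drains collectively without per-contact force computations, which is the source of the speed-up over DEM.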

  20. Individual based simulations of bacterial growth on agar plates

    Science.gov (United States)

    Ginovart, M.; López, D.; Valls, J.; Silbert, M.

    2002-03-01

    The individual based simulator, INDividual DIScrete SIMulations (INDISIM), has been used to study the growth behaviour of bacterial colonies on a finite dish. The simulations reproduce the qualitative trends of pattern formation that appear during the growth of Bacillus subtilis on an agar plate under different initial conditions of nutrient peptone concentration, the amount of agar on the plate, and the temperature. The simulations are carried out by imposing closed boundary conditions on a square lattice divided into square spatial cells. The simulator makes it possible to study the temporal evolution of the bacterial population by setting rules of behaviour for each bacterium, such as its uptake, metabolism and reproduction, as well as rules for the medium in which the bacterial cells grow, such as the concentration of nutrient particles and their diffusion. The determining factors that characterize the structure of the bacterial colony patterns in the present simulations are the initial concentration of nutrient particles, which mimics the amount of peptone in the experiments, and the set of values for the microscopic diffusion parameter, related in the experiments to the amount of the agar medium.
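    The per-bacterium rules (uptake, metabolism, reproduction) can be sketched as follows. This is a hypothetical minimal rule set in the INDISIM spirit, not the published model; nutrient diffusion and cell motion are omitted, and all names and parameter values are illustrative.

```python
def indisim_step(nutrient, bacteria, uptake=1.0, division_mass=10.0):
    """One time step of a minimal INDISIM-like rule set (a sketch; nutrient
    diffusion and cell motion are omitted). `nutrient` maps lattice cell ->
    nutrient particles; `bacteria` is a list of [cell, biomass] pairs."""
    newborn = []
    for b in bacteria:
        cell, mass = b
        eaten = min(uptake, nutrient.get(cell, 0))   # uptake rule
        nutrient[cell] = nutrient.get(cell, 0) - eaten
        b[1] = mass + eaten                          # metabolism: grow
        if b[1] >= division_mass:                    # reproduction by fission
            b[1] /= 2.0
            newborn.append([cell, b[1]])
    bacteria.extend(newborn)
```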

  1. Dynamic Simulation of Single DNA Molecule at the Base Level

    Institute of Scientific and Technical Information of China (English)

    LEI Xiao-Ling; WANG Xiao-Feng; HU Jun; FANG Hai-Ping

    2005-01-01

    A mesoscopic discrete dsDNA model at the base level is proposed, based on the statistical model of Phys. Rev. Lett. 82 (1999) 4560. The numerical simulations reproduce the 65 pN plateau, and the simulated force-extension curves for different supercoiling degrees agree well with the experimental data. Our model has potential applications in the study of short DNA segments and provides a bridge between statistical models and atomistic modelling.

  2. Dynamic Garment Simulation based on Hybrid Bounding Volume Hierarchy

    Directory of Open Access Journals (Sweden)

    Zhu Dongyong

    2016-12-01

    In order to address the computing speed and efficiency problems of existing dynamic clothing simulation, this paper presents a dynamic garment simulation based on a hybrid bounding volume hierarchy. It first uses MCASG graph theory to perform a primary segmentation of a given three-dimensional human body model. It then applies K-means clustering in a secondary segmentation to collect the human body's upper arms, lower arms, upper legs, lower legs, trunk, hip and, for women, the chest as the elementary units of dynamic clothing simulation. According to the shapes of these elementary units, it chooses the closest and most efficient hybrid bounding box to enclose each unit, such as a cylinder bounding box or an elliptic cylinder bounding box. During the construction of these bounding boxes, the least squares method and slices of the human body are used to obtain the related parameters. This approach makes it possible to use the smallest number of bounding boxes to create tight collision detection regions around the human body. A spring-mass model based on a triangular mesh of the clothing model is finally constructed for dynamic simulation. The simulation results show the feasibility and superiority of the described method.

  3. Risk Reduction and Training using Simulation Based Tools - 12180

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Irin P. [Newport News Shipbuilding, Newport News, Virginia 23607 (United States)

    2012-07-01

    Process Modeling and Simulation (M&S) has been used for many years in manufacturing and similar domains as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time- and cost-prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M&S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M&S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition.

  4. GPSIM: A Personal Computer-Based GPS Simulator System

    Science.gov (United States)

    Ibrahim, D.

    Global Positioning Systems (GPS) are now in use in many applications, ranging from GIS to route guidance, automatic vehicle location (AVL), air, land, and marine navigation, and many other transportation and geographically based applications. In many applications, the GPS receiver is connected to some form of intelligent electronic system which receives the positional data from the GPS unit and then performs the required operation. When developing and testing GPS-based systems, one of the problems is that it is usually necessary to create GPS-compatible geographical data to simulate GPS operation in real time. This paper provides the details of a Personal Computer (PC)-based GPS simulator system called GPSIM. The system receives user waypoints and routes from Windows-based screen forms and then simulates GPS operation in real time by generating most of the commonly used GPS sentences. The user-specified waypoints are divided into a number of small segments, each segment specifying a small distance in the direction of the original waypoint. The GPS sentence corresponding to the geographical coordinates of each segment is then sent out of the PC serial port. The system described is an invaluable testing tool for GPS-based system developers and also for people learning to use GPS-based products.
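    Generating "GPS-compatible" sentences means emitting NMEA 0183 strings, each of which ends with a checksum computed as the XOR of all character codes between the leading '$' and the '*'. A minimal sketch of that framing step (the simulator's actual sentence formatting is not shown in the abstract):

```python
def nmea_sentence(body):
    """Wrap an NMEA 0183 sentence body with '$' and '*' plus the checksum,
    which is the XOR of all character codes between '$' and '*'."""
    checksum = 0
    for ch in body:
        checksum ^= ord(ch)
    return "${}*{:02X}".format(body, checksum)
```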

  5. Simulation of Naval Guns' Breechblock System Dynamics Based on ADAMS

    Science.gov (United States)

    Tan, Bo; Liu, Hui-Min; Liu, Kai

    In order to study the dynamic characteristics of the breechblock system during gun firing, a virtual prototype model was established based on ADAMS, in which motion and force transmission among mechanisms are realized by collision. Through simulation, the kinematic and dynamic properties of the main components are obtained, and the relationships between the motion of the breechblock and the position of the breechblock opening plate are analyzed. According to the simulation results, the collision between the breechblock opening plate and the roller is discontinuous, which may cause the breechblock system to fail to latch the breechblock reliably. Within the allowable scope of the structure, the breechblock opening plate should therefore be installed as close to the upper side as possible.

  6. Simulated Performance of a Strip-Based Upgraded VELO

    CERN Document Server

    Bird, T; Coutinho, R; Eklund, L; Ferro-Luzzi, M; Hutchcroft, D; Hynds, D; Latham, T; Schindler, H

    2014-01-01

    In order to evaluate the expected performance of an upgraded Vertex Locator based on silicon-strip sensors, two different sensor layouts and two different thermo-mechanical module designs were implemented in the LHCb simulation framework. In this note, the digitisation, clustering and track reconstruction algorithms are described, and the simulated performance, at beam conditions corresponding to instantaneous luminosities of $1$ and $2 \times 10^{33}\,\mathrm{cm}^{-2}\mathrm{s}^{-1}$, is compared to that of the current Vertex Locator. Results are presented on the material budget, strip and cluster occupancies, tracking efficiencies as well as on primary vertex, impact parameter and decay time resolutions.

  7. Monte-Carlo simulations of neutron shielding for the ATLAS forward region

    CERN Document Server

    Stekl, I; Kovalenko, V E; Vorobel, V; Leroy, C; Piquemal, F; Eschbach, R; Marquet, C

    2000-01-01

    The effectiveness of different types of neutron shielding for the ATLAS forward region has been studied by means of Monte-Carlo simulations and compared with the results of an experiment performed at the CERN PS. The simulation code is based on GEANT, FLUKA, MICAP and GAMLIB. GAMLIB is a new library including processes with gamma-rays produced in (n, gamma), (n, n'gamma) neutron reactions and is interfaced to the MICAP code. The effectiveness of different types of shielding against neutrons and gamma-rays, composed from different types of material, such as pure polyethylene, borated polyethylene, lithium-filled polyethylene, lead and iron, were compared. The results from Monte-Carlo simulations were compared to the results obtained from the experiment. The simulation results reproduce the experimental data well. This agreement supports the correctness of the simulation code used to describe the generation, spreading and absorption of neutrons (up to thermal energies) and gamma-rays in the shielding materials....

  8. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem.

    Science.gov (United States)

    Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen

    2016-01-01

    Simulated annealing (SA) is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
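    The list-based cooling idea can be sketched on a generic minimization problem (rather than TSP). This is a hedged reading of the scheme, not the authors' code: the Metropolis test always uses the current maximum temperature in the list, and the adaptation rule shown here, replacing that maximum with the temperature that would have produced exactly the observed acceptance, is one plausible instantiation of "adapted iteratively".

```python
import math, random

def lbsa_minimize(f, x0, neighbor, temps, iters=2000, rng=random):
    """List-based SA sketch: the Metropolis test uses the current maximum
    temperature in the list; when a worse candidate is accepted, that
    maximum is replaced by the (smaller) temperature that would have
    produced exactly the observed acceptance, so the list cools adaptively."""
    temps = sorted(temps)                 # ascending: temps[-1] is the max
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = f(y)
        delta = fy - fx
        if delta <= 0:
            x, fx = y, fy                 # always accept improvements
        else:
            r = rng.random()
            if r > 0.0 and r < math.exp(-delta / temps[-1]):
                x, fx = y, fy
                temps[-1] = -delta / math.log(r)   # adapt the list
                temps.sort()
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest
```

Used on a toy 1-D quadratic, the temperature list collapses toward zero as uphill moves are accepted, so the search turns greedy over time.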

  9. AUV-Based Plume Tracking: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Awantha Jayasiri

    2016-01-01

    This paper presents a simulation study of an autonomous underwater vehicle (AUV) navigation system operating in a GPS-denied environment. The AUV navigation method makes use of underwater transponder positioning and requires only one transponder. A multirate unscented Kalman filter is used to determine the AUV orientation and position by fusing high-rate sensor data and low-rate information. The paper also proposes a novel, efficient, and adaptive gradient-based algorithm for plume boundary tracking missions. The algorithm follows a centralized approach and includes path optimization features based on gradient information. The proposed algorithm is implemented in simulation on the AUV-based navigation system and successful boundary tracking results are obtained.
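    A generic gradient-based boundary tracker, sketched below, illustrates the idea (this is not the paper's algorithm; the steering law, step size and gain are illustrative assumptions): the vehicle moves along the tangent of the concentration iso-contour, with a correction term proportional to the concentration error steering it back onto the boundary.

```python
import math

def boundary_step(pos, conc, c_ref, step=0.05, gain=2.0, eps=1e-4):
    """One step of a generic gradient-based boundary tracker (a sketch of
    the idea, not the paper's algorithm): move along the iso-contour
    tangent, with a correction steering back toward the c = c_ref contour."""
    x, y = pos
    # finite-difference gradient of the concentration field
    gx = (conc(x + eps, y) - conc(x - eps, y)) / (2.0 * eps)
    gy = (conc(x, y + eps) - conc(x, y - eps)) / (2.0 * eps)
    norm = math.hypot(gx, gy) or 1.0
    gx, gy = gx / norm, gy / norm          # unit gradient (toward the source)
    err = conc(x, y) - c_ref               # >0 inside the contour, <0 outside
    dx = -gy - gain * err * gx             # tangent minus error correction
    dy = gx - gain * err * gy
    d = math.hypot(dx, dy) or 1.0
    return (x + step * dx / d, y + step * dy / d)
```

On a radial Gaussian plume, repeated steps circle the chosen iso-contour while staying close to it.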

  10. Manufacturing Resource Planning Technology Based on Genetic Programming Simulation

    Institute of Scientific and Technical Information of China (English)

    GAO Shiwen; LIAO Wenhe; GUO Yu; LIU Jinshan; SU Yan

    2009-01-01

    Network-based manufacturing is a kind of distributed system which enables manufacturers to finish production tasks and grasp opportunities in the market even when their own manufacturing resources are insufficient. One of the main problems in network-based manufacturing is allocating resources and assigning tasks rationally according to a flexible resource distribution. The mapping rules and relations between production techniques and resources are proposed, followed by the definition of the resource unit. A genetic programming method for the optimization of the manufacturing system is then put forward. A set of software tools for the simulation-based optimization system using genetic programming techniques has been developed, and the problems of manufacturing resource planning in network-based manufacturing are solved by simulating the optimization methods. Optimal proposals for hardware planning, company selection and scheduling can thus be obtained to support company managers in scientific decision-making.

  11. SIMULATED ANNEALING BASED POLYNOMIAL TIME QOS ROUTING ALGORITHM FOR MANETS

    Institute of Scientific and Technical Information of China (English)

    Liu Lianggui; Feng Guangzeng

    2006-01-01

    Multi-constrained Quality-of-Service (QoS) routing is a big challenge for Mobile Ad hoc Networks (MANETs), where the topology may change constantly. In this paper a novel QoS Routing Algorithm based on Simulated Annealing (SA_RA) is proposed. This algorithm first uses an energy function to translate multiple QoS weights into a single mixed metric and then seeks to find a feasible path by simulated annealing. The paper outlines the simulated annealing algorithm and analyzes the problems met when applying it to QoS Routing (QoSR) in MANETs. Theoretical analysis and experimental results demonstrate that the proposed method is an effective approximation algorithm, showing better performance than other pertinent algorithms in seeking the (approximately) optimal configuration within polynomial time.

  12. A CUDA based parallel multi-phase oil reservoir simulator

    Science.gov (United States)

    Zaza, Ayham; Awotunde, Abeeb A.; Fairag, Faisal A.; Al-Mouhamed, Mayez A.

    2016-09-01

    Forward Reservoir Simulation (FRS) is a challenging process that models fluid flow and mass transfer in porous media to draw conclusions about the behavior of certain flow variables and well responses. Besides the operational cost associated with matrix assembly, FRS repeatedly solves huge, computationally expensive sparse linear systems that are ill-conditioned and unsymmetrical. Moreover, as the computation for practical reservoir dimensions takes a long time, speeding up the process by taking advantage of parallel platforms is indispensable. Considering the state-of-the-art advances in massively parallel computing and the accompanying parallel architectures, this work aims primarily at developing a CUDA-based parallel simulator for oil reservoirs. In addition to the initially reported 33-fold speed gain compared to the serial version, experiments showed that BiCGSTAB is a stable and fast solver which could be incorporated in such simulations instead of the more expensive, storage-demanding and usually utilized GMRES.

  13. Particle-based simulations of self-motile suspensions

    CERN Document Server

    Hinz, Denis F; Kim, Tae-Yeon; Fried, Eliot

    2013-01-01

    A simple model for simulating flows of active suspensions is investigated. The approach is based on dissipative particle dynamics. While the model is potentially applicable to a wide range of self-propelled particle systems, the specific class of self-motile bacterial suspensions is considered as a modeling scenario. To mimic the rod-like geometry of a bacterium, two dissipative particle dynamics particles are connected by a stiff harmonic spring to form an aggregate dissipative particle dynamics molecule. Bacterial motility is modeled through a constant self-propulsion force applied along the axis of each such aggregate molecule. The model accounts for hydrodynamic interactions between self-propelled agents through the pairwise dissipative interactions conventional to dissipative particle dynamics. Numerical simulations are performed using a customized version of the open-source LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) software package. Detailed studies of the influence of agent con...

  14. Tools for evaluating team performance in simulation-based training.

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-10-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine.

  16. A Simulation Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    NASA's human space program has developed considerable experience with near Earth space operations. Although NASA has experience with deep space robotic missions, it has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will involve durations measured in months and communication latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these mission durations and communication latencies for vehicle design, mission design and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays up to 15 minutes each way; 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission, between July 18th and August 3rd, 2016.

  17. Web-based VR training simulator for percutaneous rhizotomy.

    Science.gov (United States)

    Li, Y; Brodlie, K; Phillips, N

    2000-01-01

    Virtual Reality offers great potential for surgical training--yet is typically limited by the dedicated and expensive equipment required. Web-based VR has the potential to offer a much cheaper alternative, in which simulations of fundamental techniques are downloaded from a server to run within a web browser. The equipment requirement is modest--an Internet-connected PC or small workstation--and the simulation can be accessed worldwide. In a collaboration between computer scientists and neurosurgeons, we have studied the use of web-based VR to train neurosurgeons in percutaneous rhizotomy--a treatment for the intractable facial pain which occurs in trigeminal neuralgia. This involves the insertion of a needle so as to puncture the foramen ovale and lesion the nerve. Our simulation uses VRML to provide a 3D visualization environment, but the work immediately exposes a key limitation of VRML for surgical simulation. VRML does not support collision detection between objects--only between viewpoint and object. Thus collision between needle and skull cannot be detected and fed back to the trainee. We have developed a novel solution in which the training simulation has linked views: a normal view, plus a view as seen from the tip of the needle. Collision detection is captured in the needle view and fed back to the viewer. A happy consequence of this approach has been the chance to aid the trainee with this additional view from the needle tip, which helps locate the foramen ovale. The technology to achieve this is Java software communicating with the VRML worlds through the External Authoring Interface (EAI). The training simulator is available on the Web, with an accompanying tutorial on its use. A major advantage of web-based VR is that the techniques generalize to a whole range of surgical simulations. Thus we have been able to use exactly the same approach as described above for neurosurgery to develop a shoulder arthroscopy simulator--where again collision detection, and

  18. HSC simulations of coal based DR in ULCORED

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, L.; Larsson, M. [Mefos Metallurgical Research Plant, Lulea (Sweden)

    2009-10-15

    The ULCORED coal based concept is simulated based on the production of syngas using existing coal gasification technology. The shifter gives the option to produce CO₂-lean H₂ from coal/biomass for in-plant use. Large CO₂ emissions arise on site from the use of natural gas in heating ovens and from the use of electricity in EAF melting. In the case of these coal based systems, production of 'excess gas' to be used as fuel gas in various processes will reduce the CO₂ emissions for the total site.

  19. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems.  Sampling-based simulation techniques are now an invaluable tool for exploring statistical models.  This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods.  It also includes some advanced met
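    The flavour of the sampling-based techniques the book introduces can be shown with the textbook example: estimating pi from uniform random points (a generic illustration, not an example taken from the book).

```python
import random

def monte_carlo_pi(n, rng=random):
    """Classic sampling-based estimate of pi: four times the fraction of
    uniform points in the unit square that land inside the quarter circle."""
    inside = 0
    for _ in range(n):
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0:
            inside += 1
    return 4.0 * inside / n
```

The estimate's standard error shrinks as 1/sqrt(n), the characteristic Monte Carlo convergence rate.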

  20. A virtual reality based simulator for learning nasogastric tube placement.

    Science.gov (United States)

    Choi, Kup-Sze; He, Xuejian; Chiang, Vico Chung-Lim; Deng, Zhaohong

    2015-02-01

    Nasogastric tube (NGT) placement is a common clinical procedure where a plastic tube is inserted into the stomach through the nostril for feeding or drainage. However, the placement is a blind process in which the tube may be mistakenly inserted into other locations, leading to unexpected complications or fatal incidents. The placement techniques are conventionally acquired by practising on unrealistic rubber mannequins or on humans. In this paper, a virtual reality based training simulation system is proposed to facilitate the training of NGT placement. It focuses on the simulation of tube insertion and the rendering of the feedback forces with a haptic device. A hybrid force model is developed to compute the forces analytically or numerically under different conditions, including the situations when the patient is swallowing or when the tube is buckled at the nostril. To ensure real-time interactive simulations, an offline simulation approach is adopted to obtain the relationship between the insertion depth and insertion force using a non-linear finite element method. The offline dataset is then used to generate real-time feedback forces by interpolation. The virtual training process is logged quantitatively with metrics that can be used for assessing objective performance and tracking progress. The system has been evaluated by nursing professionals. They found that the haptic feeling produced by the simulated forces is similar to their experience during real NGT insertion. The proposed system provides a new educational tool to enhance conventional training in NGT placement.
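    The offline/online split described above, an offline finite element run producing a depth-force relationship that is interpolated at haptic rates, can be sketched as a simple table lookup (the table values below are illustrative, not from the paper):

```python
from bisect import bisect_left

def feedback_force(depth, depths, forces):
    """Look up the haptic feedback force by linear interpolation in an
    offline-computed (insertion depth -> force) table. `depths` must be
    sorted ascending; out-of-range depths clamp to the table ends."""
    if depth <= depths[0]:
        return forces[0]
    if depth >= depths[-1]:
        return forces[-1]
    i = bisect_left(depths, depth)
    t = (depth - depths[i - 1]) / (depths[i] - depths[i - 1])
    return forces[i - 1] + t * (forces[i] - forces[i - 1])
```

Because each lookup is O(log n), the interpolation comfortably meets the kilohertz update rates haptic rendering typically requires.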

  1. Hospital occupancy and discharge strategies: a simulation-based study.

    Science.gov (United States)

    Qin, Shaowen; Thompson, Campbell; Bogomolov, Tim; Ward, Dale; Hakendorf, Paul

    2017-08-01

    Increasing demand for hospital services has resulted in more arrivals to the emergency department (ED), increased admissions and, quite often, access block and ED congestion, along with patient dissatisfaction. Cost constraints limit an increase in the number of hospital beds, so alternative solutions need to be explored. Our aim was to propose and test different discharge strategies which could potentially reduce occupancy rates in the hospital, thereby improving patient flow and minimising the frequency and duration of congestion episodes. We used a simulation approach based on HESMAD (Hospital Event Simulation Model: Arrivals to Discharge), a sophisticated simulation model capturing patient flow through a large Australian hospital from arrival at the ED to discharge. A set of simulation experiments with a range of proposed discharge strategies was carried out. The results were tabulated, analysed and compared using common hospital occupancy indicators. Simulation results demonstrated that it is possible to significantly reduce the number of days on which a hospital runs above its base bed capacity: in our case study, from 281.5 to 22.8 days under the best scenario, with smaller reductions under the other scenarios considered. Some relatively simple strategies, such as 24-hour discharge or discharge/relocation of long-staying patients, can significantly reduce overcrowding and improve hospital occupancy rates. Shortening administrative and/or some treatment processes has a smaller effect, although the latter could be easier to implement. © 2017 Royal Australasian College of Physicians.
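    The "days above base bed capacity" indicator and the long-stay discharge strategy can be illustrated with a toy occupancy model (far simpler than HESMAD; all numbers are illustrative): patients arrive daily with a sampled length of stay, and capping that stay mimics discharging or relocating long-staying patients.

```python
import random

def days_over_capacity(sample_los, capacity, arrivals_per_day, horizon, rng):
    """Toy occupancy model (a sketch, far simpler than HESMAD): a fixed
    number of patients arrives each day, each with a sampled length of
    stay in days; count the days the census exceeds the bed capacity."""
    discharges = [0] * (horizon + 60)     # beds freed on each future day
    census, days_over = 0, 0
    for day in range(horizon):
        census -= discharges[day]
        for _ in range(arrivals_per_day):
            census += 1
            discharges[day + sample_los(rng)] += 1
        if census > capacity:
            days_over += 1
    return days_over
```

Running the same arrival stream with a capped length-of-stay sampler never gives more over-capacity days than the baseline, since every patient is discharged no later.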

  2. HIRESSS: a physically based slope stability simulator for HPC applications

    Directory of Open Access Journals (Sweden)

    G. Rossi

    2013-01-01

    HIRESSS (HIgh REsolution Slope Stability Simulator) is a physically based distributed slope stability simulator for analyzing shallow landslide triggering conditions in real time and over large areas using parallel computational techniques. The physical model is composed of two parts: hydrological and geotechnical. The hydrological model receives rainfall data as dynamical input and provides the pressure head as a perturbation to the geotechnical stability model, which computes the factor of safety (FS) in probabilistic terms. The hydrological model is based on an analytical solution of an approximated form of the Richards equation under the wet-condition hypothesis, and a modelled form of hydraulic diffusivity is introduced to improve the hydrological response. The geotechnical stability model is based on an infinite slope model that takes the unsaturated soil condition into account. During the slope stability analysis the proposed model accounts for the increase in strength and cohesion due to matric suction in unsaturated soil, where the pressure head is negative. Moreover, the soil mass variation in partially saturated soil caused by water infiltration is modeled.

    The model is then inserted into a Monte Carlo simulation, to manage the typical uncertainty in the values of the input geotechnical and hydrological parameters, which is a common weak point of deterministic models. The Monte Carlo simulation manages a probability distribution of input parameters providing results in terms of slope failure probability. The developed software uses the computational power offered by multicore and multiprocessor hardware, from modern workstations to supercomputing facilities (HPC, to achieve the simulation in reasonable runtimes, compatible with civil protection real time monitoring.

    A first test of HIRESSS in three different areas is presented to evaluate the reliability of the results and the runtime performance on
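    The Monte Carlo treatment of parameter uncertainty can be sketched for the dry infinite-slope case (a simplification: HIRESSS adds the pressure-head term from its hydrological model, and all parameter values below are illustrative): cohesion and friction angle are sampled from distributions, and the failure probability is the fraction of samples with FS below one.

```python
import math, random

def failure_probability(slope_deg, z, gamma, c_mu, c_sd, phi_mu, phi_sd,
                        n=20000, rng=random):
    """Monte Carlo failure probability for a dry infinite slope (a sketch).
    FS = (c + gamma*z*cos(b)^2*tan(phi)) / (gamma*z*sin(b)*cos(b));
    cohesion c and friction angle phi are sampled from normal distributions."""
    b = math.radians(slope_deg)
    driving = gamma * z * math.sin(b) * math.cos(b)
    failures = 0
    for _ in range(n):
        c = max(rng.gauss(c_mu, c_sd), 0.0)           # cohesion, kPa
        phi = math.radians(rng.gauss(phi_mu, phi_sd)) # friction angle
        fs = (c + gamma * z * math.cos(b) ** 2 * math.tan(phi)) / driving
        if fs < 1.0:
            failures += 1
    return failures / n
```

With a mean FS just above one, the failure probability is substantial, which is exactly the information a deterministic FS value hides.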

  3. Design of transient light signal simulator based on FPGA

    Science.gov (United States)

    Kang, Jing; Chen, Rong-li; Wang, Hong

    2014-11-01

    A design scheme for a transient light signal simulator based on a Field Programmable Gate Array (FPGA) is proposed in this paper. Based on the characteristics of transient light signals and measured feature points of optical intensity signals, a fitted curve was created in MATLAB. The waveform data were then stored in a programmable memory chip, the AT29C1024, using a SUPERPRO programmer. The control logic is realized inside one EP3C16 FPGA chip. Data readout, data stream caching and a constant-current buck regulator for powering high-brightness LEDs are all controlled by the FPGA. A 12-bit multiplying CMOS digital-to-analog converter (DAC), the DAC7545, and an OPA277 amplifier are used to convert the digital signals to voltage signals. A voltage-controlled current source consisting of an NPN transistor and an operational amplifier controls the LED array dimming to simulate the transient light signal. The LM3405A, a 1 A constant-current buck regulator for powering LEDs, is used to simulate a strong background signal in space. Experimental results showed that this scheme satisfies the design requirements reliably as a transient light signal simulator.
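    The step from fitted curve to memory contents amounts to sampling the curve and quantizing it to the DAC's 12-bit range. A sketch (the pulse shape below is a hypothetical stand-in for the MATLAB-fitted curve, which the abstract does not give):

```python
import math

def build_waveform_lut(samples=256, full_scale=4095):
    """Quantize a fitted transient-light curve into a 12-bit lookup table
    for the DAC. The fast-rise/slow-decay pulse used here is a stand-in
    for the fitted curve; the table is normalized to full scale."""
    lut = []
    for i in range(samples):
        t = i / (samples - 1)
        value = (1.0 - math.exp(-20.0 * t)) * math.exp(-4.0 * t)
        lut.append(round(value * full_scale))
    vmax = max(lut)
    return [min(full_scale, v * full_scale // vmax) for v in lut]
```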

  4. An Individual-based Probabilistic Model for Fish Stock Simulation

    Directory of Open Access Journals (Sweden)

    Federico Buti

    2010-08-01

    Full Text Available We define an individual-based probabilistic model of sole (Solea solea) behaviour. The individual model is given in terms of an Extended Probabilistic Discrete Timed Automaton (EPDTA), a new formalism that is introduced in the paper and that is shown to be interpretable as a Markov decision process. A given EPDTA model can be probabilistically model-checked via a suitable translation into syntax accepted by existing model-checkers. In order to simulate the dynamics of a given population of soles in different environmental scenarios, an agent-based simulation environment is defined in which each agent implements the behaviour of the given EPDTA model. By varying the probabilities and the characteristic functions embedded in the EPDTA model it is possible to represent different scenarios and to tune the model itself by comparing the results of the simulations with real data about the sole stock in the North Adriatic sea, available from the recent project SoleMon. The simulator is presented and made available for adaptation to other species.
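A minimal sketch of the probabilistic-automaton idea underlying such a model (not the EPDTA formalism itself, whose syntax and semantics are defined in the paper): states with probability-weighted transitions, simulated step by step. The state names and transition table below are invented purely for illustration:

```python
import random

def step(state, transitions, rng):
    """One probabilistic transition: transitions[state] is a list of
    (probability, next_state) pairs; probabilities should sum to <= 1,
    with any remainder interpreted as a self-loop."""
    u = rng.random()
    acc = 0.0
    for p, nxt in transitions[state]:
        acc += p
        if u < acc:
            return nxt
    return state

def simulate(start, transitions, n_steps, seed=1):
    """Run the automaton for n_steps transitions, returning the state path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], transitions, rng))
    return path
```

An agent-based population simulation then amounts to running many such automata in parallel, each with its own random stream.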

  5. Spike library based simulator for extracellular single unit neuronal signals.

    Science.gov (United States)

    Thorbergsson, P T; Jorntell, H; Bengtsson, F; Garwicz, M; Schouenborg, J; Johansson, A

    2009-01-01

    A well defined set of design criteria is of great importance in the process of designing brain machine interfaces (BMI) based on extracellular recordings with chronically implanted micro-electrode arrays in the central nervous system (CNS). In order to compare algorithms and evaluate their performance under various circumstances, ground truth about their input needs to be present. Obtaining ground truth from real data would require optimal algorithms to be used, given that those exist. This is not possible since it relies on the very algorithms that are to be evaluated. Using realistic models of the recording situation facilitates the simulation of extracellular recordings. The simulation gives access to a priori known signal characteristics such as spike times and identities. In this paper, we describe a simulator based on a library of spikes obtained from recordings in the cat cerebellum and observed statistics of neuronal behavior during spontaneous activity. The simulator has proved to be useful in the task of generating extracellular recordings with realistic background noise and known ground truth to use in the evaluation of algorithms for spike detection and sorting.
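The simulation approach described, known spike waveforms superimposed on background noise at known times, can be sketched as follows. This is a toy stand-in rather than the authors' simulator; the Poisson spike statistics and Gaussian background noise are simplifying assumptions:

```python
import random

def simulate_recording(spike_templates, rates_hz, duration_s,
                       fs=20000, noise_sd=1.0, seed=0):
    """Toy spike-library simulator: superimpose template waveforms on
    Gaussian background noise at Poisson spike times, returning the trace
    plus the ground-truth (unit, sample_index) list -- the point of such
    simulators being that ground truth is known a priori."""
    rng = random.Random(seed)
    n = int(duration_s * fs)
    trace = [rng.gauss(0.0, noise_sd) for _ in range(n)]
    ground_truth = []
    for unit, (template, rate) in enumerate(zip(spike_templates, rates_hz)):
        t = 0.0
        while True:
            t += rng.expovariate(rate)      # Poisson inter-spike intervals
            idx = int(t * fs)
            if idx + len(template) >= n:
                break
            ground_truth.append((unit, idx))
            for k, v in enumerate(template):
                trace[idx + k] += v         # add the library waveform
    return trace, sorted(ground_truth, key=lambda x: x[1])
```

Spike detection and sorting algorithms can then be scored against the returned ground truth, exactly the evaluation use-case described in the abstract.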

  6. Image based numerical simulation of hemodynamics in an intracranial aneurysm

    Science.gov (United States)

    Le, Trung; Ge, Liang; Sotiropoulos, Fotis; Kallmes, David; Cloft, Harry; Lewis, Debra; Dai, Daying; Ding, Yonghong; Kadirvel, Ramanathan

    2007-11-01

    Image-based numerical simulations of hemodynamics in an intracranial aneurysm are carried out. The numerical solver, based on the CURVIB (curvilinear grid/immersed boundary) approach developed in Ge and Sotiropoulos, JCP 2007, is used to simulate the blood flow. A curvilinear grid system that gradually follows the curved geometry of the artery wall and consists of approximately 5M grid nodes is constructed as the background grid system, and the boundaries of the investigated artery and aneurysm are treated as immersed boundaries. The surface geometry of the aneurysm wall is reconstructed from an angiography study of an aneurysm formed on the common carotid artery (CCA) of a rabbit and discretized with triangular meshes. At the inlet a physiological flow waveform is specified, and direct numerical simulations are used to simulate the blood flow. Very rich vortical dynamics is observed within the aneurysm: a ring-like vortex sheds from the proximal side of the aneurysm, develops and impinges onto the distal side as the flow evolves, and breaks up into smaller vortices later in the cardiac cycle. This work was supported in part by the University of Minnesota Supercomputing Institute.

  7. Positron source investigation by using CLIC drive beam for Linac-LHC based e{sup +}p collider

    Energy Technology Data Exchange (ETDEWEB)

    Arikan, Ertan [Department of Physics, Faculty of Art and Sciences, Nigde University, Nigde (Turkey); Aksakal, Huesnue, E-mail: aksakal@cern.ch [Department of Physics, Faculty of Art and Sciences, Nigde University, Nigde (Turkey)

    2012-08-11

    Three different methods, namely conventional, Compton backscattering and undulator based, are employed for the production of positrons. The positrons to be used for e{sup +}p collisions in a Linac-LHC (Large Hadron Collider) based collider have been studied. The number of produced positrons as a function of drive beam energy and optimum target thickness has been determined. Three different target materials have been investigated for the three methods: W{sub 75}-Ir{sub 25}, W{sub 75}-Ta{sub 25} and W{sub 75}-Re{sub 25}. The number of positrons has been estimated with the FLUKA simulation code. The produced positrons are then passed through an adiabatic matching device (AMD) and the capture efficiency is determined. Finally, the e{sup +}p collider luminosity corresponding to each of the methods mentioned above has been calculated with the CAIN code.

  8. Return Migration After Brain Drain: An Agent Based Simulation Approach

    CERN Document Server

    Biondo, A E; Rapisarda, A

    2012-01-01

    The Brain Drain phenomenon is particularly heterogeneous and is characterized by peculiar specifications. It influences the economic fundamentals of both the country of origin and the host one in terms of human capital accumulation. Here, the brain drain is considered from a microeconomic perspective: more precisely, we focus on the individual rational decision to return, referring it to the social capital owned by the worker. The presented model, restricted to the case of academic personnel, compares utility levels to justify the agents' migration conduct and to simulate several scenarios with a NetLogo agent-based model. In particular, we developed a simulation framework based on two fundamental individual features, i.e. risk aversion and initial expectation, which characterize the dynamics of different agents according to the random evolution of their personal social networks. Our main result is that, according to the value of risk aversion and initial expectation, the probability of return migration depends on...

  9. Disaster Rescue Simulation based on Complex Adaptive Theory

    Directory of Open Access Journals (Sweden)

    Feng Jiang

    2013-05-01

    Full Text Available Disaster rescue is one of the key measures of disaster reduction. The rescue process is a complex process characterized by large scale, complicated structure and non-linearity, and it is hard to describe and analyze with traditional methods. Based on complex adaptive theory, this paper analyzes the complex adaptability of the rescue process in terms of seven features: aggregation, nonlinearity, mobility, diversity, tagging, internal models and building blocks. With the support of the Repast platform, an agent-based model including rescue agents and victim agents is proposed. Moreover, two simulations with different parameters are employed to examine the feasibility of the model. The proposed model is shown to be efficient in dealing with disaster rescue simulation and can provide a reference for decision making.

  10. Simulation-based education and performance assessments for pediatric surgeons.

    Science.gov (United States)

    Barsness, Katherine

    2014-08-01

    Education in the knowledge, skills, and attitudes necessary for a surgeon to perform at an expert level in the operating room, and beyond, must address all potential cognitive and technical performance gaps, professionalism and personal behaviors, and effective team communication. Educational strategies should also seek to replicate the stressors and distractions that might occur during a high-risk operation or critical care event. Finally, education cannot remain fixed in an apprenticeship model of "See one, do one, teach one," whereby patients are exposed to the risk of harm inherent to any learning curve. The majority of these educational goals can be achieved with the addition of simulation-based education (SBE) as a valuable adjunct to traditional training methods. This article will review relevant principles of SBE, explore currently available simulation-based educational tools for pediatric surgeons, and finally make projections for the future of SBE and performance assessments for pediatric surgeons.

  11. Potential application of particle based simulations in reservoir security management

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In order to model the movement process in case of risks such as dam collapse and coastal inundation, particle-based simulation methods, including the discrete element method and smoothed particle hydrodynamics, which have specific advantages in modeling complex three-dimensional environmental fluid and particulate flows, are adopted as an effective way to illustrate environmental applications possibly happening in the real world. The theory of these methods and their relative advantages compared with tradi...

  12. SPH-based simulation of multi-material asteroid collisions

    CERN Document Server

    Maindl, Thomas I; Speith, Roland; Süli, Áron; Forgács-Dajka, Emese; Dvorak, Rudolf

    2013-01-01

    We give a brief introduction to smoothed particle hydrodynamics methods for continuum mechanics. Specifically, we present our 3D SPH code to simulate and analyze collisions of asteroids consisting of two types of material: basaltic rock and ice. We consider effects like brittle failure, fragmentation, and merging in different impact scenarios. After validating our code against previously published results we present first collision results based on measured values for the Weibull flaw distribution parameters of basalt.

  13. Data-base driven graphics animation and simulation system

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, H.D.; Curtis, J.N.

    1985-01-01

    Most attempts at the graphics animation of data involve rather large and expensive development of problem-specific systems. This paper discusses a general graphics animation system created to be a tool for the design of a wide variety of animated simulations. By using relational data base storage of graphics and control information, considerable flexibility in the design and development of animated displays is achieved.

  14. T-Algorithm-Based Logic Simulation on Distributed Systems

    OpenAIRE

    Sundaram, S; Patnaik, LM

    1992-01-01

    Increase in the complexity of VLSI digital circuit design demands faster logic simulation techniques than those currently available. One of the ways of speeding up existing logic simulation algorithms is to exploit the inherent parallelism in the sequential version. In this paper, we explore the possibility of mapping a T-algorithm based logic simulation algorithm onto a cluster of workstations interconnected by an Ethernet. The set of gates at a particular level is partitioned by the hias...

  15. Constraint-based Ground contact handling in Humanoid Robotics Simulation

    OpenAIRE

    Martin Moraud, Eduardo; Hale, Joshua G.; Cheng, Gordon

    2008-01-01

    International audience; This paper presents a method for resolving contact in dynamic simulations of articulated figures. It is intended for humanoids with polygonal feet and incorporates Coulomb friction exactly. The proposed technique is based on a constraint selection paradigm. Its implementation offers an exact mode which guarantees correct behavior, as well as an efficiency optimized mode which sacrifices accuracy for a tightly bounded computational burden, thus facilitating batch simula...

  16. Designing and simulating a nitinol-based micro ejector

    Directory of Open Access Journals (Sweden)

    Yesid Mora Sierra

    2012-04-01

    Full Text Available This paper describes the design and simulation of a pico-droplet ejector. The actuation system was based on the shape memory effect of two interconnected nitinol membranes. The ejected volume was 12 pL and the device operated at 30°C to 64°C. The ejection excitation voltage was 12 V and the ejection energy required for actuator operation was 26 µJ per drop. Such pico-liter ejectors could have applications in the manufacturing, lubricating and cooling of integrated circuits.

  17. CONSTRUCTION COST INTEGRATED CONTROL BASED ON COMPUTER SIMULATION

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Construction cost control is a complex piece of systems engineering. The traditional controlling method cannot control the construction cost dynamically and in advance because of its hysteresis. This paper proposes a computer simulation based construction cost integrated control method, which combines cost with PERT systematically, so that the construction cost can be predicted and optimized systematically and effectively. The new method overcomes the hysteresis of the traditional systems and is a distinct improvement over them in effect and practicality.
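The combination of cost estimation with PERT can be illustrated by the classic three-point estimate; this sketch is generic PERT arithmetic, not the paper's integrated control method, and applies equally to activity durations or costs:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT three-point estimate: expected value and variance
    under the usual beta-distribution approximation."""
    expected = (optimistic + 4.0 * most_likely + pessimistic) / 6.0
    variance = ((pessimistic - optimistic) / 6.0) ** 2
    return expected, variance

def project_estimate(activities):
    """Sum expectations and variances along a (critical) path of
    activities, each given as an (optimistic, most_likely, pessimistic)
    triple; independence of activities is assumed."""
    e = sum(pert_estimate(*a)[0] for a in activities)
    v = sum(pert_estimate(*a)[1] for a in activities)
    return e, v
```

Running many randomized draws around these estimates is one simple way a computer simulation can predict the cost distribution rather than a single deterministic value.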

  18. Learning from physics-based earthquake simulators: a minimal approach

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and some simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists toward ever more earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant to describing the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clusters, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.
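The fault-interaction ingredient named above, the Coulomb Failure Function, reduces in its simplest form to a one-line stress calculation; the sign convention and effective friction coefficient below are common textbook choices, not values taken from this paper:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in the Coulomb Failure Function on a receiver fault:
    dCFF = d_tau + mu' * d_sigma_n, with shear stress resolved in the
    slip direction and normal stress tension-positive. A positive dCFF
    brings the fault closer to failure; a negative one relaxes it."""
    return d_shear + mu_eff * d_normal

def advanced_faults(stress_changes, mu_eff=0.4):
    """Indices of receiver faults pushed toward failure by an event,
    given (d_shear, d_normal) pairs for each fault."""
    return [i for i, (dt, dn) in enumerate(stress_changes)
            if coulomb_stress_change(dt, dn, mu_eff) > 0.0]
```

In a simulator, each synthetic rupture updates the CFF on all other faults, which is how clustering and synchronization can emerge from the interaction term alone.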

  19. MRISIMUL: a GPU-based parallel approach to MRI simulations.

    Science.gov (United States)

    Xanthis, Christos G; Venetis, Ioannis E; Chalkias, A V; Aletras, Anthony H

    2014-03-01

    A new step-by-step comprehensive MR physics simulator (MRISIMUL) of the Bloch equations is presented. The aim was to develop a magnetic resonance imaging (MRI) simulator that makes no assumptions with respect to the underlying pulse sequence and also allows for complex large-scale analysis on a single computer without requiring simplifications of the MRI model. We hypothesized that such a simulation platform could be developed with parallel acceleration of the executable core within the graphics processing unit (GPU) environment. MRISIMUL integrates realistic aspects of the MRI experiment from signal generation to image formation and solves the entire complex problem for densely spaced isochromats and for a densely spaced time axis. The simulation platform was developed in MATLAB, whereas the computationally demanding core services were developed in CUDA-C. The MRISIMUL simulator imaged three different computer models: a user-defined phantom, a human brain model and a human heart model. The high computational power of GPU-based simulations was compared against other computer configurations. A speedup of about 228 times was achieved when compared to serially executed C code on the CPU, whereas a speedup of between 31 and 115 times was achieved when compared to OpenMP parallel C code on the CPU, depending on the number of threads used in multithreading (2-8 threads). The high performance of MRISIMUL allows its application in large-scale analysis and can bring the computational power of a supercomputer or a large computer cluster to a single-GPU personal computer.
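The core computation such a simulator parallelizes, advancing the Bloch equations independently for many isochromats, can be sketched with a naive explicit Euler step. The real MRISIMUL core is CUDA-C and certainly more sophisticated; the rotating-frame field inputs, units, and unit equilibrium magnetization below are assumptions:

```python
import numpy as np

def bloch_step(M, dt, b1x, b1y, dB0, T1, T2, gamma=2 * np.pi * 42.58e6):
    """One explicit Euler step of the Bloch equations in the rotating
    frame for N isochromats at once (M has shape (N, 3)); dB0 is the
    per-isochromat off-resonance field (T), b1x/b1y the RF field (T).
    Equilibrium magnetization is taken as 1. The per-isochromat
    independence is what maps so naturally onto GPU threads."""
    Bz = dB0                              # off-resonance, shape (N,)
    dM = np.empty_like(M)
    dM[:, 0] = gamma * (M[:, 1] * Bz - M[:, 2] * b1y) - M[:, 0] / T2
    dM[:, 1] = gamma * (M[:, 2] * b1x - M[:, 0] * Bz) - M[:, 1] / T2
    dM[:, 2] = gamma * (M[:, 0] * b1y - M[:, 1] * b1x) - (M[:, 2] - 1.0) / T1
    return M + dt * dM
```

Looping this step over a densely spaced time axis, with b1x/b1y taken from the pulse sequence waveform, is exactly the "no pulse-sequence assumptions" strategy the abstract describes.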

  20. COEL: A Cloud-based Reaction Network Simulator

    Directory of Open Access Journals (Sweden)

    Peter eBanda

    2016-04-01

    Full Text Available Chemical Reaction Networks (CRNs are a formalism to describe the macroscopic behavior of chemical systems. We introduce COEL, a web- and cloud-based CRN simulation framework that does not require a local installation, runs simulations on a large computational grid, provides reliable database storage, and offers a visually pleasing and intuitive user interface. We present an overview of the underlying software, the technologies, and the main architectural approaches employed. Some of COEL's key features include ODE-based simulations of CRNs and multicompartment reaction networks with rich interaction options, a built-in plotting engine, automatic DNA-strand displacement transformation and visualization, SBML/Octave/Matlab export, and a built-in genetic-algorithm-based optimization toolbox for rate constants. COEL is an open-source project hosted on GitHub (http://dx.doi.org/10.5281/zenodo.46544), which allows interested research groups to deploy it on their own server. Regular users can simply use the web instance at no cost at http://coel-sim.org. The framework is ideally suited for collaborative use in both research and education.

  1. Performance and perspectives of the diamond based Beam Condition Monitor for beam loss monitoring at CMS

    CERN Document Server

    AUTHOR|(CDS)2080862

    2015-01-01

    At CMS, a beam loss monitoring system is operated to protect the silicon detectors from the high particle rates arising from intense beam loss events. As detectors, poly-crystalline CVD diamond sensors are placed around the beam pipe at several locations inside CMS. In case of extremely high detector currents, the LHC beams are automatically extracted from the LHC rings. Diamond is the detector material of choice due to its radiation hardness. Predictions of the detector lifetime were made based on FLUKA Monte Carlo simulations and irradiation test results from the RD42 collaboration, which attested no significant radiation damage over several years. During the LHC operational Run 1 (2010-2013), the detector efficiencies were monitored. A signal decrease about 50 times stronger than expected was observed in the in-situ radiation environment. Electric field deformations due to charge carriers trapped in radiation-induced lattice defects are responsible for this signal decrease. This so-called polarizat...

  2. A Simulation Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Moore, Michael; Bielski, Paul; Crues, Edwin Z.

    2017-01-01

    This study was the first in a series of planned tests to use physics-based subsystem simulations to investigate the interactions between a spacecraft's crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation models the life support system of a deep space habitat. It contains models of an environmental control and life support system, an electrical power system, an active thermal control system, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the subsystems; 2) a mission control center interface with data transport delays of up to 15 minutes each way; and 3) a real-time simulation test conductor interface used to insert subsystem malfunctions and observe the interactions between the crew, ground, and simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission. The NEEMO crew and ground support team performed a number of relevant deep space mission scenarios that included both nominal activities and activities with system malfunctions. While this initial test sequence was focused on test infrastructure and procedures development, the data collected in the study already indicate that long communication delays have notable impacts on the operation of deep space systems. For future human missions beyond cis-lunar space, NASA will need to design systems and support tools to meet these challenges: to train the crew to handle critical malfunctions on their own, to predict malfunctions, and to assist with vehicle operations. Subsequent, more detailed and involved studies will be conducted to continue advancing NASA's understanding of space systems operations across long communication delays.

  3. [Simulation of TDLAS direct absorption based on HITRAN database].

    Science.gov (United States)

    Qi, Ru-birn; He, Shu-kai; Li, Xin-tian; Wang, Xian-zhong

    2015-01-01

    Simulating the direct absorption TDLAS spectrum can help in comprehending the absorption process and in understanding the influence of each physical parameter on the absorption signal. First, the basic theory and algorithm of direct absorption TDLAS are studied and analyzed thoroughly, giving the expressions and calculation steps of parameters based on the Lambert-Beer law, such as line intensity, absorption cross sections, concentration, line shape and gas total partition functions. The process of direct absorption TDLAS is simulated using MATLAB programs based on the HITRAN spectral database, with which the absorption under given temperature, pressure, concentration and other conditions was calculated. Water vapor is selected as the target gas, and its absorption under each line shape was simulated. The results were compared with those of the commercial simulation software Hitran-PC, which showed that the deviation under the Lorentz line shape is less than 0.5%, that under the Gauss line shape is less than 2.5%, and that under the Voigt line shape is less than 1%. This verifies that the algorithm and results of this work are correct and accurate. The absorption of H2O in the v2 + v3 band under different pressures and temperatures is also simulated. In the low-pressure range, Doppler broadening is dominant, so the line width changes little with varying pressure while the line peak increases with rising pressure. In the high-pressure range, collision broadening is dominant, so the line width broadens with increasing pressure while the line peak approaches a constant value with rising pressure. Finally, the temperature correction curve in atmosphere detection is also given. The results of this work offer a reference and instructions for the application of direct absorption TDLAS.
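The Lambert-Beer calculation described above can be sketched with area-normalized Lorentz (collision) and Gauss (Doppler) line shapes, the two limits discussed in the abstract. This is a generic sketch, not the paper's MATLAB program, and the line parameters used in testing are invented rather than HITRAN values:

```python
import math

def lorentz(nu, nu0, hwhm):
    """Lorentzian (collision-broadened) line shape, unit area in nu."""
    return (hwhm / math.pi) / ((nu - nu0) ** 2 + hwhm ** 2)

def gauss(nu, nu0, hwhm):
    """Gaussian (Doppler-broadened) line shape, unit area in nu;
    hwhm is the half width at half maximum."""
    ln2 = math.log(2.0)
    return math.sqrt(ln2 / math.pi) / hwhm * math.exp(-ln2 * ((nu - nu0) / hwhm) ** 2)

def transmission(line_strength, number_density, path_cm, shape_value):
    """Lambert-Beer law: I/I0 = exp(-S * N * L * phi(nu)), where
    shape_value is the line shape phi evaluated at the frequency of
    interest. Units must be chosen consistently (e.g. HITRAN's
    cm^-1/(molecule cm^-2) for S)."""
    alpha = line_strength * number_density * path_cm * shape_value
    return math.exp(-alpha)
```

A Voigt profile, the convolution of the two shapes, covers the intermediate-pressure regime; libraries such as SciPy provide it directly.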

  4. Ship electric propulsion simulator based on networking technology

    Science.gov (United States)

    Zheng, Huayao; Huang, Xuewu; Chen, Jutao; Lu, Binquan

    2006-11-01

    In line with current shipbuilding trends, a novel electric propulsion simulator (EPS) has been developed in the Marine Simulation Center of SMU. The architecture, software functions and FCS network technology of the EPS and the integrated power system (IPS) are described. For the ship's POD propeller, a dedicated physical model was built. The POD power is supplied from the simulated 6.6 kV medium-voltage main switchboard, and its control can be realized in local or remote mode. Through the LAN, the simulated feature information of the EPS is passed to the physical POD model, which reflects the real thruster working status under different sea conditions. The software includes a vessel-propeller mathematical module, the thruster control system, distribution and emergency integrated management, a double closed-loop control system, vessel still-water resistance and dynamics software, and the instructor's main control software. The monitoring and control system is realized by a real-time data collection system and CAN bus technology. During construction, most devices, such as monitor panels and intelligent meters, were developed in the lab based on embedded microcomputer systems with CAN interfaces to link to the network. They have also been used successfully in practice and should be suitable for the future demands of ship digitalization.

  5. Physically-based, Hydrologic Simulations Driven by Three Precipitation Products

    Science.gov (United States)

    Chintalapudi, S.; Sharif, H. O.; Yeggina, S.; El Hassan, A.

    2011-12-01

    This study evaluates the model-simulated stream discharge over the Guadalupe River basin in central Texas driven by three precipitation products: the Guadalupe-Blanco River Authority (GBRA) rain gauge network, the Next Generation Weather Radar (NEXRAD) Stage III precipitation product, and the Tropical Rainfall Measurement Mission (TRMM) 3B42 product. Focus is on results from the Upper Guadalupe River sub-basin. This sub-basin is more prone to flooding due to its geological properties (thin soils, exposed bedrock, and sparse vegetation) and the impact of the Balcones Escarpment on the moisture coming from the Gulf of Mexico. The physically based, distributed-parameter Gridded Surface Subsurface Hydrologic Analysis (GSSHA) hydrologic model was used to simulate the June 2002 flooding event. Simulations driven by NEXRAD Stage III 15-min precipitation yielded better results, with low RMSE (88.3%), high NSE (0.6), high R2 (0.73), low RSR (0.63) and low PBIAS (-17.3%), compared to simulations driven by the other products.
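The goodness-of-fit statistics quoted above (NSE, PBIAS, RSR) have standard definitions that can be sketched directly. The sign convention for PBIAS varies across the literature, so the one used here (positive means underestimation) is an assumption:

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def pbias(obs, sim):
    """Percent bias of the simulated series relative to observations."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

def rsr(obs, sim):
    """RMSE standardized by the standard deviation of the observations."""
    mean_obs = sum(obs) / len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))
    sd = math.sqrt(sum((o - mean_obs) ** 2 for o in obs) / len(obs))
    return rmse / sd
```

With these definitions, the quoted NSE of 0.6 and PBIAS of -17.3% would indicate a reasonable fit with some average overestimation of discharge.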

  6. Particle-based simulations of self-motile suspensions

    Science.gov (United States)

    Hinz, Denis F.; Panchenko, Alexander; Kim, Tae-Yeon; Fried, Eliot

    2015-11-01

    A simple model for simulating flows of active suspensions is investigated. The approach is based on dissipative particle dynamics. While the model is potentially applicable to a wide range of self-propelled particle systems, the specific class of self-motile bacterial suspensions is considered as a modeling scenario. To mimic the rod-like geometry of a bacterium, two dissipative particle dynamics particles are connected by a stiff harmonic spring to form an aggregate dissipative particle dynamics molecule. Bacterial motility is modeled through a constant self-propulsion force applied along the axis of each such aggregate molecule. The model accounts for hydrodynamic interactions between self-propelled agents through the pairwise dissipative interactions conventional to dissipative particle dynamics. Numerical simulations are performed using a customized version of the open-source package LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator). Detailed studies of the influence of agent concentration, pairwise dissipative interactions, and Stokes friction on the statistics of the system are provided. The simulations are used to explore the influence of hydrodynamic interactions in active suspensions. For high agent concentrations in combination with dominating pairwise dissipative forces, strongly correlated motion patterns and fluid-like spectral distributions of kinetic energy are found. In contrast, systems dominated by Stokes friction exhibit weaker spatial correlations of the velocity field. These results indicate that hydrodynamic interactions may play an important role in the formation of spatially extended structures in active suspensions.
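The two-bead self-propelled dimer described above, a stiff harmonic spring plus a constant propulsion force along the rod axis, can be sketched in 2D as simple force routines. The parameters and the equal split of the propulsion force between beads are assumptions; the actual study uses LAMMPS in 3D with the full DPD pair forces:

```python
import math

def spring_forces(r1, r2, k, l0):
    """Stiff harmonic spring holding the two DPD beads at separation l0.
    Returns the force on bead 1 and on bead 2 (equal and opposite)."""
    dx, dy = r2[0] - r1[0], r2[1] - r1[1]
    d = math.hypot(dx, dy)
    f = k * (d - l0)              # > 0 pulls the beads back together
    ux, uy = dx / d, dy / d
    return (f * ux, f * uy), (-f * ux, -f * uy)

def propulsion_forces(r1, r2, f0):
    """Constant self-propulsion of total magnitude f0 along the rod axis
    (bead 1 toward bead 2), split equally between the two beads."""
    dx, dy = r2[0] - r1[0], r2[1] - r1[1]
    d = math.hypot(dx, dy)
    ux, uy = dx / d, dy / d
    return (0.5 * f0 * ux, 0.5 * f0 * uy), (0.5 * f0 * ux, 0.5 * f0 * uy)
```

In a full DPD integration these forces would be added to the conservative, dissipative, and random pair forces before each velocity-Verlet step.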

  7. Virtual tryout planning in automotive industry based on simulation metamodels

    Science.gov (United States)

    Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.

    2016-11-01

    Deep-drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations of process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance agreement with the real process conditions, the material data are acquired through a variety of experiments. Furthermore, the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, the tool tryout time can be shortened by transferring the virtual knowledge contained in the metamodels to the optimization of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time and to recognize complex relationships.

  8. Module-based multiscale simulation of angiogenesis in skeletal muscle

    Directory of Open Access Journals (Sweden)

    Mac Gabhann Feilim

    2011-04-01

    Full Text Available Abstract Background Mathematical modeling of angiogenesis has been gaining momentum as a means to shed new light on the biological complexity underlying blood vessel growth. A variety of computational models have been developed, each focusing on different aspects of the angiogenesis process and occurring at different biological scales, ranging from the molecular to the tissue levels. Integration of models at different scales is a challenging and currently unsolved problem. Results We present an object-oriented module-based computational integration strategy to build a multiscale model of angiogenesis that links currently available models. As an example case, we use this approach to integrate modules representing microvascular blood flow, oxygen transport, vascular endothelial growth factor transport and endothelial cell behavior (sensing, migration and proliferation. Modeling methodologies in these modules include algebraic equations, partial differential equations and agent-based models with complex logical rules. We apply this integrated model to simulate exercise-induced angiogenesis in skeletal muscle. The simulation results compare capillary growth patterns between different exercise conditions for a single bout of exercise. Results demonstrate how the computational infrastructure can effectively integrate multiple modules by coordinating their connectivity and data exchange. Model parameterization offers simulation flexibility and a platform for performing sensitivity analysis. Conclusions This systems biology strategy can be applied to larger scale integration of computational models of angiogenesis in skeletal muscle, or other complex processes in other tissues under physiological and pathological conditions.

  9. Atomistic simulation of Voronoi-based coated nanoporous metals

    Science.gov (United States)

    Onur Yildiz, Yunus; Kirca, Mesut

    2017-02-01

In this study, a new method for the generation of periodic atomistic models of coated and uncoated nanoporous metals (NPMs) is presented, and the thermodynamic stability of coated nanoporous structures is examined. The proposed method is mainly based on the Voronoi tessellation technique, which provides the ability to control the cross-sectional dimension and slenderness of ligaments as well as the thickness of coating. Using this method, molecular dynamics (MD) simulations of randomly structured NPMs with coating can be performed efficiently in order to investigate their physical characteristics. In this context, to demonstrate the functionality of the method, sample atomistic models of Au/Pt NPMs are generated and the effects of coating and porosity on the thermodynamic stability are investigated by using MD simulations. In addition, uniaxial tensile loading simulations are performed via the MD technique to validate the nanoporous models by comparing the effective Young’s modulus values with results from the literature. Based on the results, while it is demonstrated that coating the nanoporous structures slightly decreases the structural stability, causing atomistic configurational changes, it is also shown that the stability of the atomistic models is higher at lower porosities. Furthermore, adaptive common neighbour analysis is performed to identify the stabilized atomistic structure after the coating process, which provides direct insight into the mechanical behaviour of coated nanoporous structures.

  10. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation-based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. The effect sizes with 95% confidence intervals (CI) were calculated for outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) or success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT in airway management training.
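The SMD values quoted in this record follow the usual meta-analytic definition (Cohen's d with a pooled standard deviation); a minimal sketch, with group numbers that are hypothetical rather than taken from the review:

```python
import math

def standardized_mean_difference(m1, s1, n1, m2, s2, n2):
    """Cohen's d: difference in group means divided by the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example: SBT group scores 80 +/- 10 (n=30), NSBT group 77 +/- 10 (n=30)
d = standardized_mean_difference(80.0, 10.0, 30, 77.0, 10.0, 30)
```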

  11. Radiation background simulation and verification at the LHC and its upgrades.

    CERN Document Server

    Dawson, I; The ATLAS collaboration

    2012-01-01

The high collision rates at the new energy regime of the LHC give rise to unprecedented radiation environments, especially in the inner regions of the experiments. Deleterious effects of radiation on the experiments include: damage to detectors and electronics; fake backgrounds in the selection and reconstruction of interesting physics events; single event upsets causing disruption in the data readout; and radio-activation of components, making access for maintenance difficult. High-fidelity codes such as FLUKA and GEANT4 are necessary for simulating the complex radiation backgrounds in detail. The results can then be used for predicting detector system behaviour and performance over the lifetime of the project. In this talk the following will be covered: first, the Monte Carlo tools used to simulate the radiation backgrounds will be discussed, which include the transport codes FLUKA and GEANT4, as well as the collision event generators PHOJET and PYTHIA. Examples of the predictions at the ATLAS experiment will be...

  12. Monte-Carlo simulation-based statistical modeling

    CERN Document Server

    Chen, John

    2017-01-01

    This book brings together expert researchers engaged in Monte-Carlo simulation-based statistical modeling, offering them a forum to present and discuss recent issues in methodological development as well as public health applications. It is divided into three parts, with the first providing an overview of Monte-Carlo techniques, the second focusing on missing data Monte-Carlo methods, and the third addressing Bayesian and general statistical modeling using Monte-Carlo simulations. The data and computer programs used here will also be made publicly available, allowing readers to replicate the model development and data analysis presented in each chapter, and to readily apply them in their own research. Featuring highly topical content, the book has the potential to impact model development and data analyses across a wide spectrum of fields, and to spark further research in this direction.
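As a minimal example of the Monte-Carlo techniques surveyed in the book's first part (a generic sketch, not drawn from the book itself), the classic estimate of pi by uniform sampling:

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling points uniformly in the unit square and
    counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random()**2 + rng.random()**2 <= 1.0)
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

The standard error shrinks as 1/sqrt(n), which is the trade-off at the heart of every simulation-based statistical method the book covers.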

  13. Hierarchy-Based Team Software Process Simulation Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

According to the characteristics of the Team Software Process (TSP), a hierarchy-based model is adopted that combines a discrete event model with a system dynamics model. This model represents TSP in the form of two levels: the inner level embodies the continuity of the software process, while the outer level embodies the software development process by phases. The structure and principle of the model are explained in detail, and a formal description of the model is offered. Finally, an example is presented to demonstrate the simulation process and results. This model can simulate the team software process from various angles, and supervise and predict the software process. It can also make the management of software development more scientific and improve software quality.

  14. Internet Based Simulations of Debris Dispersion of Shuttle Launch

    Science.gov (United States)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

Because the debris dispersion model is heterogeneous and interrelated with various factors, 3D graphics combined with physical models are useful in understanding the complexity of launch and range operations. Modeling and simulation in this area mainly focuses on orbital dynamics and range safety concepts, including destruct limits, telemetry and tracking, and population risk. Particle explosion modeling is the process of simulating an explosion by breaking the rocket into many pieces. The particles are tracked through their motion using the laws of physics until they eventually come to rest. The size of the footprint indicates the type of explosion and the distribution of the particles. The shuttle launch and range operations in this paper are discussed based on the operations of the Kennedy Space Center, Florida, USA. Java 3D graphics provides geometric and visual content with suitable modeling behaviors of Shuttle launches.

  15. Some Results on Ethnic Conflicts Based on Evolutionary Game Simulation

    CERN Document Server

    Qin, Jun; Wu, Hongrun; Liu, Yuhang; Tong, Xiaonian; Zheng, Bojin

    2014-01-01

The force of ethnic separatism, essentially originating from the negative effects of ethnic identity, is damaging the stability and harmony of multiethnic countries. In order to eliminate the foundation of ethnic separatism and set up a harmonious ethnic relationship, some scholars have proposed a viewpoint: ethnic harmony could be promoted by popularizing civic identity. However, this viewpoint is discussed only from a philosophical perspective and still lacks the support of scientific evidence. Because ethnic groups and ethnic identity are products of evolution, and ethnic identity is the parochialism strategy under the perspective of game theory, this paper proposes an evolutionary game simulation model to study the relationship between civic identity and ethnic conflict based on evolutionary game theory. The simulation results indicate that: 1) the ratio of individuals with civic identity has a positive association with the frequency of ethnic conflicts; 2) ethnic conflict will not die out by killing all ethni...

  16. Edge detection based on Hodgkin-Huxley neuron model simulation.

    Science.gov (United States)

    Yedjour, Hayat; Meftah, Boudjelal; Lézoray, Olivier; Benyettou, Abdelkader

    2017-04-03

In this paper, we propose a spiking neural network model for edge detection in images. The proposed model is biologically inspired by the mechanisms employed by natural vision systems, more specifically by the function fulfilled by simple cells of the human primary visual cortex, which are selective for orientation. Several aspects of this model are studied according to three characteristics: a feedforward spiking neural structure; a conductance-based Hodgkin-Huxley neuron model; and a Gabor receptive field structure. A visualized map is generated using the firing rate of neurons, representing the orientation map of the visual cortex area. We have simulated the proposed model on different images and obtained successful computer simulation results. For comparison, we have chosen five methods for edge detection. We finally evaluate and compare the performance of our model on contour detection using a public dataset of natural images with associated contour ground truths. Experimental results show the ability and high performance of the proposed network model.
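The conductance-based Hodgkin-Huxley machinery at the core of such a model can be sketched as follows (standard squid-axon parameters and a plain forward-Euler integrator; the Gabor receptive fields and firing-rate read-out of the paper are omitted, so this is only the single-neuron building block):

```python
import math

# Standard Hodgkin-Huxley parameters (mV, ms, uF/cm^2, mS/cm^2)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def alpha_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * math.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * math.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * math.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + math.exp(-(V + 35) / 10))

def simulate(I_ext, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the HH equations; returns the membrane trace."""
    V = -65.0
    n, m, h = 0.317, 0.053, 0.596  # approximate resting-state gating values
    trace = []
    for _ in range(int(t_max / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)   # sodium current
        I_K = g_K * n**4 * (V - E_K)          # potassium current
        I_L = g_L * (V - E_L)                 # leak current
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        trace.append(V)
    return trace
```

In an edge detector of this kind, the external current `I_ext` would be driven by the Gabor-filtered image intensity, and the spike count over the window gives the firing rate plotted in the orientation map.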

  17. Research on a distributed simulation based on a unified interface

    Institute of Scientific and Technical Information of China (English)

    孙知信; 王汝传; 王绍棣; 张钦

    2004-01-01

As the scale of simulation systems increases rapidly, distributed interactive simulation (DIS) faces the problem of system scalability. One of the main factors that influences DIS scalability is the lack of standard interface definition methods as the number of system modules increases. This paper presents a method of defining a unified interface among different models. A DIS development platform based on it is then established, after thorough study of the causes of DIS scalability problems. Finally, we demonstrate the method by realization of an experimental DIS system in a LAN environment. It validates that practical system development can be carried out rapidly and effectively on this DIS development platform. The experimental results show that the unified interface is general-purpose for the design and implementation of DIS systems.

  18. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in analysis of the job shop type of manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in process), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down to events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.

  19. Pump-stopping water hammer simulation based on RELAP5

    Science.gov (United States)

    Yi, W. S.; Jiang, J.; Li, D. D.; Lan, G.; Zhao, Z.

    2013-12-01

RELAP5 was originally designed to analyze complex thermal-hydraulic interactions that occur during either postulated large or small loss-of-coolant accidents in PWRs. However, as development continued, the code was expanded to include many of the transient scenarios that might occur in thermal-hydraulic systems. The fast deceleration of the liquid results in high pressure surges, as kinetic energy is transformed into potential energy, which leads to a temporary pressure increase. This phenomenon is called water hammer. Generally, water hammer can occur in any thermal-hydraulic system, and it is extremely dangerous when the pressure surges become considerably high. If this happens, and the pressure exceeds the critical pressure that the pipe or the fittings along the pipeline can withstand, it will result in the failure of the whole pipeline's integrity. The purpose of this article is to introduce RELAP5 for the simulation and analysis of water hammer situations. Based on the RELAP5 code manuals and some related documents, the authors use RELAP5 to set up an example of a water-supply system fed by an impeller pump and simulate the phenomena of pump-stopping water hammer. Through the simulation of the sample case and the subsequent analysis of the results, we can gain a better understanding of water hammer as well as of the suitability of the RELAP5 code for the water-hammer field. In the meantime, by comparing the results of the RELAP5-based model with those of other fluid-transient analysis software, say, PIPENET, the authors draw some conclusions about the peculiarities of RELAP5 when transplanted into water-hammer research and offer several modelling tips for using the code to simulate water-hammer-related cases.
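A first-order estimate of the surge magnitude described in this record is the classic Joukowsky relation, dp = rho * a * dv (a textbook estimate, not a computation from the paper; the numbers below are illustrative):

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Joukowsky estimate of the pressure rise (Pa) for an instantaneous
    change in flow velocity: dp = rho * a * dv."""
    return rho * wave_speed * delta_v

# Water (1000 kg/m^3), typical steel-pipe pressure-wave speed ~1200 m/s,
# and a 2 m/s flow brought to rest by a sudden pump stop
dp = joukowsky_surge(1000.0, 1200.0, 2.0)  # -> 2.4 MPa
```

A full code such as RELAP5 resolves the transient in time and space; the Joukowsky value is the upper bound commonly used to sanity-check such simulations.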

  20. Zero-Gravity Locomotion Simulators: New Ground-Based Analogs for Microgravity Exercise Simulation

    Science.gov (United States)

    Perusek, Gail P.; DeWitt, John K.; Cavanagh, Peter R.; Grodsinsky, Carlos M.; Gilkey, Kelly M.

    2007-01-01

Maintaining health and fitness in crewmembers during space missions is essential for preserving performance for mission-critical tasks. NASA's Exercise Countermeasures Project (ECP) provides space exploration exercise hardware and monitoring requirements that lead to devices that are reliable, meet medical, vehicle, and habitat constraints, and use minimal vehicle and crew resources. ECP will also develop and validate efficient exercise prescriptions that minimize the daily time needed for completion of exercise yet maximize performance for mission activities. In meeting these mission goals, NASA Glenn Research Center (Cleveland, OH, USA), in collaboration with the Cleveland Clinic (Cleveland, Ohio, USA), has developed a suite of zero-gravity locomotion simulators and associated technologies to address the need for ground-based test analog capability for simulating in-flight (microgravity) and surface (partial-gravity) exercise to advance the health and safety of astronaut crews and the next generation of space explorers. Various research areas can be explored. These include improving crew comfort during exercise, and understanding joint kinematics and muscle activation pattern differences relative to external loading mechanisms. In addition, exercise protocol and hardware optimization can be investigated, along with characterizing system dynamic response and the physiological demand associated with advanced exercise device concepts and performance of critical mission tasks for Exploration class missions. Three zero-gravity locomotion simulators are currently in use and the research focus for each will be presented. All of the devices are based on a supine subject suspension system, which simulates a reduced gravity environment by completely or partially offloading the weight of the exercising test subject's body. A platform for mounting a treadmill is positioned perpendicularly to the test subject. The Cleveland Clinic Zero-g Locomotion Simulator (ZLS) utilizes a

  1. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Pawel PAWLEWSKI

    2012-07-01

Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) where an agent approach is used to solve planning and management problems. Simulation is widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining DES and ABS approaches are described. The authors give some recommendations on the selection process.

  2. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Paul-Eric DOSSOU

    2013-07-01

Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) where an agent approach is used to solve planning and management problems. Simulation is widely used and is practically the only tool that enables verification of complex systems. Many companies face the problem of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining DES and ABS approaches are described. The authors give some recommendations on the selection process.

  3. Simulations

    CERN Document Server

    Ngada, N M

    2015-01-01

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  4. Patch-based iterative conditional geostatistical simulation using graph cuts

    Science.gov (United States)

    Li, Xue; Mariethoz, Gregoire; Lu, DeTang; Linde, Niklas

    2016-08-01

Training image-based geostatistical methods are increasingly popular in groundwater hydrology, even though existing algorithms present limitations that often make real-world applications difficult. These limitations include a computational cost that can be prohibitive for high-resolution 3-D applications, the presence of visual artifacts in the model realizations, and a low variability between model realizations due to the limited pool of patterns available in a finite-size training image. In this paper, we address these issues by proposing an iterative patch-based algorithm which adapts a graph cuts methodology that is widely used in computer graphics. Our adapted graph cuts method optimally cuts patches of pixel values borrowed from the training image and assembles them successively, each time accounting for the information of previously stitched patches. The initial simulation result might display artifacts, which are identified as regions of high cost. These artifacts are reduced by iteratively placing new patches in high-cost regions. In contrast to most patch-based algorithms, the proposed scheme can also efficiently address point conditioning. An advantage of the method is that the cut process results in the creation of new patterns that are not present in the training image, thereby increasing pattern variability. To quantify this effect, a new measure of variability, the merging index, is developed, which quantifies the pattern variability in the realizations with respect to the training image. A series of sensitivity analyses demonstrates the stability of the proposed graph cuts approach, which produces satisfying simulations for a wide range of parameter values. Applications to 2-D and 3-D cases are compared to state-of-the-art multiple-point methods. The results show that the proposed approach obtains significant speedups and increases variability between realizations. Connectivity functions applied to 2-D models and transport simulations in 3-D models are used to

  5. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Shi-hua Zhan

    2016-01-01

Full Text Available Simulated annealing (SA) is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
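The list-based cooling schedule can be sketched as follows (a simplified reading of the record's description, not the reference implementation: a 2-opt neighbourhood, a list initialised from random move costs, and the maximum list temperature replaced by the mean temperature implied by the accepted worse moves; all parameter values are illustrative):

```python
import math
import random

def tour_length(tour, cities):
    """Total length of a closed tour over 2-D city coordinates."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def lbsa(cities, list_len=100, outer_iters=100, seed=1):
    """List-based SA for the TSP: the maximum of a temperature list drives the
    Metropolis criterion, and accepted worse moves feed the list's adaptation."""
    rng = random.Random(seed)
    n = len(cities)
    tour = list(range(n))
    rng.shuffle(tour)
    best = tour[:]
    # initialise the temperature list from the costs of random 2-opt moves
    temps = []
    for _ in range(list_len):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        temps.append(abs(tour_length(cand, cities) - tour_length(tour, cities)) + 1e-9)
    for _ in range(outer_iters):
        t_max = max(temps)
        accepted = []
        for _ in range(n):
            i, j = sorted(rng.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            delta = tour_length(cand, cities) - tour_length(tour, cities)
            r = rng.random()
            if delta < 0:
                tour = cand
                if tour_length(tour, cities) < tour_length(best, cities):
                    best = tour[:]
            elif r < math.exp(-delta / t_max):
                # temperature at which this acceptance would be borderline
                accepted.append(-delta / math.log(r))
                tour = cand
        if accepted:  # adapt: replace the maximum by the mean implied temperature
            temps.remove(t_max)
            temps.append(sum(accepted) / len(accepted))
    return best
```

Because the list adapts from observed move costs, the schedule needs no hand-tuned initial temperature or decay rate, which is the simplification the record highlights.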

  6. Numerical simulation on development of a SAW based biosensor

    Science.gov (United States)

    Ten, S. T.; Hashim, U.; Sudin, A.; Arshad, M. K. Md.; Liu, W. W.; Foo, K. L.; Voon, C. H.; Wee, F. H.; Lee, Y. S.; Salleh, N. H. M.; Nazwa, T.

    2016-07-01

Surface acoustic waves can be generated at the free surface of an elastic solid. Owing to this property, surface acoustic wave (SAW) based devices were initially developed for telecommunication purposes, such as signal filters and resonators. The acoustic energy is strongly confined to the surface of SAW-based devices, and they are consequently ultra-sensitive to surface perturbation. This permits the highly sensitive detection of utterly diminutive charges on the surface. Hence, SAW-based devices have been modified to act as sensors for the mass loading effect on their surface, which is ideal for biosensor development. There have been many complicated theoretical models for the development of SAW devices as signal filters and resonators since 1960, ranging from the delta function model and the equivalent circuit model to current SAW models such as the coupling-of-modes (COM) model, the P-matrix model and Computer Simulation Technology Studio Suite (CST). However, these models are tailored to telecommunication applications and are very complex. Thus, this paper presents finite element analysis (FEA) modeling with COMSOL Multiphysics, which is used to study the mass loading effect on a SAW device to be used as a biosensor. This study managed to simulate a mass loading sensitivity of 8.71×10⁷ kHz/g mm⁻².

  7. Agent-based simulation of electricity markets. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Sensfuss, F.; Ragwitz, M. [Fraunhofer-Institut fuer Systemtechnik und Innovationsforschung (ISI), Karlsruhe (Germany); Genoese, M.; Moest, D. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Industriebetriebslehre und Industrielle Produktion

    2007-07-01

Liberalisation, climate policy and promotion of renewable energy are challenges to players of the electricity sector in many countries. Policy makers have to consider issues like market power, bounded rationality of players and the appearance of fluctuating energy sources in order to provide adequate legislation. Furthermore the interactions between markets and environmental policy instruments become an issue of increasing importance. A promising approach for the scientific analysis of these developments is the field of agent-based simulation. The goal of this article is to provide an overview of the current work applying this methodology to the analysis of electricity markets. (orig.)

  8. Fuzzy-based simulation of real color blindness.

    Science.gov (United States)

    Lee, Jinmi; dos Santos, Wellington P

    2010-01-01

About 8% of men are affected by color blindness. That population is at a disadvantage since they cannot perceive a substantial amount of the visual information. This work presents two computational tools developed to assist color blind people. The first one tests for color blindness and assesses its severity. The second tool is based on fuzzy logic, and implements a proposed method to simulate real red and green color blindness, in order to generate synthetic cases of color vision disturbance in a statistically significant amount. Our purpose is to develop correction tools and obtain a deeper understanding of the accessibility problems faced by people with chromatic visual impairment.
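Fuzzy-logic tools of this kind rest on membership functions that grade how strongly a pixel belongs to a color category; a generic illustration (the hue ranges and set names below are invented for the sketch, not taken from the paper's rule base):

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy sets over hue (degrees): how "red-like" or "green-like"
# a pixel is; red wraps around the 0/360 boundary, hence two triangles.
def red_membership(hue):
    return max(triangular(hue, -60, 0, 60), triangular(hue, 300, 360, 420))

def green_membership(hue):
    return triangular(hue, 60, 120, 180)
```

A simulator would then apply fuzzy rules over such memberships (e.g. shifting red-like hues toward confusable colors) and defuzzify the result back to RGB.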

  9. Modeling and Simulation Framework for Flow-Based Microfluidic Biochips

    DEFF Research Database (Denmark)

    Schmidt, Morten Foged; Minhass, Wajid Hassan; Pop, Paul

    2013-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers and are able to integrate the necessary functions for biochemical analysis on-chip. In this paper we are interested in flow-based biochips, in which the fluidic flow is manipulated using integrated microvalves. By combining...... and error prone. In this paper, we present an Integrated Development Environment (IDE), which addresses (i) schematic capture of the biochip architecture and biochemical application, (ii) logic simulation of an application running on a biochip, and is able to integrate the high level synthesis tasks we have...

  10. Simulation of Vertical Planetary Mill Based on Virtual Prototyping

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

The mechanical model of the vertical planetary mill is set up, and its dynamic and kinetic characteristics are described. Based on an analysis of the system dynamics of the vertical planetary mill, virtual prototyping technology is applied in the simulation of this mill. The development of a virtual prototype of the equipment, virtual testing and optimization of the virtual prototype are stated in detail. Some useful conclusions with theoretical meaning for the manufacture of vertical planetary mills have been obtained. Furthermore, it is pointed out that virtual prototyping technology shows great advantages and is bound to become a main method of developing products in the future.

  11. Neural-Based Models of Semiconductor Devices for SPICE Simulator

    Directory of Open Access Journals (Sweden)

    Hanene B. Hammouda

    2008-01-01

Full Text Available The paper addresses a simple and fast new approach to implement Artificial Neural Network (ANN) models for the MOS transistor in SPICE. The proposed approach involves two steps: the modeling phase of the device by the NN, providing its input/output patterns, and the SPICE implementation process of the resulting model. Using the Taylor series expansion, a neural-based small-signal model is derived. The reliability of our approach is validated through simulations of some circuits in DC and small-signal analyses.

  12. Agent-based modeling to simulate the dengue spread

    Science.gov (United States)

    Deng, Chengbin; Tao, Haiyan; Ye, Zhiwei

    2008-10-01

In this paper, we introduce a novel method, agent-based modeling (ABM), for simulating the unique process of dengue spread. Dengue is an acute infectious disease with a long history of over 200 years. Unlike diseases that can be transmitted directly from person to person, dengue spreads through an obligatory vector, mosquitoes. There is still no specially effective medicine or vaccine for dengue up to now. The best way to prevent dengue spread is to take precautions beforehand. Thus, it is crucial to detect and study the dynamic process of dengue spread, which closely relates to human-environment interactions, where ABM works effectively. The model attempts to simulate the dengue spread in a more realistic, bottom-up way, and to overcome a limitation of ABM, namely overlooking the influence of geographic and environmental factors. By considering the influence of the environment, Aedes aegypti ecology and other epidemiological characteristics of dengue spread, ABM can be regarded as a useful way to simulate the whole process so as to disclose the essence of the evolution of dengue spread.
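The vector-mediated transmission structure the record describes can be reduced to a minimal host-vector ABM sketch (illustrative only, not the paper's model: it omits geography and mosquito ecology, and all parameter values are invented):

```python
import random

def simulate_dengue(n_humans=200, n_mosquitoes=400, bite_prob=0.4,
                    transmission_prob=0.6, recovery_days=10, days=60, seed=3):
    """Minimal host-vector ABM: infection passes only human<->mosquito,
    never human->human, mirroring dengue's obligatory mosquito vector.
    Returns the daily cumulative count of ever-infected humans."""
    rng = random.Random(seed)
    humans = [{"state": "S", "days_infected": 0} for _ in range(n_humans)]
    humans[0]["state"] = "I"  # index case
    mosquitoes = ["S"] * n_mosquitoes
    history = []
    for _ in range(days):
        for m_idx in range(n_mosquitoes):
            h = rng.choice(humans)  # each mosquito bites one random human
            if rng.random() < bite_prob:
                if mosquitoes[m_idx] == "I" and h["state"] == "S":
                    if rng.random() < transmission_prob:
                        h["state"] = "I"
                elif mosquitoes[m_idx] == "S" and h["state"] == "I":
                    if rng.random() < transmission_prob:
                        mosquitoes[m_idx] = "I"  # infectious for life
        for h in humans:
            if h["state"] == "I":
                h["days_infected"] += 1
                if h["days_infected"] >= recovery_days:
                    h["state"] = "R"
        history.append(sum(1 for h in humans if h["state"] != "S"))
    return history
```

Extending the agents with positions and an environment layer (breeding sites, temperature) is what turns this toy into the geographically explicit model the paper argues for.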

  13. A SIMULATION OF CONTRACT FARMING USING AGENT BASED MODELING

    Directory of Open Access Journals (Sweden)

    Yuanita Handayati

    2016-12-01

Full Text Available This study aims to simulate the effects of contract farming and farmer commitment to contract farming on supply chain performance, using agent-based modeling as a methodology. Supply chain performance is represented by profits and service levels. The simulation results indicate that farmers should pay attention to customer requirements and plan their agricultural activities in order to fulfill these requirements. Contract farming helps farmers deal with demand and price uncertainties. We also find that farmer commitment is crucial to fulfilling contract requirements. This study contributes to the field from a conceptual as well as a practical point of view. From the conceptual point of view, our simulation results show that different levels of farmer commitment have an impact on farmer performance when implementing contract farming. From the practical point of view, the uncertainty faced by farmers and the market can be managed by implementing cultivation and harvesting scheduling, information sharing, and collective learning as ways of committing to contract farming.

  14. The numerical simulation based on CFD of hydraulic turbine pump

    Science.gov (United States)

    Duan, X. H.; Kong, F. Y.; Liu, Y. Y.; Zhao, R. J.; Hu, Q. L.

    2016-05-01

    Since the functions of a hydraulic turbine pump include self-adjustment and mutual compensation, it is worthwhile to analyze its internal flow by CFD-based numerical simulation, mainly the pressure and velocity fields in the hydraulic turbine and pump. The three-dimensional models of the hydraulic turbine pump are built with Pro/Engineer software; the internal flow fields in the hydraulic turbine and pump are simulated numerically with ANSYS CFX software. According to the results of the numerical simulation at the design condition, the pressure and velocity fields in the hydraulic turbine and pump are analyzed respectively. The findings show that the static pressure decreases systematically and the pressure gradient is obvious in the flow area of the hydraulic turbine, while the static pressure increases gradually in the pump. The flow trace is regular in the suction chamber and flume, without spiral traces. However, there are irregular traces in the turbine runner channels, contrary to the flow area of the impeller. Most traces in the flow area of the draft tube are spiral.

  15. An event-based hydrologic simulation model for bioretention systems.

    Science.gov (United States)

    Roy-Poirier, A; Filion, Y; Champagne, P

    2015-01-01

    Bioretention systems are designed to treat stormwater and provide attenuated drainage between storms. Bioretention has shown great potential at reducing the volume and improving the quality of stormwater. This study introduces the bioretention hydrologic model (BHM), a one-dimensional model that simulates the hydrologic response of a bioretention system over the duration of a storm event. BHM is based on the RECARGA model, but has been adapted for improved accuracy and integration of pollutant transport models. BHM contains four completely-mixed layers and accounts for evapotranspiration, overflow, exfiltration to native soils and underdrain discharge. Model results were evaluated against field data collected over 10 storm events. Simulated flows were particularly sensitive to antecedent water content and drainage parameters of bioretention soils, which were calibrated through an optimisation algorithm. Temporal disparity was observed between simulated and measured flows, which was attributed to preferential flow paths formed within the soil matrix of the field system. Modelling results suggest that soil water storage is the most important short-term hydrologic process in bioretention, with exfiltration having the potential to be significant in native soils with sufficient permeability.

  16. High viscosity fluid simulation using particle-based method

    KAUST Repository

    Chang, Yuanzhang

    2011-03-01

    We present a new particle-based method for high viscosity fluid simulation. In the method, a new elastic stress term, derived from a modified form of Hooke's law, is included in the traditional Navier-Stokes equation to simulate the movement of high viscosity fluids. Benefiting from the Lagrangian nature of the Smoothed Particle Hydrodynamics method, large flow deformations can be handled easily and naturally. In addition, in order to eliminate the particle deficiency problem near the boundary, ghost particles are employed to enforce the solid boundary condition. Compared with Finite Element Methods, with their complicated and time-consuming remeshing operations, our method is much more straightforward to implement. Moreover, our method doesn't need to store and compare against an initial rest state. The experimental results show that the proposed method is effective and efficient at handling the movement of highly viscous flows, and a large variety of fluid behaviors can be simulated by adjusting just one parameter. © 2011 IEEE.
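
    The abstract's key idea, adding an elastic Hooke-style stress to an otherwise viscous momentum equation, can be sketched in one dimension with a two-particle spring-damper system (a hedged illustration only; the pair setup, constants and function name below are invented and are not the paper's SPH formulation):

```python
def simulate_viscoelastic_pair(x0, x1, rest, k, eta, dt, steps):
    """Integrate two unit-mass particles joined by a Hooke spring (elastic
    stress) plus a viscous damping force: a minimal 1-D analogue of adding
    an elastic term to the viscous momentum equation.
    Uses symplectic (semi-implicit) Euler: velocities first, then positions."""
    v0 = v1 = 0.0
    for _ in range(steps):
        stretch = (x1 - x0) - rest          # elastic strain of the "bond"
        f = k * stretch + eta * (v1 - v0)   # elastic + viscous force on particle 0
        v0 += f * dt
        v1 -= f * dt                        # equal and opposite: momentum conserved
        x0 += v0 * dt
        x1 += v1 * dt
    return x0, x1
```

    At equilibrium the spring returns to its rest length while the viscous term dissipates the elastic energy, which is qualitatively the behavior the combined stress terms produce in the full SPH model.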

  17. Simulation of Neurocomputing Based on Photophobic Reactions of Euglena: Toward Microbe-Based Neural Network Computing

    Science.gov (United States)

    Ozasa, Kazunari; Aono, Masashi; Maeda, Mizuo; Hara, Masahiko

    In order to develop an adaptive computing system, we investigate microscopic optical feedback to a group of microbes (Euglena gracilis in this study) with a neural network algorithm, expecting that the unique characteristics of the microbes, especially their strategies to survive and adapt under unfavorable environmental stimuli, will explicitly determine the temporal evolution of the microbe-based feedback system. The photophobic reactions of Euglena are extracted from experiments and built into a Monte Carlo simulation of microbe-based neurocomputing. The simulation revealed the good performance of Euglena-based neurocomputing. Dynamic transitions among the solutions are discussed from the viewpoint of feedback instability.

  18. LISP based simulation generators for modeling complex space processes

    Science.gov (United States)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  19. A fault and seismicity based composite simulation in northern California

    Directory of Open Access Journals (Sweden)

    M. B. Yıkılmaz

    2011-12-01

    Full Text Available We generate synthetic catalogs of seismicity in northern California using a composite simulation. The basis of the simulation is the fault based "Virtual California" (VC) earthquake simulator. Back-slip velocities and mean recurrence intervals are specified on model strike-slip faults. A catalog of characteristic earthquakes is generated for a period of 100 000 yr. These earthquakes are predominantly in the range M = 6 to M = 8, but do not follow Gutenberg-Richter (GR) scaling at lower magnitudes. In order to model seismicity on unmapped faults we introduce background seismicity which occurs randomly in time with GR scaling and is spatially associated with the VC model faults. These earthquakes fill in the GR scaling down to M = 4 (the smallest earthquakes modeled). The rate of background seismicity is constrained by the observed rate of occurrence of M > 4 earthquakes in northern California. These earthquakes are then used to drive the BASS (branching aftershock sequence) model of aftershock occurrence. The BASS model is the self-similar limit of the ETAS (epidemic type aftershock sequence) model. Families of aftershocks are generated following each Virtual California and background main shock. In the simulations the rate of occurrence of aftershocks is essentially equal to the rate of occurrence of main shocks in the magnitude range 4 < M < 7. We generate frequency-magnitude and recurrence interval statistics both regionally and fault specific. We compare our modeled rates of seismicity and spatial variability with observations.
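
    The background seismicity described above follows Gutenberg-Richter (GR) frequency-magnitude scaling, under which magnitudes above a completeness threshold are exponentially distributed. A minimal sampler (illustrative only; the parameter values and function name are assumptions, not the paper's implementation):

```python
import math
import random

def sample_gr_magnitudes(n, m_min=4.0, b=1.0, seed=42):
    """Draw n magnitudes from the Gutenberg-Richter law
    log10 N(>=M) = a - b*M, i.e. M - m_min is exponentially distributed
    with rate b*ln(10); inverse-transform sampling from a uniform deviate.
    (1 - U is used so the deviate lies in (0, 1] and log10 never sees 0.)"""
    rng = random.Random(seed)
    return [m_min - math.log10(1.0 - rng.random()) / b for _ in range(n)]
```

    The mean excess above m_min equals log10(e)/b ≈ 0.434 for b = 1, a quick self-check for any synthetic catalog.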

  20. A particle based simulation model for glacier dynamics

    Directory of Open Access Journals (Sweden)

    J. A. Åström

    2013-10-01

    Full Text Available A particle-based computer simulation model was developed for investigating the dynamics of glaciers. In the model, large ice bodies are made of discrete elastic particles which are bound together by massless elastic beams. These beams can break, which induces brittle behaviour. At loads below fracture, beams may also break and reform with small probabilities to incorporate slowly deforming viscous behaviour in the model. This model has the advantage that it can simulate important physical processes such as ice calving and fracturing in a more realistic way than traditional continuum models. For benchmarking purposes the deformation of an ice block on a slip-free surface was compared to that of a similar block simulated with a Finite Element full-Stokes continuum model. Two simulations were performed: (1) calving of an ice block partially supported in water, similar to a grounded marine glacier terminus, and (2) fracturing of an ice block on an inclined plane of varying basal friction, which could represent transition to fast flow or surging. Despite several approximations, including restriction to two dimensions and simplified water-ice interaction, the model was able to reproduce the size distributions of the debris observed in calving, which may be approximated by universal scaling laws. On a moderate slope, a large ice block was stable and quiescent as long as there was enough friction against the substrate. For a critical length of frictional contact, global sliding began, and the model block disintegrated in a manner suggestive of a surging glacier. In this case the fragment size distribution produced was typical of a grinding process.

  1. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. It therefore seems necessary to find appropriate validation techniques for ABM. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  2. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. But a key challenge for ABMS is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. It therefore seems necessary to find appropriate validation techniques for ABM. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  3. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    Science.gov (United States)

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  4. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    Science.gov (United States)

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  5. Preliminary FLUKA study of the proposed PSNF installation - A description of the implemented geometry with an analysis of some operational aspects

    CERN Document Server

    Calviani, M; Sala, P; Vlachoudis, V; CERN. Geneva. ATS Department

    2011-01-01

    A recent experimental proposal in the form of a Memorandum has been discussed at the SPSC (SPS and PS experiments Committee) session of April 2011, with the objective of investigating anomalous nu_mu --> nu_e neutrino oscillations; this proposal has followed a letter of intent published in 2009. The proposed experiment would be a short-baseline neutrino experiment with two detectors placed at 120 m and 850 m from the secondary production target. In order to achieve the required beam parameters, it has been proposed to reactivate the discontinued PS neutrino facility, which was operational in the early 80s, located in the TT7 tunnel. The present note describes the implementation of the secondary beam production elements of the proposed PS Neutrino Facility within the FLUKA Monte Carlo code as well as the complete infrastructure of the TT7 tunnel; the note also contains a respective analysis of some critical aspects related to energy deposition in the beam line, of the ambient dose rate equivalent in public a...

  6. SAR Automatic Target Recognition Based on Numerical Scattering Simulation and Model-based Matching

    Directory of Open Access Journals (Sweden)

    Zhou Yu

    2015-12-01

    Full Text Available This study proposes a model-based Synthetic Aperture Radar (SAR) automatic target recognition algorithm. Scattering is computed offline using the laboratory-developed Bidirectional Analytic Ray Tracing software and the same system parameter settings as the Moving and Stationary Target Acquisition and Recognition (MSTAR) datasets. SAR images are then created from the simulated electromagnetic scattering data. Shape features are extracted from the measured and simulated images, and matches are then searched for. The algorithm is verified using three types of targets from MSTAR data and simulated SAR images, and it is shown that the proposed approach is fast, easy to implement, and highly accurate.

  7. Agent-based simulation of building evacuation using a grid graph-based model

    Science.gov (United States)

    Tan, L.; Lin, H.; Hu, M.; Che, W.

    2014-02-01

    Shifting from macroscopic to microscopic models, the agent-based approach has been widely used to model crowd evacuation as more attention is paid to individualized behaviour. Since indoor evacuation behaviour is closely related to the spatial features of a building, effective representation of indoor space is essential for the simulation of building evacuation. The traditional cell-based representation has limitations in reflecting spatial structure and is not suitable for topology analysis. Aiming to incorporate the powerful topology analysis functions of GIS to facilitate agent-based simulation of building evacuation, we used a grid graph-based model in this study to represent the indoor space. Such a model allows us to establish an evacuation network at a micro level. Potential escape routes from each node can thus be analysed through GIS network analysis functions, considering both the spatial structure and route capacity. This better supports agent-based modelling of evacuees' behaviour, including route choice and local movements. As a case study, we conducted a simulation of emergency evacuation from the second floor of an office building using Agent Analyst as the simulation platform. The results demonstrate the feasibility of the proposed method, as well as the potential of GIS in visualizing and analysing simulation results.
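
    Escape-route analysis on a grid graph of walkable cells reduces to a multi-source shortest-path computation; a minimal sketch with 4-connected moves (the grid encoding and function name here are illustrative assumptions, not the study's GIS implementation):

```python
from collections import deque

def escape_distances(grid, exits):
    """Multi-source BFS over a grid graph: walkable cells are 0, walls 1.
    Returns a dict mapping (row, col) -> number of steps to the nearest
    exit, using 4-connected (von Neumann) moves. Unreachable cells and
    walls are simply absent from the result."""
    rows, cols = len(grid), len(grid[0])
    dist = {e: 0 for e in exits}
    q = deque(exits)
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in dist):
                dist[(nr, nc)] = dist[(r, c)] + 1
                q.append((nr, nc))
    return dist
```

    An agent at any node can then follow the gradient of these distances toward an exit; capacity constraints, as in the study, would require weighting the edges instead.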

  8. Simulating Observer in Supervisory Control- A Domain-based Method

    Directory of Open Access Journals (Sweden)

    Seyed Morteza Babamir

    2012-06-01

    Full Text Available An observer in supervisory control observes the responses of a discrete system to events in its environment and reports an unsafe/critical situation if a response is undesired. An undesired response indicates that the system does not adhere to the users' requirements. Therefore, the events and conditions of the system environment and the users' requirements are the basic elements the observer uses to determine the correctness of the system response. Notably, the events, conditions and requirements should be defined based on data of the problem domain, because discrete data are primary ingredients of the environment in discrete systems and are used by system users as a gauge to express their requirements, playing a vital role in safety-critical systems such as medical and avionic ones. A large number of methods have already been proposed to model and simulate the supervisory control of discrete systems; however, a systematic method relying on data of the problem domain is missing. Having extracted events, conditions and users' requirements from data of the problem domain, a Petri net automaton is constructed for identifying violations of users' requirements. The net constitutes the core of the observer and is used to identify undesired responses of the system. In the third step, run-time simulation of the observer is proposed using the multithreading mechanism and Task Parallel Library (TPL) technology of Microsoft. Finally, a case study of a discrete concurrent system is proposed, the method is applied, and the simulation results are analyzed based on the system implementation on a multi-core computer.

  9. Neutron Source Facility Training Simulator Based on EPICS

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.; Grelle, Austin L.; Dworzanski, Pawel L.; Gohar, Yousry

    2015-01-01

    A plant operator training simulator has been developed for training plant operators and for design verification of the plant control system (PCS) and plant protection system (PPS) of the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented based on the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g. for the control of accelerator facilities; this work is its first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and the plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced: a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.

  10. A Scheduling Algorithm Based on Petri Nets and Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Rachida H. Ghoul

    2007-01-01

    Full Text Available This study presents a short-term scheduling problem for a hybrid Flexible Manufacturing System (HFMS). Based on the state of the art of general scheduling algorithms, we present the meta-heuristic we decided to apply to a given HFMS example: the Simulated Annealing (SA) algorithm. The HFMS model, based on hierarchical Petri nets, was used to represent the static and dynamic behavior of the HFMS and to design scheduling solutions. The hierarchical Petri net model was regarded as being made up of a set of single timed colored Petri net models, each representing one process composed of many operations and tasks. The complex scheduling problem was thus decomposed into simple sub-problems. The scheduling algorithm was applied to each sub-model in order to resolve conflicts on shared production resources.
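
    The SA meta-heuristic referred to above can be sketched on a toy single-machine instance (swap-neighborhood annealing minimizing total weighted completion time; the job data, cooling schedule, and function names are illustrative assumptions, not the paper's HFMS model):

```python
import math
import random

def total_weighted_completion(seq, proc, weight):
    """Cost of processing jobs in the given order on one machine."""
    t, cost = 0, 0
    for j in seq:
        t += proc[j]
        cost += weight[j] * t
    return cost

def anneal_schedule(proc, weight, t0=50.0, cooling=0.995, iters=4000, seed=1):
    """Simulated annealing over job permutations with a swap neighborhood:
    worse moves are accepted with probability exp(-delta/temp), and the
    temperature decays geometrically so the search ends as pure descent."""
    rng = random.Random(seed)
    seq = list(range(len(proc)))
    rng.shuffle(seq)
    cur = total_weighted_completion(seq, proc, weight)
    best, best_seq = cur, seq[:]
    temp = t0
    for _ in range(iters):
        i, j = rng.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]           # propose a swap
        cand = total_weighted_completion(seq, proc, weight)
        if cand <= cur or rng.random() < math.exp((cur - cand) / temp):
            cur = cand                            # accept the move
            if cand < best:
                best, best_seq = cand, seq[:]
        else:
            seq[i], seq[j] = seq[j], seq[i]       # reject: undo the swap
        temp *= cooling
    return best_seq, best
```

    For this objective every non-optimal order admits an improving adjacent swap (the classic exchange argument), so the late, near-greedy phase of the annealing reliably reaches the optimum on small instances.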

  11. Memoryless cooperative graph search based on the simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    Hou Jian; Yan Gang-Feng; Fan Zhen

    2011-01-01

    We have studied the problem of reaching a globally optimal segment in a graph-like environment with a single agent or a group of autonomous mobile agents. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. We show that under both proposed control strategies, the agent will eventually converge to a globally optimal segment with probability 1. Secondly, we use multi-agent search to simultaneously reduce the computational complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partition, a gossip-consensus-based scheme is presented to update the key parameter (the radius of the graph), ensuring that the agents spend much less time finding a globally optimal segment.

  12. RELIABLE VALIDATION BASED ON OPTICAL FLOW VISUALIZATION FOR CFD SIMULATIONS

    Institute of Scientific and Technical Information of China (English)

    姜宗林

    2003-01-01

    A reliable validation based on optical flow visualization for numerical simulations of complex flowfields is addressed in this paper. Several test cases, including two-dimensional, axisymmetric and three-dimensional flowfields, were presented to demonstrate the effectiveness of the validation and gain credibility for numerical solutions of complex flowfields. In the validation, images of these flowfields were constructed from numerical results based on the principle of optical flow visualization, and compared directly with experimental interferograms. Because both experimental and numerical results are of identical physical representation, the agreement between them can be evaluated effectively by examining flow structures as well as checking discrepancies in density. The study shows that reliable validation can be achieved by using direct comparison between numerical and experimental results without any loss of accuracy in either of them.

  13. Tutorial on agent-based modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling consumer behavior to understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.
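
    A minimal agent-based model in the spirit of this tutorial (a stylized wealth-exchange model; the class, rules and parameters below are illustrative, not taken from the tutorial itself): autonomous agents repeatedly interact by passing one unit of wealth to a randomly chosen agent, and an unequal distribution emerges from homogeneous local rules.

```python
import random

class Agent:
    """An autonomous agent holding some integer wealth."""
    def __init__(self, wealth=1):
        self.wealth = wealth

    def step(self, agents, rng):
        """If the agent has any wealth, give one unit to a randomly
        chosen agent (possibly itself, which is a harmless no-op)."""
        if self.wealth > 0:
            other = rng.choice(agents)
            other.wealth += 1
            self.wealth -= 1

def run_money_model(n_agents=50, steps=200, seed=7):
    """Run the exchange model and return the final wealth of each agent.
    Total wealth is conserved by construction."""
    rng = random.Random(seed)
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(steps):
        for a in agents:
            a.step(agents, rng)
    return [a.wealth for a in agents]
```

    Despite every agent following the same rule, the final distribution is strongly skewed: an example of the emergent, macro-level patterns that the tutorial argues make ABMS a distinct way of doing science.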

  14. SWIFT: task-based hydrodynamics and gravity for cosmological simulations

    CERN Document Server

    Theuns, Tom; Schaller, Matthieu; Gonnet, Pedro

    2015-01-01

    Simulations of galaxy formation follow the gravitational and hydrodynamical interactions between gas, stars and dark matter through cosmic time. The huge dynamic range of such calculations severely limits strong scaling behaviour of the community codes in use, with load-imbalance, cache inefficiencies and poor vectorisation limiting performance. The new SWIFT code exploits task-based parallelism designed for many-core compute nodes interacting via MPI using asynchronous communication to improve speed and scaling. A graph-based domain decomposition schedules interdependent tasks over available resources. Strong scaling tests on realistic particle distributions yield excellent parallel efficiency, and efficient cache usage provides a large speed-up compared to current codes even on a single core. SWIFT is designed to be easy to use by shielding the astronomer from computational details such as the construction of the tasks or MPI communication. The techniques and algorithms used in SWIFT may benefit other compu...

  15. RELIABLE VALIDATION BASED ON OPTICAL FLOW VISUALIZATION FOR CFD SIMULATIONS

    Institute of Scientific and Technical Information of China (English)

    姜宗林

    2003-01-01

    A reliable validation based on optical flow visualization for numerical simulations of complex flowfields is addressed in this paper. Several test cases, including two-dimensional, axisymmetric and three-dimensional flowfields, were presented to demonstrate the effectiveness of the validation and gain credibility for numerical solutions of complex flowfields. In the validation, images of these flowfields were constructed from numerical results based on the principle of optical flow visualization, and compared directly with experimental interferograms. Because both experimental and numerical results are of identical physical representation, the agreement between them can be evaluated effectively by examining flow structures as well as checking discrepancies in density. The study shows that reliable validation can be achieved by using direct comparison between numerical and experimental results without any loss of accuracy in either of them.

  16. Enhancing food engineering education with interactive web-based simulations

    Directory of Open Access Journals (Sweden)

    Alexandros Koulouris

    2015-04-01

    Full Text Available In the traditional deductive approach to teaching any engineering topic, teachers first expose students to the derivation of the equations that govern the behavior of a physical system and then demonstrate the use of the equations through a limited number of textbook examples. This methodology, however, is rarely adequate to unmask the cause-effect and quantitative relationships between the system variables that the equations embody. Web-based simulation, the integration of simulation and internet technologies, has the potential to enhance the learning experience by offering an interactive and easily accessible platform for quick and effortless experimentation with physical phenomena. This paper presents the design and development of a web-based platform for teaching basic food engineering phenomena to food technology students. The platform contains a variety of modules (“virtual experiments”) covering the topics of mass and energy balances, fluid mechanics and heat transfer. In this paper, the design and development of three modules for mass balances and heat transfer are presented. Each webpage representing an educational module has the following features: visualization of the studied phenomenon through graphs, charts or videos, computation through a mathematical model, and experimentation. The student is allowed to edit key parameters of the phenomenon and observe the effect of these changes on the outputs. Experimentation can be done in a free or guided fashion with a set of prefabricated examples that students can run to self-test their knowledge by answering multiple-choice questions.

  17. Analyst-to-Analyst Variability in Simulation-Based Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  18. Urban flood simulation based on the SWMM model

    Science.gov (United States)

    Jiang, L.; Chen, Y.; Wang, H.

    2015-05-01

    China has experienced the fastest urbanization in the world over recent decades, which has caused serious urban flooding. Flood forecasting is regarded as one of the important flood mitigation methods and is widely used in catchment flood mitigation, but not yet widely used in urban flood mitigation. This paper, employing the SWMM model, one of the widely used urban flood planning and management models, simulates the urban flooding of Dongguan City in rapidly urbanized southern China. SWMM is first set up based on the DEM, the digital map and the underground pipeline network; parameters are then derived from the properties of the subcatchments and the storm sewer conduits, and a parameter sensitivity analysis shows the parameter robustness. The simulated results show that with the 1-year return period precipitation the studied area will have no flooding, but for the 2-, 5-, 10- and 20-year return period precipitation the studied area will be inundated. The results show that the SWMM model is promising for urban flood forecasting, but as it has no surface runoff routing, urban flooding could not be forecast precisely.
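
    SWMM treats each subcatchment as a nonlinear reservoir whose outflow follows a Manning-type relation once ponded depth exceeds depression storage. A simplified single-subcatchment sketch (SI units, explicit Euler time stepping; the parameter values and function name are illustrative assumptions, not SWMM's actual code):

```python
import math

def nonlinear_reservoir_runoff(rain, area, width, slope, n=0.02,
                               d_store=0.002, dt=60.0):
    """Nonlinear-reservoir overland flow for one idealized subcatchment.
    rain: rainfall intensity per step (m/s); area (m^2); width (m);
    slope (dimensionless); Manning's n; depression storage d_store (m);
    time step dt (s). Per-unit-area outflow follows
    q = (W / (A * n)) * (d - d_store)^(5/3) * sqrt(S)."""
    d = 0.0                       # ponded depth (m)
    flows = []
    for i in rain:
        excess = max(d - d_store, 0.0)
        q = (width / (area * n)) * excess ** (5.0 / 3.0) * math.sqrt(slope)
        flows.append(q * area)    # discharge (m^3/s) from depth at step start
        d = max(d + (i - q) * dt, 0.0)
    return flows
```

    Under steady rainfall the hydrograph stays at zero until depression storage fills, then rises monotonically to an equilibrium discharge equal to rainfall times area, the qualitative behavior SWMM reproduces per subcatchment.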

  19. Urban flood simulation based on the SWMM model

    Directory of Open Access Journals (Sweden)

    L. Jiang

    2015-05-01

    Full Text Available China has experienced the world's fastest urbanization over the past decades, which has caused serious urban flooding. Flood forecasting is regarded as one of the important flood mitigation methods and is widely used in catchment flood mitigation, but not yet in urban flood mitigation. This paper, employing the SWMM model, one of the widely used urban flood planning and management models, simulates urban flooding in Dongguan City in rapidly urbanized southern China. SWMM is first set up based on the DEM, digital map, and underground pipeline network; parameters are then derived from the properties of the subcatchments and storm sewer conduits, and a parameter sensitivity analysis shows the parameters are robust. The simulated results show that with 1-year return period precipitation the studied area will have no flooding, but for 2-, 5-, 10- and 20-year return period precipitation the studied area will be inundated. The results show the SWMM model is promising for urban flood forecasting, but because it has no surface runoff routing, urban flooding cannot yet be forecast precisely.

  20. Physics-Based Haptic Simulation of Bone Machining.

    Science.gov (United States)

    Arbabtafti, M; Moghaddam, M; Nahvi, A; Mahvash, M; Richardson, B; Shirinzadeh, B

    2011-01-01

    We present a physics-based training simulator for bone machining. Based on experimental studies, the energy required to remove a unit volume of bone is a constant for every particular bone material. We use this physical principle to obtain the forces required to remove bone material with a milling tool rotating at high speed. The rotating blades of the tool are modeled as a set of small cutting elements. The force of interaction between a cutting element and bone is calculated from the energy required to remove a bone chip with an estimated thickness and known material stiffness. The total force acting on the cutter at a particular instant is obtained by integrating the differential forces over all cutting elements engaged. A voxel representation is used to represent the virtual bone and removed chips for calculating forces of machining. We use voxels that carry bone material properties to represent the volumetric haptic body and to apply underlying physical changes during machining. Experimental results of machining samples of a real bone confirm the force model. A real-time haptic implementation of the method in a dental training simulator is described.

  1. Simulation Based Studies in Software Engineering: A Matter of Validity

    Directory of Open Access Journals (Sweden)

    Breno Bernard Nicolau de França

    2015-04-01

    Full Text Available Despite a possible lack of validity when compared with other science areas, Simulation-Based Studies (SBS) in Software Engineering (SE) have supported the achievement of some results in the field. However, as with any other sort of experimental study, it is important to identify and deal with threats to validity, aiming at increasing their strength and reinforcing confidence in results. OBJECTIVE: To identify potential threats to SBS validity in SE and suggest ways to mitigate them. METHOD: To apply qualitative analysis to a dataset resulting from the aggregation of data from a quasi-systematic literature review combined with ad hoc surveyed information from other science areas. RESULTS: The analysis of data extracted from 15 technical papers allowed the identification and classification of 28 different threats to validity concerning SBS in SE, according to Cook and Campbell's categories. In addition, 12 verification and validation procedures applicable to SBS were analyzed and organized according to their ability to detect these threats to validity. These results were used to make available an improved set of guidelines for the planning and reporting of SBS in SE. CONCLUSIONS: Simulation-based studies introduce threats to validity not found in traditional studies. These threats are not well observed, and it is therefore not easy to identify and mitigate all of them without explicit guidance, such as that depicted in this paper.

  2. Activity-based simulation using DEVS: increasing performance by an activity model in parallel DEVS simulation

    Institute of Scientific and Technical Information of China (English)

    Bin CHEN; Lao-bing ZHANG; Xiao-cheng LIU; Hans VANGHELUWE

    2014-01-01

    Improving simulation performance using activity tracking has attracted attention in the modeling field in recent years. The notion of activity has been successfully used to predict and improve simulation performance. Activity tracking, however, uses only the inherent performance information contained in the models. To extend activity prediction in modeling, we propose activity-enhanced modeling with an activity meta-model at the meta-level. The meta-model provides a set of interfaces for modeling activity in a specific domain. An activity model transformation is subsequently devised to deal with simulation differences due to heterogeneous activity models. Finally, a resource-aware simulation framework is implemented to integrate the activity models in activity-based simulation. The case study shows the improvement brought by activity-based simulation using the discrete event system specification (DEVS).

  3. Monte-Carlo simulations of different concepts for shielding in the ATLAS experiment forward region

    CERN Document Server

    Stekl, I; Eschbach, R; Kovalenko, V E; Leroy, C; Marquet, C; Palla, J; Piquemal, F; Pospísil, S; Shupe, M A; Sodomka, J; Tourneur, S; Vorobel, V

    2001-01-01

    The role and performance of various layers (steel, cast iron (CI), concrete, lead, borated polyethylene (BPE), lithium filled polyethylene (LiPE)) and their combinations as shielding against neutrons and photons in the ATLAS experiment forward region (JF shielding) has been studied by means of Monte-Carlo simulations. These simulations permitted one to determine the locations of appearance and disappearance of neutrons and photons and their number at this location. In particular, the determination of the number of newly born neutrons and photons, the number of stopped neutrons and photons, as well as the number of neutrons and photons crossing the borders of shielding layers allowed the assessment of the efficiency of the JF shielding. It provided a basis for comparing the merits of different configurations of shielding layers. The simulation code is based on GEANT, FLUKA, MICAP and GAMLIB. The results of the study give strong support to a segmented shielding made of five layers (steel, CI, BPE, steel, LiPE).

  4. Simulating cancer growth with multiscale agent-based modeling.

    Science.gov (United States)

    Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S

    2015-02-01

    There have been many techniques developed in recent years to model a variety of cancer behaviors in silico. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology, including phenotype-changing mutations, adaptation to the microenvironment, the process of angiogenesis, the influence of the extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues, have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models.

  5. Simulation-based decision support for evaluating operational plans

    Directory of Open Access Journals (Sweden)

    Johan Schubert

    2015-12-01

    Full Text Available In this article, we describe simulation-based decision support techniques for evaluation of operational plans within effects-based planning. Using a decision support tool, developers of operational plans are able to evaluate thousands of alternative plans against possible courses of events and decide which of these plans are capable of achieving a desired end state. The objective of this study is to examine the potential of a decision support system that helps operational analysts understand the consequences of numerous alternative plans through simulation and evaluation. Operational plans are described in the effects-based approach to operations concept as a set of actions and effects. For each action, we examine several different alternative ways to perform the action. We use a representation where a plan consists of several actions that should be performed. Each action may be performed in one of several different alternative ways. Together these action alternatives make up all possible plan instances, which are represented as a tree of action alternatives that may be searched for the most effective sequence of alternative actions. As a test case, we use an expeditionary operation with a plan of 43 actions and several alternatives for these actions, as well as a scenario of 40 group actors. Decision support for planners is provided by several methods that analyze the impact of a plan on the 40 actors, e.g., by visualizing time series of plan performance. Detailed decision support for finding the most influential actions of a plan is presented by using sensitivity analysis and regression tree analysis. Finally, a decision maker may use the tool to determine the boundaries of an operation that it must not move beyond without risk of drastic failure. The significant contribution of this study is the presentation of an integrated approach for evaluation of operational plans.
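
    The plan representation described above, several actions with one alternative chosen per action, can be sketched as a search over the cross-product of alternatives. In this minimal sketch (all names and the scoring function are illustrative, not from the article), every plan instance is enumerated and scored by a caller-supplied simulation-based evaluation function:

```python
from itertools import product

def enumerate_plan_instances(actions):
    """Each action is a list of alternative ways to perform it.
    A plan instance picks exactly one alternative per action."""
    return product(*actions)

def best_plan(actions, evaluate):
    """Exhaustively score every plan instance and keep the best.
    In practice the evaluation would come from simulation runs."""
    return max(enumerate_plan_instances(actions), key=evaluate)

# Toy example: 3 actions, 2 alternatives each -> 8 plan instances.
actions = [["a1", "a2"], ["b1", "b2"], ["c1", "c2"]]
score = {"a1": 1, "a2": 2, "b1": 3, "b2": 1, "c1": 0, "c2": 5}
plan = best_plan(actions, lambda inst: sum(score[a] for a in inst))
print(plan)  # ('a2', 'b1', 'c2')
```

    For the 43-action plan of the study, exhaustive enumeration would be replaced by a tree search over action alternatives, but the instance space is the same cross-product.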

  6. Simulation-Based Design of a Rotatory SMA Drive

    Science.gov (United States)

    Dilthey, Sascha; Meier, Horst

    2009-08-01

    The design and optimization of a rotatory drive powered by shape memory alloy (SMA) actuators is described in this paper. SMA actuators used in technical applications are parameterized by trial-and-error methods, because there is a lack of computer-aided design tools for this active material. A numerical modeling approach was developed to design and optimize the geometry and the load and heating conditions of SMA actuators in a technical system, to achieve good dynamics and high reliability. The shape memory effect used in most technical systems is the extrinsic two-way effect (2WE). This effect can be simulated with the numerical model, which was implemented in MATLAB/SIMULINK. The focus of the model is on the activation behavior of the SMA actuator, which defines its rate of heating and cooling. Different load conditions and various actuator geometries and shapes, e.g. wire or spring actuators, are simulated by calculating the energetic balance of the whole system. The numerical model can be used to simulate time-variant heating currents in order to obtain optimal system performance. The model was used to design a rotatory SMA drive system based on the moving concept of a wave drive gear set. In contrast to the conventional system, which is driven by an electric motor, the SMA drive consists of a strain wave gear and SMA wire actuators that are applied circularly to generate a rotatory movement. Special characteristics of this drive system are a high torque density and high positioning accuracy.

  7. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  8. Design, modeling and simulation of MEMS-based silicon Microneedles

    Science.gov (United States)

    Amin, F.; Ahmed, S.

    2013-06-01

    The advancement in semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimum discomfort. Transdermal fluid delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could enable totally painless fluid delivery into skin with controllability and desirable yield. In this study, we revisit the problem with modeling, design, and simulations carried out for MEMS-based silicon hollow out-of-plane microneedle arrays for biomedical applications, particularly for transdermal drug delivery. Microneedles of approximately 200 μm length and 40 μm lumen diameter are shown to be formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 array with center-to-center spacing of 750 μm. Furthermore, fluid flow through these microneedle channels has been modeled and compared with and without the contribution of gravitational forces, using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed in TCAD SILVACO to optimize the design of these microneedles, aligned with standard Si-fabrication lines.
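
    The with/without-gravity flow comparison described above can be illustrated with a simple laminar-flow estimate. The abstract derives its model from the Bernoulli equation; the Hagen-Poiseuille expression below is an illustrative stand-in, with the fluid properties and driving pressure assumed (only the lumen geometry comes from the abstract):

```python
import math

def poiseuille_flow(delta_p, radius, length, mu,
                    rho=1000.0, g=9.81, with_gravity=False):
    """Volumetric flow rate (m^3/s) through a cylindrical lumen.
    Optionally adds the hydrostatic head rho*g*L to the driving
    pressure, mimicking a with/without-gravity comparison."""
    if with_gravity:
        delta_p = delta_p + rho * g * length
    return math.pi * radius**4 * delta_p / (8.0 * mu * length)

# Lumen geometry from the abstract: 40 um diameter, 200 um length.
r, L = 20e-6, 200e-6
mu = 1.0e-3          # assumed water viscosity, Pa*s
dp = 10e3            # assumed 10 kPa applied pressure
q_no_g = poiseuille_flow(dp, r, L, mu)
q_g = poiseuille_flow(dp, r, L, mu, with_gravity=True)
# Over a 200 um needle, rho*g*L is ~2 Pa versus 10 kPa applied,
# so gravity's contribution is negligible at this scale.
print(q_no_g, q_g)
```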

  9. Optimal grid-based methods for thin film micromagnetics simulations

    Science.gov (United States)

    Muratov, C. B.; Osipov, V. V.

    2006-08-01

    Thin film micromagnetics are a broad class of materials with many technological applications, primarily in magnetic memory. The dynamics of the magnetization distribution in these materials is traditionally modeled by the Landau-Lifshitz-Gilbert (LLG) equation. Numerical simulations of the LLG equation are complicated by the need to compute the stray field due to the inhomogeneities in the magnetization which presents the chief bottleneck for the simulation speed. Here, we introduce a new method for computing the stray field in a sample for a reduced model of ultra-thin film micromagnetics. The method uses a recently proposed idea of optimal finite difference grids for approximating Neumann-to-Dirichlet maps and has an advantage of being able to use non-uniform discretization in the film plane, as well as an efficient way of dealing with the boundary conditions at infinity for the stray field. We present several examples of the method's implementation and give a detailed comparison of its performance for studying domain wall structures compared to the conventional FFT-based methods.
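
    The magnetization dynamics referred to above are governed by the Landau-Lifshitz-Gilbert equation, which in its standard Gilbert form for a unit magnetization vector reads:

```latex
\frac{\partial \mathbf{m}}{\partial t}
  = -\gamma\, \mathbf{m} \times \mathbf{H}_{\mathrm{eff}}
  + \alpha\, \mathbf{m} \times \frac{\partial \mathbf{m}}{\partial t}
```

    where γ is the gyromagnetic ratio, α the Gilbert damping constant, and H_eff the effective field, comprising exchange, anisotropy, applied, and stray-field contributions; evaluating the stray-field term is the bottleneck the method above targets.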

  10. Turbulent Simulations of Divertor Detachment Based on the BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. The detached divertor operation offers great promise for reducing the heat flux onto divertor target plates to acceptable erosion levels. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0 to 5.0 × 10²³ s⁻¹ using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and impurity screening properties. As the density increases, the divertor operation changes gradually from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  11. Design and Simulation of an Electrothermal Actuator Based Rotational Drive

    Science.gov (United States)

    Beeson, Sterling; Dallas, Tim

    2008-10-01

    As a participant in the Micro and Nano Device Engineering (MANDE) Research Experience for Undergraduates program at Texas Tech University, I learned how MEMS devices operate and the limits of their operation. Using specialized AutoCAD-based design software and the ANSYS simulation program, I learned the MEMS fabrication process used at Sandia National Labs, the design limitations of this process, the abilities and drawbacks of micro devices, and finally, I redesigned a MEMS device called the Chevron Torsional Ratcheting Actuator (CTRA). Motion is achieved through electrothermal actuation. The chevron (bent-beam) actuators cause a ratcheting motion on top of a hub-less gear so that as voltage is applied the CTRA spins. The voltage applied needs to be pulsed and the frequency of the pulses determine the angular frequency of the device. The main objective was to design electromechanical structures capable of transforming the electrical signals into mechanical motion without overheating. The design was optimized using finite element analysis in ANSYS allowing multi-physics simulations of our model system.

  12. UAV based distributed ATR under realistic simulated environmental effects

    Science.gov (United States)

    Chen, Xiaohan; Gong, Shanshan; Schmid, Natalia A.; Valenti, Matthew C.

    2007-04-01

    Over the past several years, the military has grown increasingly reliant upon unmanned aerial vehicles (UAVs) for surveillance missions. There is an increasing trend towards fielding swarms of UAVs operating as large-scale sensor networks in the air. Such systems tend to be used primarily for acquiring sensory data with the goal of automatic detection, identification, and tracking of objects of interest. These trends have been paralleled by advances in distributed detection, image/signal processing, and data fusion techniques. Furthermore, swarmed UAV systems must operate under severe constraints on environmental conditions and sensor limitations. In this work, we investigate the effects of environmental conditions on target detection and recognition performance in a UAV network. We assume that each UAV is equipped with an optical camera, and use a realistic computer simulation to generate synthetic images. The detection algorithm relies on Haar-based features while the automatic target recognition (ATR) algorithm relies on Bessel K features. The performance of both algorithms is evaluated using simulated images that closely mimic data acquired in a UAV network under realistic environmental conditions. We design several fusion techniques and analyze both the case of a single observation and the case of multiple observations of the same target.

  13. Simulated Annealing-Based Krill Herd Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Gai-Ge Wang

    2013-01-01

    Full Text Available Recently, Gandomi and Alavi proposed a novel swarm-intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic simulated annealing-based krill herd (SKH) method is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating a krill's position, so as to enhance its reliability and robustness in dealing with optimization problems. The introduced KS operator combines a greedy strategy with the acceptance of a few not-so-good solutions with low probability, as originally used in simulated annealing (SA). In addition, an elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
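
    The KS operator described above mixes greedy acceptance with the Metropolis rule of simulated annealing: improvements are always taken, while worse candidates are occasionally accepted with a temperature-dependent probability. A minimal sketch of that acceptance rule (the function name, temperatures, and counts are illustrative, not from the paper):

```python
import math
import random

def sa_accept(delta, temperature, rng=random.Random(42)):
    """Metropolis acceptance rule: always accept an improvement
    (delta <= 0); accept a worse move with probability exp(-delta/T)."""
    if delta <= 0:          # candidate is no worse: greedy accept
        return True
    return rng.random() < math.exp(-delta / temperature)

# A worse move (delta > 0) is accepted only occasionally, and ever
# more rarely as the temperature is lowered during the run.
accepts_hot = sum(sa_accept(1.0, 2.0) for _ in range(1000))
accepts_cold = sum(sa_accept(1.0, 0.1) for _ in range(1000))
print(accepts_hot > accepts_cold)  # True
```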

  14. Induced radioactivity analysis for the NSRL Linac in China using Monte Carlo simulations and gamma-spectroscopy

    CERN Document Server

    He, Lijuan; Li, Weimin; Chen, Zhi; Chen, Yukai; Ren, Guangyi

    2014-01-01

    The 200-MeV electron linac of the National Synchrotron Radiation Laboratory (NSRL) located in Hefei is one of the earliest high-energy electron linear accelerators in China. The electrons are accelerated to 200 MeV by five acceleration tubes and are collimated by scrapers. The scraper aperture is smaller than that of the acceleration tubes, so some electrons hit the material when passing through. These lost electrons cause induced radioactivity, mainly through bremsstrahlung and photonuclear reactions. This paper describes a study of induced radioactivity at the NSRL Linac using FLUKA simulations and gamma-spectroscopy. The measurements showed that electrons were lost mainly at the scraper, so the induced radioactivity of the NSRL Linac is mainly produced there. The radionuclide types were simulated using the FLUKA Monte Carlo code and the results were compared against measurements made with a High Purity Germanium (HPGe) gamma spectrometer. The NSRL linac was retired for an upgrade last year. The re...

  15. OPTIMIZATION METHOD FOR VIRTUAL PRODUCT DEVELOPMENT BASED ON SIMULATION METAMODEL AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Jun; Fan Xiumin; Ma Dengzhe; Jin Ye

    2003-01-01

    Virtual product development (VPD) is essentially based on simulation. Due to computational inefficiency, traditional engineering simulation software and optimization methods are inadequate for the optimization problems arising in VPD. An optimization method based on simulation metamodels is proposed to satisfy the needs of the complex optimal designs driven by VPD. This method extends current design of experiments (DOE) practice with various metamodeling technologies. Simulation metamodels are built to approximate detailed simulation codes, so as to provide a link between optimization and simulation, or to serve as a bridge for simulation software integration across different domains. An example of the optimal design of a composite material structure is used to demonstrate the newly introduced method.

  16. Simulation of Irrigation Water Loss Based on VSMB Model

    Institute of Scientific and Technical Information of China (English)

    Hongwen ZHOU; Luxin ZHAI; Wenxing LU; Dongxu LIU

    2016-01-01

    The low degree of development and utilization of water resources in the Huangshui River basin, together with the contradiction between their supply and demand, are the main factors restricting local agricultural development. Simulating irrigation water loss with the VSMB model is therefore of great significance for strengthening regional water management and improving water resource utilization efficiency. Five groundwater wells were set up to study farmland irrigation water infiltration and its dynamic effect on groundwater. Two soil moisture monitoring sites were also established in typical plots of the Daxia and Guanting irrigation areas, and a TDR300 was used to monitor soil moisture at four depths (10 cm, 30 cm, 50 cm and 70 cm). On this basis, the VSMB model was used to study irrigation water loss in the irrigation areas of the Yellow River valley of Qinghai Province, including soil moisture content, actual evapotranspiration, infiltration, runoff, and groundwater depth. The results showed that water consumption by soil evaporation and crop transpiration accounted for 46.4% and 24.1% of the total precipitation plus irrigation, respectively, while leakage accounted for 30.3% and 60.6%, respectively, for the periods March 1 to April 30, 2013 and August 1 to September 30, 2013. The actual evaporation at the GT-TR1 and GT-TR2 sites over the whole of 2013 was 632.6 mm and 646.9 mm, respectively, with leakage accounting for 2.6% and 1.2% of the total precipitation plus irrigation. The RMSE of the simulated groundwater depth in the Daxia irrigation area during the two periods was 92.3 mm and 27.7 mm, respectively, and the RMSE of the simulated soil profile water content at the two Guanting monitoring sites was 2.04% and 5.81%, respectively, indicating that the simulation results are reliable.
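
    The bookkeeping behind such a water-budget simulation can be sketched with a single-layer daily bucket model: precipitation and irrigation come in, evapotranspiration goes out, and any excess over the storage capacity leaves as deep percolation. This is a drastic simplification of the multi-zone VSMB, and all numbers below are illustrative, not from the study:

```python
def soil_water_budget(storage, capacity, days):
    """One-layer daily water budget (all values in mm).
    Any storage above capacity leaves as deep percolation,
    i.e. the 'leakage' such studies quantify."""
    leakage_total = 0.0
    for precip, irrigation, et in days:
        storage += precip + irrigation - et
        storage = max(storage, 0.0)            # cannot dry below zero
        leakage = max(storage - capacity, 0.0)
        leakage_total += leakage
        storage -= leakage
    return storage, leakage_total

# 3 days (precip, irrigation, ET in mm); heavy irrigation on
# day 2 pushes the bucket past its 80 mm capacity.
days = [(2.0, 0.0, 3.0), (0.0, 60.0, 4.0), (5.0, 0.0, 4.0)]
final, leak = soil_water_budget(storage=40.0, capacity=80.0, days=days)
print(final, leak)  # 80.0 16.0
```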

  17. Games for Traffic Education: An Experimental Study of a Game-Based Driving Simulator

    Science.gov (United States)

    Backlund, Per; Engstrom, Henrik; Johannesson, Mikael; Lebram, Mikael

    2010-01-01

    In this article, the authors report on the construction and evaluation of a game-based driving simulator using a real car as a joystick. The simulator is constructed from off-the-shelf hardware and the simulation runs on open-source software. The feasibility of the simulator as a learning tool has been experimentally evaluated. Results are…

  18. Micromechanics based simulation of ductile fracture in structural steels

    Science.gov (United States)

    Yellavajjala, Ravi Kiran

    The broader aim of this research is to develop a fundamental understanding of the ductile fracture process in structural steels, propose robust computational models to quantify the associated damage, and provide numerical tools to simplify the implementation of these computational models in a general finite element framework. Mechanical testing of different geometries of test specimens made of ASTM A992 steel is conducted to experimentally characterize ductile fracture at different stress states under monotonic and ultra-low cycle fatigue (ULCF) loading. Scanning electron microscopy of the fractured surfaces is conducted to decipher the underlying microscopic damage mechanisms that cause fracture in ASTM A992 steels. Detailed micromechanical analyses for monotonic and cyclic loading are conducted to understand the influence of stress triaxiality and the Lode parameter on the void growth phase of ductile fracture. Based on the monotonic analyses, an uncoupled micromechanical void growth model is proposed to predict ductile fracture. This model is then incorporated into a finite element program as a weakly coupled model to simulate the loss of load-carrying capacity in the post-microvoid-coalescence regime at high triaxialities. Based on the cyclic analyses, an uncoupled micromechanics-based cyclic void growth model is developed to predict the ULCF life of ASTM A992 steels subjected to high stress triaxialities. Furthermore, a computational fracture locus for ASTM A992 steels is developed and incorporated into a finite element program as an uncoupled ductile fracture model. This model can be used to predict ductile fracture initiation under monotonic loading over a wide range of triaxialities and Lode parameters. Finally, a coupled microvoid elongation and dilation based continuum damage model is proposed, implemented, calibrated, and validated. This model is capable of simulating the local softening caused by the various phases of ductile fracture process under

  19. Physics validation of detector simulation tools for LHC

    CERN Document Server

    Beringer, J

    2004-01-01

    Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data is available. We give an overview of these validation activities with emphasis on the latest results.

  20. Simulation-based diagnostics and control for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.C.

    1993-01-01

    Advanced simulation-based diagnostics and control guidance systems for the identification and management of off-normal transient events in nuclear power plants are currently under investigation. To date, a great deal of progress has been made in effectively and efficiently combining information obtained through fuzzy pattern recognition and macroscopic mass and energy inventory analysis for use in multiple-failure diagnostics. Work has also begun on the unique problem of diagnostics and surveillance methodologies for advanced passively safe reactor systems, utilizing both statistical and fuzzy information. Plans are also being formulated for the development of deterministic optimal control algorithms combined with Monte Carlo incremental learning algorithms to be used for flexible and efficient control of reactor transients.

  1. Simulation-based algorithms for Markov decision processes

    CERN Document Server

    Chang, Hyeong Soo; Fu, Michael C; Marcus, Steven I

    2013-01-01

    Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, opening the door to the curse of dimensionality and making practical solution of the resulting models intractable. In other cases, the system of interest is too complex to allow explicit specification of some of the MDP model parameters, but simulation samples are readily available (e.g., for random transitions and costs). For these settings, various sampling and population-based algorithms have been developed to overcome the difficulties of computing an optimal solution in terms of a policy and/or value function. Specific approaches include adaptive sampling, evolutionary policy iteration, evolutionary random policy search, and model reference adaptive search. This substantially enlarged new edition reflects the latest developments in novel ...
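
    The simulation-sample setting described above can be illustrated with a minimal Monte Carlo Q-value estimator: transitions come only from a simulator, and the value of a state-action pair is estimated by averaging discounted returns of rollouts under a baseline policy. The toy chain MDP and all parameters are hypothetical, not from the book:

```python
import random

def q_estimate(state, action, simulate, policy,
               n_samples=200, horizon=20, gamma=0.95, seed=0):
    """Monte Carlo estimate of Q(s, a): take the queried action first,
    then roll out a baseline policy, averaging discounted returns."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        s, a, ret, disc = state, action, 0.0, 1.0
        for _ in range(horizon):
            s, r = simulate(s, a, rng)   # sampled transition and cost/reward
            ret += disc * r
            disc *= gamma
            a = policy(s)
        total += ret
    return total / n_samples

# Toy chain MDP on states 0..4: action +1 drifts (with prob 0.8)
# toward the rewarding state 4; action -1 drifts away from it.
def simulate(s, a, rng):
    s2 = min(max(s + (a if rng.random() < 0.8 else -a), 0), 4)
    return s2, (1.0 if s2 == 4 else 0.0)

q_up = q_estimate(0, 1, simulate, policy=lambda s: 1)
q_down = q_estimate(0, -1, simulate, policy=lambda s: -1)
print(q_up > q_down)  # drifting toward state 4 is worth more
```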

  2. Capacity Analysis for Parallel Runway through Agent-Based Simulation

    Directory of Open Access Journals (Sweden)

    Yang Peng

    2013-01-01

    Full Text Available Parallel runways are the mainstream structure of Chinese hub airports; the runway is often the bottleneck of an airport, and the evaluation of its capacity is of great importance to airport management. This study outlines a model, a multiagent architecture, an implementation approach, and a software prototype of a simulation system for evaluating runway capacity. Agent Unified Modeling Language (AUML) is applied to illustrate the inbound and departure procedures of planes and to design the agent-based model. The model is evaluated experimentally, and its quality is studied in comparison with models created by SIMMOD and Arena. The results appear highly efficient, so the method can be applied to parallel runway capacity evaluation, and the model offers favorable flexibility and extensibility.

  3. Bioinfogrid:. Bioinformatics Simulation and Modeling Based on Grid

    Science.gov (United States)

    Milanesi, Luciano

    2007-12-01

    Genomics sequencing projects and new technologies applied to molecular genetics analysis are producing huge amounts of raw data. In the future, biomedical research will increasingly rely on computing Grids for data-crunching applications and on data Grids for the distributed storage of large amounts of accessible data, with tools provided to all users. Biomedical research laboratories are moving towards an environment, created through the sharing of resources, in which heterogeneous and dispersed health data are shared, such as molecular data (e.g. genomics, proteomics), cellular data (e.g. pathways), tissue data, population data (e.g. genotyping, SNP, epidemiology), as well as data generated by large-scale analyses (e.g. simulation data, modelling). In this paper some applications developed in the framework of the European project "Bioinformatics Grid Application for life science - BioinfoGRID" are described in order to show the potential of the Grid to carry out large-scale analysis and research worldwide.

  4. Three-dimensional fracture simulations based on the SDA

    Science.gov (United States)

    Feist, C.; Hofstetter, G.

    2007-02-01

    A numerical model within the framework of a non-symmetric strong discontinuity approach (SDA) suitable for fracture simulations of plain concrete is presented. The model is based on the fixed crack concept and is formulated within the framework of elements with embedded discontinuities. Discontinuity segments of individual elements are considered to form a C0-continuous surface. Continuity of the crack surface across adjacent elements is enforced by the so-called partial domain crack tracking algorithm (PDTA). The orientation of individual crack segments is derived from a non-local strain field. In the present work emphasis is put on the formulation for the three-dimensional case, and the implications for the respective algorithms are highlighted. The capabilities of the model are demonstrated by means of numerical examples.

  5. Design and simulation of a standing wave oscillator based PLL

    Institute of Scientific and Technical Information of China (English)

    Wei ZHANG; You-de HU; Li-rong ZHENG

    2016-01-01

    A standing wave oscillator (SWO) is a perfect clock source which can be used to produce a high frequency clock signal with a low skew and high reliability. However, it is difficult to tune the SWO in a wide range of frequencies. We introduce a frequency tunable SWO which uses an inversion mode metal-oxide-semiconductor (IMOS) field-effect transistor as a varactor, and give the simulation results of the frequency tuning range and power dissipation. Based on the frequency tunable SWO, a new phase locked loop (PLL) architecture is presented. This PLL can be used not only as a clock source, but also as a clock distribution network to provide high quality clock signals. The PLL achieves an approximately 50% frequency tuning range when designed in Global Foundry 65 nm 1P9M complementary metal-oxide-semiconductor (CMOS) technology, and can be used directly in a high performance multi-core microprocessor.

  6. Optimization-based Fluid Simulation on Unstructured Meshes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Bridson, Robert; Erleben, Kenny;

    We present a novel approach to fluid simulation, allowing us to take into account the surface energy in a precise manner. This new approach combines a novel, topology-adaptive approach to deformable interface tracking, called the deformable simplicial complexes method (DSC), with an optimization-based, linear finite element method for solving the incompressible Euler equations. The deformable simplicial complexes track the surface of the fluid: the fluid-air interface is represented explicitly as a piecewise linear surface which is a subset of a tetrahedralization of the space, such that the interface can also be represented implicitly as the set of faces separating tetrahedra marked as inside from the ones marked as outside. This representation introduces insignificant and controllable numerical diffusion, allows robust topological adaptivity and provides both a volumetric finite element mesh...

  7. Design and simulations for the detector based on DSSSD

    Institute of Scientific and Technical Information of China (English)

    XU Yan-Bing; ZHAO Xiao-Yun; WU Feng; WANG Huan-Yu; MENG Xiang-Cheng; WANG Hui; LU Hong; MA Yu-Qian; LI Xin-Qiao; SHI Feng; WANG Ping

    2010-01-01

    The present paper describes the design and simulation results of a position-sensitive charged particle detector based on the Double Sided Silicon Strip Detector (DSSSD). The characteristics of the DSSSD and its test results are also discussed. With the application of the DSSSD, the position-sensitive charged particle detector can not only provide particle flux and energy spectrum information and identify different types of charged particles, but also measure the location and incidence angle of particles. As the detector can make multi-parameter measurements of charged particles, it is widely used in space detection and exploration missions, such as charged particle detection related to earthquakes, space environment monitoring and solar activity inspection.

  8. Agent-based simulation of a financial market

    Science.gov (United States)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
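The mechanism described above can be caricatured in a few lines. The sketch below is not the authors' simulator but an illustrative toy: random buy and sell intents, constrained by cash and holdings, are matched pairwise at the current price (so total cash and shares are conserved, as in the paper), and the price moves with excess demand. All constants are made up.

```python
import random

def simulate_market(n_agents=50, n_steps=500, price0=100.0, seed=1):
    """Toy agent-based market: matched trades conserve cash and shares."""
    rng = random.Random(seed)
    cash = [1000.0] * n_agents
    shares = [10] * n_agents
    price, prices = price0, [price0]
    for _ in range(n_steps):
        buyers = [i for i in range(n_agents)
                  if rng.random() < 0.5 and cash[i] >= price]
        sellers = [i for i in range(n_agents)
                   if i not in buyers and shares[i] > 0 and rng.random() < 0.5]
        for b, s in zip(buyers, sellers):   # trade only matched orders
            cash[b] -= price; shares[b] += 1
            cash[s] += price; shares[s] -= 1
        excess = len(buyers) - len(sellers)
        price = max(0.01, price * (1.0 + 0.002 * excess))  # excess-demand rule
        prices.append(price)
    return prices, cash, shares

prices, cash, shares = simulate_market()
```

The paper's volatility feedback and order clustering would enter through the order-placement probabilities, which are held constant here for brevity.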

  9. Quantum-based Atomistic Simulation of Transition Metals

    Energy Technology Data Exchange (ETDEWEB)

    Moriarty, J A; Benedict, L X; Glosli, J N; Hood, R Q; Orlikowski, D A; Patel, M V; Soderlind, P; Streitz, F H; Tang, M; Yang, L H

    2005-08-29

    First-principles generalized pseudopotential theory (GPT) provides a fundamental basis for transferable multi-ion interatomic potentials in d-electron transition metals within density-functional quantum mechanics. In mid-period bcc metals, where multi-ion angular forces are important to structural properties, simplified model GPT or MGPT potentials have been developed based on canonical d bands to allow analytic forms and large-scale atomistic simulations. Robust, advanced-generation MGPT potentials have now been obtained for Ta and Mo and successfully applied to a wide range of structural, thermodynamic, defect and mechanical properties at both ambient and extreme conditions of pressure and temperature. Recent algorithm improvements have also led to a more general matrix representation of MGPT beyond canonical bands allowing increased accuracy and extension to f-electron actinide metals, an order of magnitude increase in computational speed, and the current development of temperature-dependent potentials.

  10. A review of undergraduate interprofessional simulation-based education (IPSE).

    Science.gov (United States)

    Gough, Suzanne; Hellaby, Mark; Jones, Neal; MacKinnon, Ralph

    2012-01-01

    Interprofessional simulation-based education (IPSE) is becoming an increasingly popular educational strategy worldwide within undergraduate healthcare curricula. The purpose of the literature review was to examine qualitative, quantitative and mixed/multi-method research studies featuring undergraduate IPSE. A literature review was conducted using the CINAHL, MEDLINE, and PsycINFO databases from January 1999 to September 2011 and pre-set criteria. The criteria used to screen all 120 abstracts included: (a) the article pertained to both simulation and undergraduate IPE and (b) the article reported a research study. Eighteen articles which met the pre-set criteria were included in the literature review. All studies featured outcome measures; many were purposely designed and lacked psychometric development and evaluation. Key IPSE drivers included capacity planning, preparedness for disaster management and improving patient care through the evaluation of teambuilding, teamwork skills or communication within interdisciplinary teams. Studies evaluated/explored student or teacher perspectives of learning within the context of IPSE, or both. The IPSE learning processes varied considerably in duration, fidelity and professions involved. The scenarios ranged from managing adults admitted to hospital settings and mass casualty/mock disaster patient management to the use of training wards. The majority of the articles identified common IPSE outcomes relating to increased confidence, knowledge, leadership, teamwork, and communication skills. Based on the findings of this review, the authors suggest that further multi-site, longitudinal research studies are required to provide evidence of the transferability of skills developed during IPSE and their overall impact on both undergraduate education and healthcare.

  11. Fundamental Science-Based Simulation of Nuclear Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Devanathan, Ramaswami; Gao, Fei; Sun, Xin; Khaleel, Mohammad A.

    2010-10-04

    This report presents a hierarchical multiscale modeling scheme based on two-way information exchange. To account for all essential phenomena in waste forms over geological time scales, the models have to span length scales from nanometers to kilometers and time scales from picoseconds to millennia. A single model cannot cover this wide range, and a multi-scale approach that integrates a number of different at-scale models is called for. The approach outlined here involves the integration of quantum mechanical calculations, classical molecular dynamics simulations, kinetic Monte Carlo and phase field methods at the mesoscale, and continuum models. The ultimate aim is to provide science-based input in the form of constitutive equations to integrated codes. The atomistic component of this scheme is demonstrated in the promising waste form xenotime. Density functional theory calculations have yielded valuable information about defect formation energies. These data can be used to develop interatomic potentials for molecular dynamics simulations of radiation damage. Potentials developed in the present work show a good match to the equilibrium lattice constants, elastic constants and thermal expansion of xenotime. For novel waste forms, such as xenotime, a considerable amount of the data needed to validate the models is not available. Integration of multiscale modeling with experimental work is essential to generate the missing data needed to validate the modeling scheme and the individual models. Density functional theory can also be used to fill knowledge gaps. Key challenges lie in the areas of uncertainty quantification, verification and validation, which must be performed at each level of the multiscale model and across scales. The approach used to exchange information between different levels must also be rigorously validated. The outlook for multiscale modeling of waste forms is quite promising.

  12. Airworthiness Compliance Verification Method Based on Simulation of Complex System

    Institute of Scientific and Technical Information of China (English)

    XU Haojun; LIU Dongliang; XUE Yuan; ZHOU Li; MIN Guilong

    2012-01-01

    A study is conducted on a new airworthiness compliance verification method based on pilot-aircraft-environment complex system simulation. Verification scenarios are established by a “block diagram” method based on airworthiness criteria. A pilot-aircraft-environment complex model is set up, and a virtual flight testing method based on the connection of MATLAB/Simulink and FlightGear is proposed. Special research is conducted on the modeling of stochastic pilot manipulation parameters and manipulation in critical situations. Unfavorable flight factors of a certain scenario are analyzed, and reliability modeling of important systems is researched. A distribution function for small probability events and the theory of risk probability measurement are studied. A nonlinear function is used to depict the relationship between the cumulative probability and the extremum of the critical parameter. A synthetic evaluation model is set up, a modified genetic algorithm (MGA) is applied to ascertain the distribution parameter in the model, and a more reasonable result is obtained. A clause about vehicle control functions (VCFs) verification in MIL-HDBK-516B is selected as an example to validate the practicability of the method.

  13. Agent-based modeling and simulation Part 3 : desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.

  14. Real-time retrieval for case-based reasoning in interactive multiagent-based simulations

    CERN Document Server

    De Loor, Pierre; Chevaillier, Pierre; 10.1016/j.eswa.2010.10.048

    2011-01-01

    The aim of this paper is to present the principles and results of case-based reasoning adapted to real-time interactive simulations, more precisely concerning retrieval mechanisms. The article begins by introducing the constraints involved in interactive multiagent-based simulations. The second section presents a framework stemming from case-based reasoning by autonomous agents. Each agent uses a case base of local situations and, from this base, it can choose an action in order to interact with other autonomous agents or users' avatars. We illustrate this framework with an example dedicated to the study of dynamic situations in football. We then go on to address the difficulties of conducting such simulations in real time and propose a model for cases and for the case base. Using generic agents and an adequate case base structure associated with a dedicated recall algorithm, we improve retrieval performance under time pressure compared to classic CBR techniques. We present some results relating to the perfor...
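For readers unfamiliar with CBR retrieval, the following sketch shows the baseline such work improves on: plain nearest-neighbour recall over a case base of situations. The football-like features, distance measure, and case format are invented for illustration; the paper's time-pressure-aware recall algorithm is not reproduced here.

```python
def retrieve(case_base, query, k=1):
    """Nearest-neighbour case retrieval over feature dictionaries.

    Similarity is a hypothetical Euclidean distance over the numeric
    features shared between a stored situation and the query.
    """
    def distance(a, b):
        keys = set(a) & set(b)
        return sum((a[x] - b[x]) ** 2 for x in keys) ** 0.5
    ranked = sorted(case_base, key=lambda c: distance(c["situation"], query))
    return ranked[:k]

# Two made-up football cases: situation features plus the action taken.
cases = [
    {"situation": {"ball_dist": 5.0, "opp_dist": 2.0}, "action": "pass"},
    {"situation": {"ball_dist": 1.0, "opp_dist": 8.0}, "action": "shoot"},
]
best = retrieve(cases, {"ball_dist": 1.5, "opp_dist": 7.0})[0]
```

Under real-time constraints, the cost of the sort over the whole case base is exactly what a dedicated recall algorithm and case base structure aim to avoid.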

  15. Numerical simulation of base flow with hot base bleed for two jet models

    Institute of Scientific and Technical Information of China (English)

    Wen-jie YU; Yong-gang YU; Bin NI

    2014-01-01

    In order to improve the benefits of base bleed in the base flow field, the base flow with hot base bleed for two jet models is studied. The two-dimensional axisymmetric Navier-Stokes equations are solved using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated against the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an annular jet model are investigated for injection temperatures from 830 K to 2200 K. The results show that the base pressure of the annular jet model is higher than that of the circular jet model across the range of injection parameters and injection temperatures. For the circular jet model, the hot gases are concentrated in the vicinity of the base. For the annular jet model, the bleed gases flow directly into the shear layer, so the hot gases are concentrated in the shear layer. The latter temperature distribution is more favorable for increasing the base pressure.

  16. A Simulated Multiagent-Based Architecture for Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Onashoga S. Adebukola

    2013-04-01

    Full Text Available In this work, a multiagent-based architecture for an Intrusion Detection System (MIDS) is proposed to overcome the shortcomings of current mobile-agent-based intrusion detection systems. MIDS is divided into three major phases, namely the data gathering, detection and response phases. The data gathering stage involves data collection based on the features in the distributed system, together with profiling. The data collection components are distributed on both hosts and the network. A Closed Pattern Mining (CPM) algorithm is introduced for profiling users’ activities in the network database. The CPM algorithm builds on the concept of the Frequent Pattern-growth algorithm by mining a prefix tree called the CPM-tree, which contains only the closed itemsets and their associated support counts. According to the administrator’s specified thresholds, the CPM-tree maintains only closed patterns online and incrementally outputs the current closed frequent patterns of users’ activities in real time. MIDS makes use of mobile and static agents to carry out the functions of intrusion detection. Each of these agents is built with rule-based reasoning to autonomously detect intrusions. Java 1.1.8 is chosen as the implementation language, and IBM’s Java-based mobile agent framework, Aglet 1.0.3, as the platform for running the mobile and static agents. In order to test the robustness of the system, a real-time simulation is carried out on the University of Agriculture, Abeokuta (UNAAB) network dataset, and the results showed an accuracy of 99.94%, a False Positive Rate (FPR) of 0.13% and a False Negative Rate (FNR) of 0.04%. This shows an improved performance of MIDS when compared with other known MA-IDSs.
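To make the notion of closed frequent itemsets concrete, the brute-force sketch below enumerates them directly on a made-up transaction set. The actual CPM algorithm mines a prefix tree incrementally; only the definition of its output is illustrated here.

```python
from itertools import combinations

def closed_frequent_itemsets(transactions, min_support):
    """Enumerate closed frequent itemsets by exhaustive search.

    An itemset is closed if no proper superset has the same support.
    (Illustrative only; exponential in the number of distinct items.)
    """
    items = sorted({x for t in transactions for x in t})
    support = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            s = sum(1 for t in transactions if set(combo) <= t)
            if s >= min_support:
                support[frozenset(combo)] = s
    return {iset: s for iset, s in support.items()
            if not any(iset < other and s == support[other]
                       for other in support)}

tx = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a"}]
result = closed_frequent_itemsets(tx, min_support=2)
```

An itemset is reported only if no proper superset occurs in exactly the same transactions; this compresses the frequent-itemset output without losing support information, which is what makes closed patterns attractive for online profiling.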

  17. Web-based Interactive Landform Simulation Model - Grand Canyon

    Science.gov (United States)

    Luo, W.; Pelletier, J. D.; Duffin, K.; Ormand, C. J.; Hung, W.; Iverson, E. A.; Shernoff, D.; Zhai, X.; Chowdary, A.

    2013-12-01

    Earth science educators need interactive tools to engage and enable students to better understand how Earth systems work over geologic time scales. The evolution of landforms is ripe for interactive, inquiry-based learning exercises because landforms exist all around us. The Web-based Interactive Landform Simulation Model - Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is a continuation and upgrade of the simple cellular automata (CA) rule-based model (WILSIM-CA, http://www.niu.edu/landform/) that can be accessed from anywhere with an Internet connection. Major improvements in WILSIM-GC include adopting a physically based model and the latest Java technology. The physically based model is incorporated to illustrate the fluvial processes involved in land sculpting during the development and evolution of one of the most famous landforms on Earth: the Grand Canyon. It is hoped that this focus on a famous and specific landscape will attract greater student interest and provide opportunities for students to learn not only how different processes interact to form the landform we observe today, but also how models and data are used together to enhance our understanding of the processes involved. The latest developments in Java technology (such as Java OpenGL for access to ubiquitous fast graphics hardware, Trusted Applet for file input and output, and multithreading to take advantage of modern multi-core CPUs) are incorporated into building WILSIM-GC, and active, standards-aligned curriculum materials guided by educational psychology theory on science learning will be developed to accompany the model. This project is funded by the NSF TUES program.
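The rule-based CA approach that WILSIM-GC grew out of can be illustrated on a one-dimensional elevation profile. The single rule below (relax each cell toward its downhill neighbour) and its constant are invented for illustration and are far simpler than the model's actual fluvial physics.

```python
def erode(elevation, n_steps=100, k=0.1):
    """Toy cellular-automaton incision rule on a 1-D elevation profile.

    Each cell that sits above its right-hand (downstream) neighbour is
    lowered by a fraction k of the height difference per step. The rule
    and the constant k are hypothetical, not WILSIM's actual physics.
    """
    h = list(elevation)
    for _ in range(n_steps):
        for i in range(len(h) - 1):
            if h[i] > h[i + 1]:           # material moves downhill
                h[i] -= k * (h[i] - h[i + 1])
    return h

profile = erode([10.0, 8.0, 6.0, 4.0, 2.0, 0.0])
```

Even this toy rule shows the pedagogical appeal of CA models: a local update applied repeatedly produces a smooth, concave-up profile reminiscent of a graded river.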

  18. The Geant4-Based ATLAS Fast Electromagnetic Shower Simulation

    CERN Document Server

    Barberio, E; Butler, B; Cheung, S L; Dell'Acqua, A; Di Simone, A; Ehrenfeld, W; Gallas, M V; Glasow, A; Hughes, E; Marshall, Z; Müller, J; Placakyte, R; Rimoldi, A; Savard, P; Tsulaia, V; Waugh, A; Young, C C; 10th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications

    2008-01-01

    We present a three-pronged approach to fast electromagnetic shower simulation in ATLAS. Parameterisation is used for high-energy, shower libraries for medium-energy, and an averaged energy deposition for very low-energy particles. We present a comparison between the fast simulation and full simulation in an ATLAS Monte Carlo production.

  19. Petascale Simulation Initiative Tech Base: FY2007 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    May, J; Chen, R; Jefferson, D; Leek, J; Kaplan, I; Tannahill, J

    2007-10-26

    The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. 
Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending

  20. Time efficient aeroelastic simulations based on radial basis functions

    Science.gov (United States)

    Liu, Wen; Huang, ChengDe; Yang, Guowei

    2017-02-01

    Aeroelasticity studies the interaction between aerodynamic forces and structural responses, and is one of the fundamental problems to be considered in the design of modern aircraft. The fluid-structure interpolation (FSI) and mesh deformation are two key issues in the CFD-CSD coupling approach (the partitioned approach), which is the mainstream numerical strategy in aeroelastic simulations. In this paper, a time efficient coupling scheme is developed based on the radial basis function interpolations. During the FSI process, the positive definite system of linear equations is constructed with the introduction of pseudo structural forces. The acting forces on the structural nodes can be calculated more efficiently via the solution of the linear system, avoiding the costly computations of the aerodynamic/structural coupling matrix. The multi-layer sequential mesh motion algorithm (MSM) is proposed to improve the efficiency of the volume mesh deformations, which is adequate for large-scale time dependent applications with frequent mesh updates. Two-dimensional mesh motion cases show that the MSM algorithm can reduce the computing cost significantly compared to the standard RBF-based method. The computations of the AGARD 445.6 wing flutter and the static deflections of the three-dimensional high-aspect-ratio aircraft demonstrate that the developed coupling scheme is applicable to both dynamic and static aeroelastic problems.
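The core ingredient of such coupling schemes, RBF interpolation, can be sketched compactly. The example below builds a one-dimensional Gaussian-RBF interpolant with a naive dense solver; the kernel choice, shape parameter r0, and 1-D setting are illustrative simplifications of the paper's FSI machinery.

```python
def rbf_interpolant(centers, values, r0=1.0):
    """Fit weights w solving A w = values, A_ij = phi(|x_i - x_j|),
    with a Gaussian kernel phi (an assumed choice). Returns a callable
    interpolant. Uses naive Gaussian elimination with partial pivoting."""
    import math
    phi = lambda r: math.exp(-(r / r0) ** 2)
    n = len(centers)
    A = [[phi(abs(centers[i] - centers[j])) for j in range(n)] for i in range(n)]
    b = list(values)
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r_: abs(A[r_][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, n):
            fac = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= fac * A[col][c]
            b[row] -= fac * b[col]
    w = [0.0] * n                              # back substitution
    for row in range(n - 1, -1, -1):
        s = sum(A[row][c] * w[c] for c in range(row + 1, n))
        w[row] = (b[row] - s) / A[row][row]
    return lambda x: sum(w[i] * phi(abs(x - centers[i])) for i in range(n))

f = rbf_interpolant([0.0, 1.0, 2.0], [0.0, 1.0, 4.0])
```

By construction the interpolant reproduces the data exactly at the centers; in an FSI setting the centers would be structural nodes and the evaluation points aerodynamic surface nodes, and it is the repeated solution of this dense system that efficiency-oriented schemes like the one above seek to avoid.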

  1. Simulation environment based on the Universal Verification Methodology

    Science.gov (United States)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment, so that non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  2. CrusDe: A plug-in based simulation framework for composable CRUStal DEformation simulations

    Science.gov (United States)

    Grapenthin, R.

    2008-12-01

    -in based approach of software component composition the user gets to sample the pleasures of software reuse in a simulation framework: an XML model definition allows for quick parameter adjustment and --more importantly-- straightforward exchange of model elements. The generic communication layer coupled with a plug-in mechanism furthermore allows the user to implement new plug-ins. Here CrusDe supports the reuse of existing components within these new implementations. In this way CrusDe can be adjusted to new flavors of equation 1 without the necessity to re-implement all of the new formal model. An example would be to replace the currently included flat Earth approximations by Green's functions that support spherical geometries. The advantages of the presented approach for software reuse go beyond the comforts for single users, who will spend less time on testing and validation of model formulations. Opportunities to increase the transparency of research open up, since CrusDe is freely available as open source software for Linux/Unix platforms and model formulations are compact enough to include them in publications. Hence, the possibilities for reproduction of model results are greatly enhanced. Due to its modular architecture, parts or the whole of CrusDe could become part of other projects. For example, a free repository of Green's functions could evolve, or CrusDe could be used as an isostasy module in a larger modeling context.

  3. Simulation-based Discovery of Cyclic Peptide Nanotubes

    Science.gov (United States)

    Ruiz Pestana, Luis A.

    Today, there is a growing need for environmentally friendly synthetic membranes with selective transport capabilities to address some of society's most pressing issues, such as carbon dioxide pollution, or access to clean water. While conventional membranes cannot stand up to the challenge, thin nanocomposite membranes, where vertically aligned subnanometer pores (e.g. nanotubes) are embedded in a thin polymeric film, promise to overcome some of the current limitations, namely, achieving a monodisperse distribution of subnanometer size pores, vertical pore alignment across the membrane thickness, and tunability of the pore surface chemistry. Self-assembled cyclic peptide nanotubes (CPNs), are particularly promising as selective nanopores because the pore size can be controlled at the subnanometer level, exhibit high chemical design flexibility, and display remarkable mechanical stability. In addition, when conjugated with polymer chains, the cyclic peptides can co-assemble in block copolymer domains to form nanoporous thin films. CPNs are thus well positioned to tackle persistent challenges in molecular separation applications. However, our poor understanding of the physics underlying their remarkable properties prevents the rational design and implementation of CPNs in technologically relevant membranes. In this dissertation, we use a simulation-based approach, in particular molecular dynamics (MD) simulations, to investigate the critical knowledge gaps hindering the implementation of CPNs. Computational mechanical tests show that, despite the weak nature of the stabilizing hydrogen bonds and the small cross section, CPNs display a Young's modulus of approximately 20 GPa and a maximum strength of around 1 GPa, placing them among the strongest proteinaceous materials known. 
Simulations of the self-assembly process reveal that CPNs grow by self-similar coarsening, contrary to other low-dimensional peptide systems, such as amyloids, that are believed to grow through

  4. Multiscale dynamics based on kinetic simulation of collisionless magnetic reconnection

    Science.gov (United States)

    Fujimoto, Keizo; Takamoto, Makoto

    2016-07-01

    Magnetic reconnection is a natural energy converter which allows explosive release of magnetic field energy into plasma kinetic energy. The reconnection processes inherently involve multi-scale processes. The breaking of the field lines takes place predominantly in a small region called the diffusion region formed near the x-line, while the fast plasma jets resulting from reconnection extend to a distance far beyond the ion kinetic scales from the x-line. There has been a significant gap in understanding of macro-scale and micro-scale processes. The macro-scale model of reconnection has been developed using the magnetohydrodynamics (MHD) equations, while the micro-scale processes around the x-line have been based on kinetic equations including the ion and electron inertia. The problem is that these two kinds of model have significant discrepancies. It has been believed, without any guarantee, that the microscopic model near the x-line would connect to the macroscopic model far downstream of the x-line. In order to bridge the gap between the macro- and micro-scale processes, we have performed large-scale particle-in-cell simulations with adaptive mesh refinement. The simulation results suggest that the microscopic processes around the x-line do not connect to the previous MHD model even in the region far downstream of the x-line. The slow mode shocks and the associated plasma acceleration do not appear at the exhaust boundary of kinetic reconnection. Instead, the ions are accelerated due to the Speiser motion in the current layer extending to a distance beyond the kinetic scales. The different acceleration mechanisms between the ions and electrons lead to the Hall current system in a broad area of the exhaust. Therefore, the previous MHD model could be inappropriate for collisionless magnetic reconnection. Ref. K. Fujimoto & M. Takamoto, Phys. Plasmas, 23, 012903 (2016).

  5. Action prediction based on anticipatory brain potentials during simulated driving

    Science.gov (United States)

    Khaliliardali, Zahra; Chavarriaga, Ricardo; Gheorghe, Lucian Andrei; Millán, José del R.

    2015-12-01

    Objective. The ability of an automobile to infer the driver’s upcoming actions directly from neural signals could enrich the interaction of the car with its driver. Intelligent vehicles fitted with an on-board brain-computer interface able to decode the driver’s intentions can use this information to improve the driving experience. In this study we investigate the neural signatures of anticipation of specific actions, namely braking and accelerating. Approach. We investigated anticipatory slow cortical potentials in electroencephalograms recorded from 18 healthy participants in a driving simulator using a variant of the contingent negative variation (CNV) paradigm with Go and No-go conditions: count-down numbers followed by a ‘Start’/‘Stop’ cue. We report decoding performance before the action onset using a quadratic discriminant analysis classifier based on temporal features. Main results. (i) Despite the visual and driving-related cognitive distractions, we show the presence of anticipatory event-related potentials locked to the stimuli onset, similar to the widely reported CNV signal (with an average peak value of -8 μV at electrode Cz). (ii) We demonstrate discrimination between cases requiring an action upon the subsequent imperative stimulus (Go condition, e.g. a ‘Red’ traffic light) versus events that do not require such action (No-go condition, e.g. a ‘Yellow’ light), with an average single-trial classification performance of 0.83 ± 0.13 for braking and 0.79 ± 0.12 for accelerating (area under the curve). (iii) We show that the centro-medial anticipatory potentials are observed as early as 320 ± 200 ms before the action, with a detection rate of 0.77 ± 0.12 in offline analysis. Significance. We show for the first time the feasibility of predicting the driver’s intention through decoding anticipatory related potentials during simulated car driving with high recognition rates.

  6. Silvicultural decisions based on simulation-optimization systems

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Tianjian

    2010-05-15

    Forest management is facing new challenges under climate change. By adjusting thinning regimes, conventional forest management can be adapted to various objectives of utilization of forest resources, such as wood quality, forest bioenergy, and carbon sequestration. This thesis aims to develop and apply a simulation-optimization system as a tool for an interdisciplinary understanding of the interactions between wood science, forest ecology, and forest economics. In this thesis, the OptiFor software was developed for forest resources management. The OptiFor simulation-optimization system integrated the process-based growth model PipeQual, wood quality models, biomass production and carbon emission models, as well as energy wood and commercial logging models into a single optimization model. Osyczka s direct and random search algorithm was employed to identify optimal values for a set of decision variables. The numerical studies in this thesis broadened our current knowledge and understanding of the relationships between wood science, forest ecology, and forest economics. The results for timber production show that optimal thinning regimes depend on site quality and initial stand characteristics. Taking wood properties into account, our results show that increasing the intensity of thinning resulted in lower wood density and shorter fibers. The addition of nutrients accelerated volume growth, but lowered wood quality for Norway spruce. Integrating energy wood harvesting into conventional forest management showed that conventional forest management without energy wood harvesting was still superior in sparse stands of Scots pine. Energy wood from pre-commercial thinning turned out to be optimal for dense stands. When carbon balance is taken into account, our results show that changing carbon assessment methods leads to very different optimal thinning regimes and average carbon stocks. Raising the carbon price resulted in longer rotations and a higher mean annual

  7. Simulation of Mechanical Transmissions for Base Translation of an Industrial Robot

    OpenAIRE

    Dumitru Adrian Stefan; Calin-Octavian Miclosina

    2009-01-01

    This paper presents the simulation of 2 chained mechanical transmissions used to obtain the base translation of an industrial robot: worm - worm gear transmission and motion screw - nut transmission. For simulation, CATIA V5 software was used.

  8. Tunnel Engineering Construction Schedule Analysis and Management Based on Visual Simulation

    Institute of Scientific and Technical Information of China (English)

    SONG Yang; ZHONG Wei

    2007-01-01

    The methodology of visual simulation for a tunnel engineering construction schedule is presented. Visualization of the simulation model, of the schedule calculation, and of the simulation results is realized, and construction simulation and resource optimization of the tunnel project are carried out. A risk analysis and a decision-making method for the tunnel construction schedule based on visual simulation are presented. Furthermore, using S-curve theory and schedule management methods, a real-time management and control method for tunnel construction based on visual simulation is presented. Application to tunnel engineering construction schedule analysis and management shows the feasibility and effectiveness of the method presented in this paper.

  9. Numerical simulation and combination optimization of aluminum holding furnace linings based on simulated annealing

    Institute of Scientific and Technical Information of China (English)

    Jimin Wang; Shen Lan; Tao Chen; Wenke Li; Huaqiang Chu

    2015-01-01

    To reduce heat loss and save cost, a combination decision model of reverb aluminum holding furnace linings in aluminum casting industry was established based on the economic thickness method, and was resolved using simulated annealing. Meanwhile, a three-dimensional mathematical model of aluminum holding furnace linings was developed and integrated with a user-defined heat load distribution regime model. The optimal combination was as follows: side wall with 80 mm alumino-silicate fiber felts, 232 mm diatomite brick and 116 mm chamotte brick; top wall with 50 mm clay castables, 110 mm alumino-silicate fiber felts and 200 mm refractory concrete; and bottom wall with 232 mm high-alumina brick, 60 mm clay castables and 68 mm diatomite brick. Lining temperature from high to low was successively bottom wall, side wall, and top wall. Lining temperature gradient in increasing order of magnitude was refractory layer and insulation layer. It was indicated that the results of combination optimization of aluminum holding furnace linings were valid and feasible, and their thermo-physical mechanism and cost characteristics were reasonably revealed.
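
    The combination decision idea — choose a material and thickness for each lining layer so that material cost plus a heat-loss penalty is minimized — can be sketched with a generic simulated-annealing loop. Everything below (the materials, their costs and conductivities, and the cost weighting) is an illustrative placeholder, not the paper's model:

```python
import math
import random

# Illustrative material data: name -> (cost per mm, thermal conductivity proxy).
MATERIALS = {
    "fiber_felt": (3.0, 0.10),
    "diatomite":  (1.0, 0.20),
    "chamotte":   (1.5, 0.80),
}
THICKNESS_OPTIONS = [40, 80, 120, 160, 200]  # mm, discrete choices

def cost(state):
    """Material cost plus a heat-loss penalty (inverse of thermal resistance)."""
    material_cost = sum(MATERIALS[m][0] * t for m, t in state)
    resistance = sum(t / MATERIALS[m][1] for m, t in state)
    return material_cost + 5e4 / resistance  # penalty weight is arbitrary

def neighbour(state, rng):
    """Perturb one layer: change either its thickness or its material."""
    s = list(state)
    i = rng.randrange(len(s))
    m, t = s[i]
    if rng.random() < 0.5:
        t = rng.choice(THICKNESS_OPTIONS)
    else:
        m = rng.choice(list(MATERIALS))
    s[i] = (m, t)
    return s

def anneal(n_iter=5000, t0=100.0, alpha=0.999, seed=1):
    """Standard simulated annealing with geometric cooling."""
    rng = random.Random(seed)
    state = [("diatomite", 80)] * 3          # three-layer wall as a starting point
    best, best_c = state, cost(state)
    cur_c, temp = best_c, t0
    for _ in range(n_iter):
        cand = neighbour(state, rng)
        c = cost(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if c < cur_c or rng.random() < math.exp((cur_c - c) / temp):
            state, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        temp *= alpha
    return best, best_c
```

    The paper's actual objective couples a 3-D thermal model to the economic thickness method; the loop above only shows the combinatorial search layer.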

  10. Photon dose estimation from ultraintense laser-solid interactions and shielding calculation with Monte Carlo simulation

    Science.gov (United States)

    Yang, Bo; Qiu, Rui; Li, JunLi; Lu, Wei; Wu, Zhen; Li, Chunyan

    2017-02-01

    When a strong laser beam irradiates a solid target, a hot plasma is produced and high-energy electrons are usually generated (the so-called "hot electrons"). These energetic electrons subsequently generate hard X-rays in the solid target through the Bremsstrahlung process. To date, only limited studies have been conducted on this laser-induced radiological protection issue. In this study, extensive literature reviews on the physics and properties of hot electrons have been conducted. On the basis of this information, the photon dose generated by the interaction between hot electrons and a solid target was simulated with the Monte Carlo code FLUKA. With some reasonable assumptions, the calculated dose can be regarded as the upper boundary of the experimental results over the laser intensity ranging from 10^19 to 10^21 W/cm^2. Furthermore, an equation to estimate the photon dose generated from ultraintense laser-solid interactions based on the normalized laser intensity is derived. The shielding effects of common materials including concrete and lead were also studied for the laser-driven X-ray source. The dose transmission curves and tenth-value layers (TVLs) in concrete and lead were calculated through Monte Carlo simulations. These results could be used to perform a preliminary and fast radiation safety assessment for the X-rays generated from ultraintense laser-solid interactions.
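
    The tenth-value-layer bookkeeping behind such shielding assessments is simple exponential attenuation: each TVL of material reduces the transmitted dose by a factor of ten, so D = D0 · 10^(-x/TVL). A minimal sketch (the TVL value in the example is a placeholder, not one of the values computed in the paper):

```python
import math

def transmitted_dose(d0, thickness, tvl):
    """Dose behind a shield: D = D0 * 10**(-x / TVL)."""
    return d0 * 10.0 ** (-thickness / tvl)

def required_thickness(d0, d_limit, tvl):
    """Shield thickness needed to bring dose d0 down to d_limit."""
    return tvl * math.log10(d0 / d_limit)

# Example with an assumed TVL of 33 mm (illustrative only): one TVL cuts
# 100 uSv to 10 uSv; a factor-of-1000 reduction needs three TVLs.
one_tvl = transmitted_dose(100.0, 33.0, 33.0)      # -> 10 uSv
x_needed = required_thickness(100.0, 0.1, 33.0)    # -> 99 mm (3 TVLs)
```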

  11. Component-based Discrete Event Simulation Using the Fractal Component Model

    OpenAIRE

    Dalle, Olivier

    2007-01-01

    In this paper we show that Fractal, a generic component model coming from the Component-Based Software Engineering (CBSE) community, meets most of the functional expectations identified so far in the simulation community for component-based modeling and simulation. We also demonstrate that Fractal offers additional features that have not yet been identified in the simulation community despite their potential usefulness. Eventually we describe our ongoing work on such a new simulation architec...

  12. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project has developed on the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission comprises four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the sun and earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the future best charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, which is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections for new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg

  13. Study on Visualization Simulation of Soybean Leaf Based on IFS

    Institute of Scientific and Technical Information of China (English)

    XING Lichao; SU Zhongbin; ZHENG Pin; JING Liqun

    2008-01-01

    This article applied the self-similarity principle of fractal theory to the soybean leaf, with the aid of the computer's iterative computing power. It analyzed how IFS codes are generated in an iterated function system and calculated the IFS code of a simulated soybean leaf. This basically realized a visual simulation of the soybean leaf and laid a foundation for visual simulation of the whole soybean plant.
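
    The chaos-game iteration at the heart of such IFS renderings is compact. The sketch below uses Barnsley's well-known fern coefficients as a stand-in, since the soybean-leaf IFS codes computed in the article are not reproduced here:

```python
import random

# Each IFS map is (a, b, c, d, e, f, p): apply (x, y) -> (a*x + b*y + e, c*x + d*y + f)
# with probability p. Coefficients are Barnsley's fern, used as a placeholder
# for the article's soybean-leaf codes.
FERN_IFS = [
    ( 0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
    ( 0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
    ( 0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
    (-0.15,  0.28,  0.26, 0.24, 0.0, 0.44, 0.07),
]

def ifs_points(maps, n=20000, seed=0):
    """Chaos game: repeatedly apply a randomly chosen affine map, weighted by p."""
    rng = random.Random(seed)
    weights = [m[6] for m in maps]
    x = y = 0.0
    pts = []
    for _ in range(n):
        a, b, c, d, e, f, _p = rng.choices(maps, weights=weights)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        pts.append((x, y))
    return pts
```

    Plotting the returned points (e.g. as pixels) reproduces the leaf-like attractor; substituting the article's soybean-leaf coefficients would yield the soybean shape instead.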

  14. Believability in simplifications of large scale physically based simulation

    KAUST Repository

    Han, Donghui

    2013-01-01

    We verify two hypotheses which are assumed to be true only intuitively in many rigid body simulations. I: In large scale rigid body simulation, viewers may not be able to perceive distortion incurred by an approximated simulation method. II: Fixing objects under a pile of objects does not affect the visual plausibility. Visual plausibility of scenarios simulated with these hypotheses assumed true are measured using subjective rating from viewers. As expected, analysis of results supports the truthfulness of the hypotheses under certain simulation environments. However, our analysis discovered four factors which may affect the authenticity of these hypotheses: number of collisions simulated simultaneously, homogeneity of colliding object pairs, distance from scene under simulation to camera position, and simulation method used. We also try to find an objective metric of visual plausibility from eye-tracking data collected from viewers. Analysis of these results indicates that eye-tracking does not present a suitable proxy for measuring plausibility or distinguishing between types of simulations. © 2013 ACM.

  15. A MATLAB/Simulink based GUI for the CERES Simulator

    Science.gov (United States)

    Valencia, Luis H.

    1995-01-01

    The Clouds and The Earth's Radiant Energy System (CERES) simulator will allow flight operational familiarity with the CERES instrument prior to launch. It will provide a CERES instrument simulation facility for NASA Langley Research Center, NASA Goddard Space Flight Center, and TRW. One of the objectives of building this simulator is its use as a testbed for functionality checking of atypical memory uploads and for anomaly investigation. For instance, instrument malfunction due to memory damage requires troubleshooting on a simulator to determine the nature of the problem and to find a solution.

  16. A computer-based simulator of the atmospheric turbulence

    Science.gov (United States)

    Konyaev, Petr A.

    2015-11-01

    Computer software for modeling the atmospheric turbulence is developed on the basis of a time-varying random medium simulation algorithm and a split-step Fourier transform method for solving a wave propagation equation. A judicious choice of the simulator parameters, like the velocity of the evolution and motion of the medium, turbulence spectrum and scales, enables different effects of a random medium on the optical wavefront to be simulated. The implementation of the simulation software is shown to be simple and efficient due to parallel programming functions from the MKL Intel ® Parallel Studio libraries.

  17. Brownian-motion based simulation of stochastic reaction-diffusion systems for affinity based sensors.

    Science.gov (United States)

    Tulzer, Gerhard; Heitzinger, Clemens

    2016-04-22

    In this work, we develop a 2D algorithm for stochastic reaction-diffusion systems describing the binding and unbinding of target molecules at the surfaces of affinity-based sensors. In particular, we simulate the detection of DNA oligomers using silicon-nanowire field-effect biosensors. Since these devices are uniform along the nanowire, two dimensions are sufficient to capture the kinetic effects. The model combines a stochastic ordinary differential equation for the binding and unbinding of target molecules as well as a diffusion equation for their transport in the liquid. A Brownian-motion based algorithm simulates the diffusion process, which is linked to a stochastic-simulation algorithm for association at and dissociation from the surface. The simulation data show that the shape of the cross section of the sensor yields areas with significantly different target-molecule coverage. Different initial conditions are investigated as well in order to aid rational sensor design. A comparison of the association/hybridization behavior for different receptor densities allows optimization of the functionalization setup depending on the target-molecule density.
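
    The coupling the abstract describes — Brownian transport in the liquid linked to stochastic association/dissociation at the surface — can be illustrated with a deliberately minimal 1-D random walk. All dimensions, probabilities, and particle counts here are arbitrary choices for illustration, not the paper's parameters:

```python
import random

def simulate_coverage(n_particles=200, steps=2000, height=100.0,
                      p_bind=0.5, p_unbind=0.001, seed=3):
    """Track the number of surface-bound target molecules over time."""
    rng = random.Random(seed)
    z = [rng.uniform(0.0, height) for _ in range(n_particles)]  # heights above sensor
    bound = [False] * n_particles
    history = []
    for _ in range(steps):
        for i in range(n_particles):
            if bound[i]:
                if rng.random() < p_unbind:      # dissociation from the surface
                    bound[i] = False
                    z[i] = 1.0
                continue
            z[i] += rng.choice((-1.0, 1.0))      # unit Brownian step
            if z[i] > height:                    # reflecting boundary at the top
                z[i] = height
            elif z[i] <= 0.0:                    # molecule reaches the sensor surface
                if rng.random() < p_bind:
                    bound[i] = True              # association event
                else:
                    z[i] = 1.0                   # reflected back into the liquid
        history.append(sum(bound))
    return history
```

    The paper's 2D algorithm additionally resolves the sensor cross-section geometry and receptor density, which this strip model deliberately ignores.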

  19. Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator

    Science.gov (United States)

    Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.

    2017-05-01

    Research on adaptive individual and team-based training was analyzed and showed that, both in Russia and abroad, individual and team-based training and retraining of AASTM operators usually includes: production training; training in general computer and office equipment skills; simulator training, including virtual simulators that use computers to simulate real-world manufacturing situations; and, as a rule, evaluation of AASTM operators’ knowledge as determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to training and retraining of AASTM operators provides only technical training of operators and testing of their knowledge based on assessing their actions in a simulated environment.

  20. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.
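
    The expected-information-gain objective and the two-stage (nested) Monte Carlo estimator can be demonstrated on a toy linear-Gaussian problem where the answer is known in closed form, EIG(d) = ½ ln(1 + d²/σ²). This sketch omits the polynomial chaos surrogates and stochastic approximation machinery of the paper and is only an illustration of the estimator itself:

```python
import math
import random

def log_lik(y, theta, d, sigma):
    """Gaussian log-likelihood of observation y given parameter theta at design d."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (y - d * theta)**2 / (2 * sigma**2)

def eig_nested_mc(d, sigma=1.0, n_outer=500, n_inner=1000, seed=7):
    """Two-stage Monte Carlo estimate of expected information gain at design d."""
    rng = random.Random(seed)
    prior = [rng.gauss(0.0, 1.0) for _ in range(n_inner)]   # reused inner samples
    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)
        y = d * theta + rng.gauss(0.0, sigma)               # simulated experiment
        # log p(y | theta) minus a Monte Carlo estimate of log p(y)
        evidence = sum(math.exp(log_lik(y, t, d, sigma)) for t in prior) / n_inner
        total += log_lik(y, theta, d, sigma) - math.log(evidence)
    return total / n_outer

def eig_exact(d, sigma=1.0):
    """Closed-form EIG for the linear-Gaussian toy model."""
    return 0.5 * math.log(1.0 + d**2 / sigma**2)
```

    Comparing `eig_nested_mc(d)` with `eig_exact(d)` for a few designs shows the estimator tracking the analytic curve, with the usual small positive nested-MC bias.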

  1. A simulator-based study of in-flight auscultation.

    Science.gov (United States)

    Tourtier, Jean-Pierre; Libert, Nicolas; Clapson, Patrick; Dubourdieu, Stéphane; Jost, Daniel; Tazarourte, Karim; Astaud, Cécil-Emmanuel; Debien, Bruno; Auroy, Yves

    2014-04-01

    The use of a stethoscope is essential to the delivery of continuous, supportive en route care during aeromedical evacuations. We compared the capability of 2 stethoscopes (electronic, Litmann 3000; conventional, Litmann Cardiology III) at detecting pathologic heart and lung sounds aboard a C135, a medical transport aircraft. Sounds were mimicked using the mannequin-based simulator SimMan. Five practitioners examined the mannequin during a flight, with a variety of abnormalities as follows: crackles, wheezing, right and left lung silence, as well as systolic, diastolic, and Austin-Flint murmurs. Diagnoses (correct or wrong) with the electronic and conventional stethoscopes were compared using a McNemar test. A total of 70 evaluations were performed. For cardiac sounds, diagnosis was right in 0/15 and 4/15 auscultations, respectively, with conventional and electronic stethoscopes (McNemar test, P = 0.13). For lung sounds, a right diagnosis was found with the conventional stethoscope in 10/20 auscultations versus 18/20 with the electronic stethoscope (P = 0.013). Flight practitioners involved in aeromedical evacuation on a C135 plane are better able to practice lung auscultation on a mannequin with this amplified stethoscope than with the traditional one. No benefit was found for heart sounds.
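
    The McNemar comparison used here considers only the discordant pairs (cases where exactly one stethoscope led to a correct diagnosis). An exact-binomial sketch of the test; note the abstract reports only the marginal totals (10/20 vs 18/20 for lung sounds), so the discordant counts in the usage example are hypothetical:

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact (binomial) McNemar p-value.
    b: pairs correct with method A only; c: pairs correct with method B only."""
    n = b + c
    if n == 0:
        return 1.0
    # Under H0, the smaller discordant count is Binomial(n, 1/2); double the tail.
    tail = sum(comb(n, i) for i in range(min(b, c) + 1)) / 2.0**n
    return min(1.0, 2.0 * tail)

# Hypothetical example: 8 discordant pairs, all favoring the electronic scope.
p = mcnemar_exact(0, 8)   # 2 * (1/2)**8 = 0.0078125
```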

  2. Modelling and Simulation of a Biometric Identity-Based Cryptography

    Directory of Open Access Journals (Sweden)

    Dania Aljeaid

    2014-10-01

    Government information is a vital asset that must be kept in a trusted environment and efficiently managed by authorised parties. Even though e-Government provides a number of advantages, it also introduces a range of new security risks. Sharing confidential and top-secret information in a secure manner among government sectors tends to be the main element that government agencies look for. Thus, developing an effective methodology is essential and it is a key factor for e-Government success. The proposed e-Government scheme in this paper is a combination of identity-based encryption and biometric technology. This new scheme can effectively improve the security in authentication systems, which provides a reliable identity with a high degree of assurance. This paper also demonstrates the feasibility of using finite-state machines as a formal method to analyse the proposed protocols. Finally we showed how Petri Nets could be used to simulate the communication patterns between the server and client as well as to validate the protocol functionality.

  3. Complete agent based simulation of mini-grids

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez de Durana, J.M.; Barambones, O. [Univ. of the Basque Country, Vitoria (Spain). Dept. Ingeniera de Sistemas y Automatica; Kremers, E.; Viejo, P. [Karlsruhe Univ., Karlsruhe (Germany). European Inst. for Energy Research

    2009-07-01

    Most people without access to electricity live in remote and rural areas of developing countries where micro-grids have a high potential to actively participate in the electrification of such areas. Micro-grids may be fed by electricity from different renewable and conventional decentralized sources. There are several interesting hybrid renewable energy system (HRES) micro-grid applications, such as mobile equipment; autonomous equipment; small village electricity power supply; water pumping and irrigation systems; communications power supply; and mobile health emergency clinics. This paper presented a review of some of the main types of HRESs. The paper described an agent based model of a simple example of one such system, a micro-grid, oriented to designing a decentralized supervisor control. The model was implemented using AnyLogic and was primarily intended as a tool for design, development and demonstration of micro-grid control strategies. According to simulation results, the model could be used to analyze micro-grid design, operation strategies and economic benefits. 6 refs., 6 figs.

  4. Simulation-based education for building clinical teams

    Directory of Open Access Journals (Sweden)

    Marshall Stuart

    2010-01-01

    Failure to work as an effective team is commonly cited as a cause of adverse events and errors in emergency medicine. Until recently, individual knowledge and skills in managing emergencies were taught without reference to the additional skills required to work as part of a team. Team training courses are now becoming commonplace; however, their strategies and modes of delivery are varied. Just as different delivery methods in traditional education can result in different levels of retention and transfer to the real world, the same is true in team training. As team training becomes more widespread, the effectiveness of different modes of delivery, including the role of simulation-based education, needs to be clearly understood. This review examines the basis of team working in emergency medicine, and the components of an effective emergency medical team. Lessons from other domains with more experience in team training are discussed, as well as the variations from these settings that can be observed in medical contexts. Methods and strategies for team training are listed, and experiences in other health care settings as well as emergency medicine are assessed. Finally, best practice guidelines for the development of team training programs in emergency medicine are presented.

  5. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    Science.gov (United States)

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. The discrete-time Markov chain based simulator is implemented in Matlab. The simulator, capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed to be used for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show the capability of reproducing different epidemic model behaviors successfully in a computationally efficient manner.
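
    The reduction idea — zero out transition probabilities to collapse SEQIJR into a simpler chain — can be illustrated with a plain discrete-time stochastic SIR step (i.e. the SEQIJR model with the exposure, quarantine, and isolation transitions set to zero). Parameters are illustrative, and this is a sketch of the sampling scheme, not the simulator's actual Matlab code:

```python
import random

def dtmc_sir(n=1000, i0=10, beta=0.3, gamma=0.1, steps=300, seed=11):
    """One realization of a discrete-time Markov chain SIR epidemic."""
    rng = random.Random(seed)
    s, i, r = n - i0, i0, 0
    traj = [(s, i, r)]
    for _ in range(steps):
        # Each susceptible escapes infection by all i infecteds independently.
        p_inf = 1.0 - (1.0 - beta / n) ** i
        new_i = sum(rng.random() < p_inf for _ in range(s))   # S -> I transitions
        new_r = sum(rng.random() < gamma for _ in range(i))   # I -> R transitions
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        traj.append((s, i, r))
    return traj
```

    Running the chain with different seeds gives the ensemble of stochastic trajectories on which containment (control) policies can be tested, which is the simulator's stated purpose.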

  6. Simulation of game analysis based on an agent-based artificial stock market re-examined

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This work re-examined the simulation result of game analysis (Joshi et al., 2000) based on an agent-based model, the Santa Fe Institute Artificial Stock Market. Allowing for recent research work on this artificial model, this paper's modified game simulations found that the dividend amplitude parameter is a crucial factor and that the original conclusion still holds over a not-too-long period, but only when the dividend amplitude is large enough. Our explanation of this result is that the dividend amplitude parameter is a measurement of market uncertainty. The greater the uncertainty, the greater the price volatility, and hence the risk of investing in the stock market. The greater the risk, the greater the advantage of including technical rules.

  7. Simulations for plasma spectroscopy based on UTA theory

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The unresolved transition array (UTA) simulation with configuration-average approximation is used to calculate the spectral properties of plasmas involving complex ions. This method is used to simulate the transmission of X-rays through aluminum plasma and niobium plasma, respectively. The results are compared with experiments and other results of advanced models, and good agreements are obtained.

  8. A Methodology for Simulation-based Job Performance Assessment

    Science.gov (United States)

    2008-01-01

    Job performance measurement is of critical importance to any organization’s health. It is important not only to recognize and reward good performance...methodology for developing simulations for job performance assessment. We then describe a performance assessment simulation for Light-Wheeled Vehicle

  9. An Expanded C2-Simulation Experimental Environment Based on BML

    NARCIS (Netherlands)

    Pullen, J.M.; Heffner, K.; Khimeche, L.; Schade, U.; Reus, N.M.; Mevassvik, O.M.; Alstad, A.; Gomez-Veiga, R.; Cubero, S.G.; Brook, A.

    2010-01-01

    The NATO Modeling and Simulation Group Technical Activity 48 (MSG-048) was chartered in 2006 to investigate the potential of a Coalition Battle Management Language for multinational and NATO interoperation of command and control systems with simulation systems. Its work in defining and demonstrating

  10. Knowledge-based modeling of discrete-event simulation systems

    NARCIS (Netherlands)

    H. de Swaan Arons

    1999-01-01

    textabstractModeling a simulation system requires a great deal of customization. At first sight no system seems to resemble exactly another system and every time a new model has to be designed the modeler has to start from scratch. The present simulation languages provide the modeler with powerful

  12. A Simulation Based on Goldratt's Matchstick/Die Game

    Science.gov (United States)

    Martin, Clarence H.

    2007-01-01

    This teaching brief presents a Microsoft® Excel simulation designed to complement and expand upon the well-known matchstick/die game introduced by Goldratt in "The Goal." This simulation performs 100 replications of a 40-period processing run for low, medium, and high levels of process variation and displays the comparative results…
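
The mechanics of the matchstick/die game are easy to reproduce outside a spreadsheet. Below is a sketch in Python (not the author's workbook) of a serial line where each station's per-period capacity is a die roll and actual flow is limited by upstream inventory; the die ranges are illustrative stand-ins for the low, medium, and high variation levels, all with the same mean roll of 3.5:

```python
import random
import statistics

def run_line(periods=40, stations=5, die=(1, 6), rng=None):
    """One replication of the matchstick/die game: a serial line in which each
    station can move at most its die roll per period, and never more than the
    inventory waiting in front of it."""
    rng = rng or random.Random()
    wip = [0] * (stations - 1)        # buffers between consecutive stations
    produced = 0
    for _ in range(periods):
        moved = rng.randint(*die)     # station 1 draws from unlimited raw stock
        for buf in range(stations - 1):
            wip[buf] += moved
            moved = min(rng.randint(*die), wip[buf])
            wip[buf] -= moved
        produced += moved             # whatever leaves the last station
    return produced

def mean_throughput(die, reps=100, seed=42):
    rng = random.Random(seed)
    return statistics.mean(run_line(die=die, rng=rng) for _ in range(reps))

# Same mean capacity (3.5) at every level; only the spread differs.
results = {label: mean_throughput(die)
           for label, die in [("low", (3, 4)), ("medium", (2, 5)), ("high", (1, 6))]}
```

Despite identical mean capacity, mean 40-period throughput drops as the variation grows, which is the point of Goldratt's exercise: dependent events amplify statistical fluctuations.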

  13. Development of a Robotics-based Satellites Docking Simulator

    NARCIS (Netherlands)

    Zebenay, M.

    2014-01-01

    The European Proximity Operation Simulator (EPOS) is a hardware-in-the-loop (HIL) system aiming, among other objectives, at emulating on-orbit docking of spacecraft for verification and validation of the docking phase. This HIL docking simulator set-up essentially consists of docking interfaces, sim

  14. Simulation Based Acquisition for NASA's Office of Exploration Systems

    Science.gov (United States)

    Hale, Joe

    2004-01-01

    In January 2004, President George W. Bush unveiled his vision for NASA to advance U.S. scientific, security, and economic interests through a robust space exploration program. This vision includes the goal to extend human presence across the solar system, starting with a human return to the Moon no later than 2020, in preparation for human exploration of Mars and other destinations. In response to this vision, NASA has created the Office of Exploration Systems (OExS) to develop the innovative technologies, knowledge, and infrastructures to explore and support decisions about human exploration destinations, including the development of a new Crew Exploration Vehicle (CEV). Within the OExS organization, NASA is implementing Simulation Based Acquisition (SBA), a robust Modeling & Simulation (M&S) environment integrated across all acquisition phases and programs/teams, to make the realization of the President's vision more certain. Executed properly, SBA will foster better informed, timelier, and more defensible decisions throughout the acquisition life cycle. By doing so, SBA will improve the quality of NASA systems and speed their development, at less cost and risk than would otherwise be the case. SBA is a comprehensive, Enterprise-wide endeavor that necessitates an evolved culture, a revised spiral acquisition process, and an infrastructure of advanced Information Technology (IT) capabilities. SBA encompasses all project phases (from requirements analysis and concept formulation through design, manufacture, training, and operations), professional disciplines, and activities that can benefit from employing SBA capabilities. SBA capabilities include: developing and assessing system concepts and designs; planning manufacturing, assembly, transport, and launch; training crews, maintainers, launch personnel, and controllers; planning and monitoring missions; responding to emergencies by evaluating effects and exploring solutions; and communicating across the OEx

  15. Measurement of LET (linear energy transfer) spectra using CR-39 at different depths of water irradiated by 171 MeV protons: A comparison with Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Sahoo, G.S. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Tripathy, S.P., E-mail: sam.tripathy@gmail.com [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Homi Bhabha National Institute, Mumbai 400094 (India); Molokanov, A.G.; Aleynikov, V.E. [Joint Institute for Nuclear Research, Dubna 141980 (Russian Federation); Sharma, S.D. [Homi Bhabha National Institute, Mumbai 400094 (India); Radiological Physics & Advisory Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Bandyopadhyay, T. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Homi Bhabha National Institute, Mumbai 400094 (India)

    2016-05-11

    In this work, we have used CR-39 detectors to estimate the LET (linear energy transfer) spectrum of secondary particles due to a 171 MeV proton beam at different depths of water, including the Bragg peak region. The measured LET spectra were compared with those obtained from FLUKA Monte Carlo simulation. The absorbed dose (D{sub LET}) and dose equivalent (H{sub LET}) were estimated using the LET spectra. The values of D{sub LET} and H{sub LET} per incident proton fluence were found to increase with the depth of water and were maximum at the Bragg peak. - Highlights: • Measurement of LET spectra using CR-39 detectors at different depths of water. • Comparison of measured spectra with FLUKA Monte Carlo simulation. • Absorbed dose and dose equivalent were found to increase with depth of water.
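
The conversion from a measured LET-fluence spectrum to D{sub LET} and H{sub LET} follows the standard definitions: absorbed dose is the fluence-weighted sum of LET over the spectrum bins, and dose equivalent additionally weights each bin by the ICRP 60 quality factor Q(L). A minimal sketch of that bookkeeping (the bin layout and values are illustrative, not the paper's data):

```python
def quality_factor(let):
    """ICRP 60 quality factor Q(L); `let` is the LET in keV/µm in water."""
    if let < 10:
        return 1.0
    if let <= 100:
        return 0.32 * let - 2.2
    return 300.0 / let ** 0.5

def doses(spectrum):
    """spectrum: iterable of (LET in keV/µm, fluence in cm^-2) bins.
    Returns (absorbed dose D in Gy, dose equivalent H in Sv) for unit-density
    water: 1 keV/µm times 1 particle/cm^2 deposits 1.602e-9 Gy."""
    to_gy = 1.602e-9
    d = sum(let * phi for let, phi in spectrum) * to_gy
    h = sum(quality_factor(let) * let * phi for let, phi in spectrum) * to_gy
    return d, h
```

Because Q(L) grows with LET up to 100 keV/µm, H exceeds D wherever the spectrum has a high-LET component, which is why both quantities peak at the Bragg peak.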

  16. Comparison of Flight Simulators Based on Human Motion Perception Metrics

    Science.gov (United States)

    Valente Pais, Ana R.; Correia Gracio, Bruno J.; Kelly, Lon C.; Houck, Jacob A.

    2015-01-01

    In flight simulation, motion filters are used to transform aircraft motion into simulator motion. When looking for the best match between visual and inertial amplitude in a simulator, researchers have found that there is a range of inertial amplitudes, rather than a single inertial value, that is perceived by subjects as optimal. This zone, hereafter referred to as the optimal zone, seems to correlate to the perceptual coherence zones measured in flight simulators. However, no studies were found in which these two zones were compared. This study investigates the relation between the optimal and the coherence zone measurements within and between different simulators. Results show that for the sway axis, the optimal zone lies within the lower part of the coherence zone. In addition, it was found that, whereas the width of the coherence zone depends on the visual amplitude and frequency, the width of the optimal zone remains constant.

  17. Research and Implementation of Distributed Virtual Simulation Platform Based on Components

    Institute of Scientific and Technical Information of China (English)

    SUN Zhi-xin; WANG Ru-chuan; WANG Shao-di

    2004-01-01

    This paper proposes a combination of systems-theoretic simulation methodology with virtual reality technology as the basis for a component-based virtual simulation framework. The created universal framework can be used in different fields, such as driver training, air-combat training, and so on. The result of the synergism is a powerful component-based virtual simulation framework. After briefly introducing the concepts and principles of distributed component objects, the paper describes a software development method based on components. Then a method of virtual simulation system modeling based on components is proposed, and the integrated framework supporting distributed virtual simulation and its key technologies are discussed at length. Our experiments indicate that the framework can be widely used in simulation fields such as arms antagonism, driving simulation and so on.

  18. Simulation-Based Performance Evaluation of Predictive-Hashing Based Multicast Authentication Protocol

    Directory of Open Access Journals (Sweden)

    Seonho Choi

    2012-12-01

    Full Text Available A predictive-hashing based Denial-of-Service (DoS) resistant multicast authentication protocol was proposed based upon predictive-hashing, one-way key chain, erasure codes, and distillation codes techniques [4, 5]. It was claimed that this new scheme should be more resistant to various types of DoS attacks, and its worst-case resource requirements were derived in terms of coarse-level system parameters including CPU times for signature verification and erasure/distillation decoding operations, attack levels, etc. To show the effectiveness of our approach and to analyze exact resource requirements in various attack scenarios with different parameter settings, we designed and implemented an attack simulator which is platform-independent. Various attack scenarios may be created with different attack types and parameters against a receiver equipped with the predictive-hashing based protocol. The design of the simulator is explained, and the simulation results are presented with detailed resource usage statistics. In addition, resistance level to various types of DoS attacks is formulated with a newly defined resistance metric. By comparing these results to those from another approach, PRABS [8], we show that the resistance level of our protocol is greatly enhanced even in the presence of many attack streams.
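
One ingredient named in the abstract, the one-way key chain, can be sketched independently of the full protocol. In this sketch (the function names are mine, not from [4, 5]), the sender commits to the final element of a hash chain and later discloses earlier pre-images; a receiver authenticates any disclosed key by hashing it back down to the commitment:

```python
import hashlib
import os

def make_key_chain(length, seed=None):
    """Build a one-way key chain with SHA-256. After reversing, chain[0] is the
    public commitment and chain[i] is the i-th key to be disclosed; each
    chain[i] hashes down to chain[0] in exactly i steps."""
    key = seed if seed is not None else os.urandom(32)
    chain = [key]
    for _ in range(length):
        key = hashlib.sha256(key).digest()
        chain.append(key)
    chain.reverse()
    return chain

def verify_key(commitment, key, index):
    """Check that `key`, claimed to be chain[index], hashes to the commitment."""
    for _ in range(index):
        key = hashlib.sha256(key).digest()
    return key == commitment
```

Forging a future key requires inverting the hash, which is what makes disclosed keys cheap to verify but expensive for a DoS attacker to counterfeit.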

  19. Contribution to SER Prediction: A New Metric Based on RC Transient Simulations

    Science.gov (United States)

    Micolau, G.; Castellani-Coulie, K.; Aziza, H.; Portal, J.-M.

    2012-08-01

    This work focuses on speeding up simulation time of SEU systematic detection in a 90 nm SRAM cell. Simulations were run in order to validate a simplified approach based on the injection of a noise source current at the sensitive node of an analytical RC circuit. Moreover, a new SEU reliability metric, mandatory for reliability studies, is introduced. It is based on transient I-V simulations.
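
The simplified approach described, injecting a noise current at the sensitive node of an RC circuit, can be illustrated with a few lines of explicit integration. All component values and the double-exponential pulse shape below are illustrative assumptions, not taken from the paper:

```python
import math

def seu_transient(c=1e-15, r=50e3, q=2e-15, tau_f=2e-10, tau_r=5e-11,
                  dt=1e-12, t_end=5e-9):
    """Explicit-Euler integration of C dV/dt = I_inj(t) - V/R for a node hit
    by a double-exponential current pulse carrying total charge q (a common
    analytic stand-in for the particle-strike photocurrent). Returns the peak
    of the node-voltage transient."""
    i0 = q / (tau_f - tau_r)          # normalization: the pulse integrates to q
    v = t = v_peak = 0.0
    while t < t_end:
        i_inj = i0 * (math.exp(-t / tau_f) - math.exp(-t / tau_r))
        v += dt * (i_inj - v / r) / c
        v_peak = max(v_peak, v)
        t += dt
    return v_peak
```

Sweeping the collected charge q and comparing the resulting peak against the cell's flip threshold yields the critical-charge style curves that such an RC-based metric builds on, at a fraction of the cost of full transistor-level SEU injection.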

  20. Serious games experiment toward agent-based simulation

    Science.gov (United States)

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  1. Simulation based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  2. Computer-Based versus High-Fidelity Mannequin Simulation in Developing Clinical Judgment in Nursing Education

    Science.gov (United States)

    Howard, Beverly J.

    2013-01-01

    The purpose of this study was to determine if students learn clinical judgment as effectively using computer-based simulations as when using high-fidelity mannequin simulations. There was a single research questions for this study: What is the difference in clinical judgment between participants completing high-fidelity human simulator mannequin…

  3. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    Science.gov (United States)

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  4. Fidelity considerations for simulation-based usability assessments of mobile ICT for hospitals

    DEFF Research Database (Denmark)

    Dahl, Yngve; Alsos, Ole A; Svanæs, Dag

    2010-01-01

    training simulation fidelity theories. Based on a review of the training simulation literature, a set of fidelity dimensions through which training simulations are often adjusted to meet specific goals are identified. It is argued that the same mechanisms can be used in usability assessments of mobile ICT...

  5. Web-Based Distributed Simulation of Aeronautical Propulsion System

    Science.gov (United States)

    Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac

    2001-01-01

    An application was developed to allow users to run and view the Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple INFORMATION POWER GRID (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed by JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.

  6. REVIEW OF FLEXIBLE MANUFACTURING SYSTEM BASED ON MODELING AND SIMULATION

    Directory of Open Access Journals (Sweden)

    SAREN Sanjib Kumar

    2016-05-01

    Full Text Available This paper presents a literature survey of the use of simulation tools and their methodology for flexible manufacturing system design and operation problems; such tools have been widely used for manufacturing system design and analysis. Over this period, simulation has proven to be an extremely useful analysis and optimization tool, and many articles, papers, and conferences have focused directly on the topic. The paper surveys the use of simulation tools and their methodology in flexible manufacturing systems over the period 1982 to 2015.

  7. The Future of Visual BASIC.NET Based Simulation for Industrial Automation: Distributed Control Systems

    Science.gov (United States)

    Tun, Hla Myo

    2008-10-01

    Visual Basic.net (VB.net) based simulation presents a unique opportunity for revolutionary changes in the process of developing simulation models and in the mission of the simulation software firms that provide tools to support the model development process. VB.net enables a new version of a simulation industry populated by application-specific simulation specialists who generate compatible and reusable simulation components. These object-oriented components can be developed using inexpensive and professional quality VB.net development environments. This discussion is an overview of the features and future benefits of VB.net based simulation. It is targeted at experienced simulation practitioners who understand the limitations of existing tools and the need for object-oriented, standardized and reusable modeling software.

  8. Simulation Platform for Wireless Sensor Networks Based on Impulse Radio Ultra Wide Band

    CERN Document Server

    Berthe, Abdoulaye; Dragomirescu, Daniela; Plana, Robert

    2010-01-01

    Impulse Radio Ultra Wide Band (IR-UWB) is a promising technology to address Wireless Sensor Network (WSN) constraints. However, existing network simulation tools do not provide a complete WSN simulation architecture with the IR-UWB specificities at the PHYsical (PHY) and the Medium Access Control (MAC) layers. In this paper, we propose a WSN simulation architecture based on the IR-UWB technique. At the PHY layer, we take into account pulse collision by dealing with the pulse propagation delay. We also modelled MAC protocols specific to IR-UWB for WSN applications. To completely fit the WSN simulation requirements, we propose a generic and reusable sensor and sensing channel model. Most of the WSN application performances can be evaluated thanks to the proposed simulation architecture. The proposed models are implemented on a scalable and well known network simulator: Global Mobile Information System Simulator (GloMoSim). However, they can be reused for all other packet-based simulation platforms.

  9. AN AGENT-BASED SIMULATION ON MARKET CONSIDERING KNOWLEDGE TRANSITION AND SOCIAL IMPACT

    Institute of Scientific and Technical Information of China (English)

    Tieju Ma; Mina Ryoke; Yoshiteru Nakamori

    2002-01-01

    In this paper, an agent-based simulation about knowledge transition associated with social impact in market is introduced. In the simulation, the genetic algorithm is used to generate the next generation products and a dynamic social impact model is used to simulate how customers are influenced by other customers. The simulation and its results not only show some features and patterns of knowledge transition, but also explore and display some phenomena of business cultures. On the basis of the innovation model of knowledge-based economy, the transition between technical knowledge and products knowledge is discussed, and a fuzzy linear quantification model which can be used to simulate the transition is introduced.

  10. Simulation and measurement of the radiation field of the 1.4-GeV electron beam dump of the FERMI free-electron laser.

    Science.gov (United States)

    Fröhlich, Lars; Casarin, Katia; Vascotto, Alessandro

    2015-02-01

    The authors examine the radiation field produced in the vicinity of the main beam dump of the FERMI free-electron laser under the impact of a 1.4-GeV electron beam. Electromagnetic and neutron dose rates are calculated with the Fluka Monte Carlo code and compared with ionisation chamber and superheated drop detector measurements in various positions around the dump. Experimental data and simulation results are in good agreement with a maximum deviation of 25 % in a single location.

  11. CPR-based next-generation multiscale simulators

    NARCIS (Netherlands)

    Cusini, M.; Lukyanov, A.; Natvig, J.; Hajibeygi, H.

    2014-01-01

    Unconventional Reservoir simulations involve several challenges not only arising from geological heterogeneities, but also from strong nonlinear physical coupling terms. All existing upscaling and multiscale methods rely on a classical sequential formulation to treat the coupling between the

  12. simulation based analysis on the effects of orientation on energy ...

    African Journals Online (AJOL)

    DEPT OF AGRICULTURAL ENGINEERING

    and simulated. Alternate north angles were used and results compared for the best orientation ... In warm and humid regions, solar radiation is diffused ... In this regard, the exact solar orientation .... tongue and groove panel system on a timber.

  13. SDG-based Model Validation in Chemical Process Simulation

    Institute of Scientific and Technical Information of China (English)

    张贝克; 许欣; 马昕; 吴重光

    2013-01-01

    Signed directed graph (SDG) theory provides algorithms and methods that can be applied directly to chemical process modeling and analysis to validate simulation models, and is a basis for the development of a software environment that can automate the validation activity. This paper is concentrated on the pretreatment of the model validation. We use the validation scenarios and standard sequences generated by a well-established SDG model to validate the trends fitted from the simulation model. The results are helpful to find potential problems, assess possible bugs in the simulation model and solve the problem effectively. A case study on a simulation model of a boiler is presented to demonstrate the effectiveness of this method.

  14. REVIEW OF FLEXIBLE MANUFACTURING SYSTEM BASED ON MODELING AND SIMULATION

    National Research Council Canada - National Science Library

    SAREN, Sanjib Kumar; TIBERIU, Vesselenyi

    2016-01-01

    This paper focused on the literature survey of the use of flexible manufacturing system design and operation problems on the basis of simulation tools and their methodology which has been widely used...

  15. An Investigation of Wavelet Bases for Grid-Based Multi-Scale Simulations Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Baty, R.S.; Burns, S.P.; Christon, M.A.; Roach, D.W.; Trucano, T.G.; Voth, T.E.; Weatherby, J.R.; Womble, D.E.

    1998-11-01

    The research summarized in this report is the result of a two-year effort that has focused on evaluating the viability of wavelet bases for the solution of partial differential equations. The primary objective for this work has been to establish a foundation for hierarchical/wavelet simulation methods based upon numerical performance, computational efficiency, and the ability to exploit the hierarchical adaptive nature of wavelets. This work has demonstrated that hierarchical bases can be effective for problems with a dominant elliptic character. However, the strict enforcement of orthogonality was found to be less desirable than weaker semi-orthogonality or bi-orthogonality for solving partial differential equations. This conclusion has led to the development of a multi-scale linear finite element based on a hierarchical change of basis. The reproducing kernel particle method has been found to yield extremely accurate phase characteristics for hyperbolic problems while providing a convenient framework for multi-scale analyses.

  16. Making the Case for Simulation-Based Assessments to Overcome the Challenges in Evaluating Clinical Competency.

    Science.gov (United States)

    Leigh, Gwen; Stueben, Frances; Harrington, Deedra; Hetherman, Stephen

    2016-05-13

    The use of simulation in nursing has increased substantially in the last few decades. Most schools of nursing have incorporated simulation into their curriculum but few are using simulation to evaluate clinical competency at the end of a semester or prior to graduation. Using simulation for such high-stakes evaluation is somewhat novel to nursing. Educators are now being challenged to move simulation to the next level and use it as a tool for evaluating clinical competency. Can the use of simulation for high-stakes evaluation add to or improve our current evaluation methods? Using patient simulation for evaluation in contrast to a teaching modality has important differences that must be considered. This article discusses the difficulties of evaluating clinical competency, and makes the case for using simulation-based assessment as a method of high-stakes evaluation. Using simulation for high-stakes evaluation has the potential for significantly impacting nursing education.

  17. A new PET prototype for proton therapy: comparison of data and Monte Carlo simulations

    Science.gov (United States)

    Rosso, V.; Battistoni, G.; Belcari, N.; Camarlinghi, N.; Ferrari, A.; Ferretti, S.; Kraan, A.; Mairani, A.; Marino, N.; Ortuño, J. E.; Pullia, M.; Sala, P.; Santos, A.; Sportelli, G.; Straub, K.; Del Guerra, A.

    2013-03-01

    Ion beam therapy is a valuable method for the treatment of deep-seated and radio-resistant tumors thanks to the favorable depth-dose distribution characterized by the Bragg peak. Hadrontherapy facilities take advantage of the specific ion range, resulting in a highly conformal dose in the target volume, while the dose in critical organs is reduced as compared to photon therapy. The necessity to monitor the delivery precision, i.e. the ion range, is unquestionable, thus different approaches have been investigated, such as the detection of prompt photons or annihilation photons of positron emitter nuclei created during the therapeutic treatment. Based on the measurement of the induced β+ activity, our group has developed various in-beam PET prototypes: the one under test is composed by two planar detector heads, each one consisting of four modules with a total active area of 10 × 10 cm2. A single detector module is made of a LYSO crystal matrix coupled to a position sensitive photomultiplier and is read-out by dedicated frontend electronics. A preliminary data taking was performed at the Italian National Centre for Oncological Hadron Therapy (CNAO, Pavia), using proton beams in the energy range of 93-112 MeV impinging on a plastic phantom. The measured activity profiles are presented and compared with the simulated ones based on the Monte Carlo FLUKA package.

  18. Adaptive Simulated Annealing Based Protein Loop Modeling of Neurotoxins

    Institute of Scientific and Technical Information of China (English)

    陈杰; 黄丽娜; 彭志红

    2003-01-01

    A loop modeling method, adaptive simulated annealing, is proposed for ab initio prediction of protein loop structures, treated as an optimization problem of searching for the global minimum of a given energy function. An interface-friendly toolbox, LoopModeller, developed for Windows and Linux systems in VC++ and OpenGL environments, is provided for analysis and visualization. Simulation results for three short-chain neurotoxins modeled by LoopModeller show that the proposed method is fast and efficient.
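
As a generic illustration of the optimization scheme named above (not the paper's energy function or its adaptive schedule), a plain simulated-annealing skeleton looks like this; the toy one-dimensional energy stands in for a loop-conformation energy surface:

```python
import math
import random

def simulated_annealing(energy, neighbour, x0, t0=1.0, cooling=0.995,
                        steps=20000, rng=None):
    """Plain simulated-annealing skeleton: always accept downhill moves,
    accept uphill moves with probability exp(-dE/T), and lower T geometrically.
    (An adaptive variant would also tune step sizes and the schedule online.)"""
    rng = rng or random.Random(0)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = neighbour(x, rng)
        de = energy(cand) - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = cand, e + de
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x, best_e

# Toy 1-D stand-in for a loop-conformation energy surface (two wells).
def toy_energy(x):
    return (x * x - 1.0) ** 2 + 0.3 * x

def toy_move(x, rng):
    return x + rng.uniform(-0.5, 0.5)

x_best, e_best = simulated_annealing(toy_energy, toy_move, x0=3.0)
```

The uphill-acceptance term is what lets the search escape local minima early on, which is exactly the difficulty ab initio loop prediction faces in a rugged energy landscape.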

  19. Sampling of general correlators in worm-algorithm based simulations

    Science.gov (United States)

    Rindlisbacher, Tobias; Åkerlund, Oscar; de Forcrand, Philippe

    2016-08-01

    Using the complex ϕ4-model as a prototype for a system which is simulated by a worm algorithm, we show that not only the charged correlator ⟨ϕ*(x)ϕ(y)⟩, but also more general correlators such as ⟨|ϕ(x)||ϕ(y)|⟩ or ⟨arg(ϕ(x)) arg(ϕ(y))⟩, as well as condensates like ⟨|ϕ|⟩, can be measured at every step of the Monte Carlo evolution of the worm instead of on closed-worm configurations only. The method generalizes straightforwardly to other systems simulated by worms, such as spin or sigma models.

  20. Sampling of General Correlators in Worm Algorithm-based Simulations

    CERN Document Server

    Rindlisbacher, Tobias; de Forcrand, Philippe

    2016-01-01

    Using the complex $\\phi^4$-model as a prototype for a system which is simulated by a (bosonic) worm algorithm, we show that not only the charged correlator $\\langle\\phi^*(x)\\phi(y)\\rangle$, but also more general correlators such as $\\langle|\\phi(x)||\\phi(y)|\\rangle$ or $\\langle\\arg(\\phi(x))\\arg(\\phi(y))\\rangle$ as well as condensates like $\\langle|\\phi|\\rangle$ can be measured at every step of the Monte Carlo evolution of the worm instead of on closed-worm configurations only. The method generalizes straightforwardly to other systems simulated by (bosonic) worms, such as spin or sigma models.

  1. Investigation of Diesel Engine Performance Based on Simulation

    OpenAIRE

    Semin; Rosli A. Bakar; Abdul R. Ismail

    2008-01-01

    The single-cylinder modeling and simulation of a four-stroke direct-injection diesel engine requires the use of advanced analysis and development tools to evaluate the performance of the diesel engine model. The simulation and computational development of the model in this research uses the commercial GT-SUITE 6.2 software. In this research, a one-dimensional model of a single-cylinder four-stroke direct-injection diesel engine was developed. The analysis of the model is combustion performanc...

  2. Internet Based General Computer Simulation Platform for Distributed Multi-Robotic System

    Institute of Scientific and Technical Information of China (English)

    迟艳玲; 张斌; 王硕; 谭民

    2002-01-01

    A general computer simulation platform is designed for the purpose of carrying out experiments on the Distributed Multi-Robotic System. The simulation platform is based on the Internet and possesses generality, validity, real-time display, and support for algorithm development. In addition, the platform is equipped with a recording and replay module, so simulation experiments can be reviewed at any time. By now, a few algorithms have been developed on the simulation platform.

  3. Design and Realization of Avionics Integration Simulation System Based on RTX

    OpenAIRE

    2016-01-01

    As aircraft avionics systems become more and more complicated, it is hard to test and verify real avionics systems. A design and realization method for an avionics integration simulation system based on RTX was brought forward to resolve the problem. In this simulation system, computer software and hardware resources are utilized fully. All kinds of aircraft avionics system HIL (hardware-in-loop) simulations can be implemented on this platform. The simulation method provided the technical f...

  4. Learning curves and long-term outcome of simulation-based thoracentesis training for medical students

    Directory of Open Access Journals (Sweden)

    Chen Kezhong

    2011-06-01

    Full Text Available Abstract Background Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis requires further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, study skills retention and transfer of knowledge to a clinical setting following simulation-based education intervention in thoracentesis procedures. Methods Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participant's performance was assessed by performance score (PS), procedure time (PT), and participant's confidence (PC). Learning curves for each variable were generated. Long-term outcome of the training was measured by retesting and clinical performance evaluation 6 months and 1 year, respectively, after initial training on the simulator. Results Significant improvements in PS, PT, and PC were noted among the first 3 to 4 test trials, with no further significant improvement thereafter. Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first-year medical residents without such experience. Conclusions This study demonstrates that simulation-based thoracentesis training can significantly improve an individual's performance. The saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice.

  5. Neutron fluence in antiproton radiotherapy, measurements and simulations

    DEFF Research Database (Denmark)

    Bassler, Niels; Holzscheiter, Michael H.; Petersen, Jørgen B.B.

    2010-01-01

    A significant part of the secondary particle spectrum from antiproton annihilation consists of fast neutrons, which may contribute to a significant dose background found outside the primary beam. Using a polystyrene phantom as a moderator, we have performed absolute measurements of the thermalized part of the fast neutron spectrum using Lithium-6 and -7 Fluoride TLD pairs. The experimental results are found to be in good agreement with simulations using the Monte Carlo particle transport code FLUKA. The thermal neutron kerma resulting from the measured thermal neutron fluence is insignificant...

  6. Simulation of induced radioactivity for Heavy Ion Medical Machine

    CERN Document Server

    Jun-Kui, Xu; Wu-Yuan, Li; Wang, Mao; Jia-Wen, Xia; Xi-Meng, Chen; Wei-Wei, Yan; Chong, Xu

    2013-01-01

    For radiation protection and environmental impact assessment purposes, the radioactivity induced by the carbon-ion beam of the Heavy Ion Medical Machine (HIMM) was studied. Radionuclides in accelerator components, cooling water, and air at the target area, induced by the primary beam and secondary particles, were simulated with the FLUKA Monte Carlo code. It is found that the radioactivity in cooling water and air is not very important at the beam intensity and energy required for treatment. Radionuclides in accelerator components, however, may cause problems for maintenance work, so a suitable cooling time is needed after the machine is shut down.

  7. Guidance Provided by Teacher and Simulation for Inquiry-Based Learning: a Case Study

    Science.gov (United States)

    Lehtinen, Antti; Viiri, Jouni

    2017-04-01

    Current research indicates that inquiry-based learning should be guided in order to achieve optimal learning outcomes. The need for guidance is even greater when simulations are used because of their high information content and the difficulty of extracting information from them. Previous research on guidance for learning with simulations has concentrated on guidance provided by the simulation. Little research has been done on the role of the teacher in guiding learners with inquiry-based activities using simulations. This descriptive study focuses on guidance provided during small group investigations; pre-service teachers ( n = 8) guided third and fifth graders using a particular simulation. Data was collected using screen capture videos. The data was analyzed using a combination of theory- and data-driven analysis. Forms of guidance provided by the simulation and by the teachers were divided into the same categories. The distribution of the guidance between the teacher and the simulation was also analyzed. The categories for forms of guidance provided by simulations proved to be applicable to guidance provided by the teachers as well. Teachers offered more various forms of guidance than the simulation. The teachers adapted their guidance and used different patterns to complement the guidance provided by the simulation. The results of the study show that guidance provided by teachers and simulations have different affordances, and both should be present in the classroom for optimal support of learning. This has implications for both teaching with simulations and development of new simulations.

  9. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    Science.gov (United States)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method to automatically deploy simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault tolerant simulation.

  10. Hadron therapy physics and simulations

    CERN Document Server

    d’Ávila Nunes, Marcos

    2014-01-01

    This brief provides an in-depth overview of the physics of hadron therapy, ranging from the history to the latest contributions to the subject. It covers the mechanisms of protons and carbon ions at the molecular level (DNA breaks and proteins 53BP1 and RPA), the physics and mathematics of accelerators (Cyclotron and Synchrotron), microdosimetry measurements (with new results so far achieved), and Monte Carlo simulations in hadron therapy using FLUKA (CERN) and MCHIT (FIAS) software. The text also includes information about proton therapy centers and carbon ion centers (PTCOG), as well as a comparison and discussion of both techniques in treatment planning and radiation monitoring. This brief is suitable for newcomers to medical physics as well as seasoned specialists in radiation oncology.

  11. Dynamic Multicast Grouping Approach in HLA-Based Distributed Interactive Simulation

    Institute of Scientific and Technical Information of China (English)

    DAI Zhong-jian; HOU Chao-zhen

    2005-01-01

    In order to improve the efficiency of the data distribution management service in distributed interactive simulation based on the high level architecture (HLA), and to reduce network traffic and save system resources, approaches to multicast grouping in HLA-based distributed interactive simulation are discussed. A new dynamic multicast grouping approach is then proposed, based on the current publication and subscription regions during the simulation. Simulation experiments show that this approach can significantly reduce message overhead while using fewer multicast groups.

  12. SIMULATION MODEL BASED ON REGIONAL DEVELOPMENT AND VIRTUAL CHANGES

    Directory of Open Access Journals (Sweden)

    Petr Dlask

    2015-10-01

    Full Text Available This paper reports on change as an indicator that can provide more focused goals in studies of development. The paper offers an answer to the question: how might management gain information from a simulation model and thus influence reality through pragmatic changes? We focus on where and when to influence, manage, and control basic technical-economic proposals. These proposals are mostly formed as simulation models; unfortunately, however, they do not always provide an explanation of how changes form. A wide variety of simulation tools have become available, e.g. Simulink, Wolfram SystemModeler, VisSim, SystemBuild, STELLA, Adams, SIMSCRIPT, COMSOL Multiphysics, etc. However, there is only limited support for the construction of simulation models of a technical-economic nature. Mathematics has developed the concept of differentiation. Economics has developed the concept of marginality. Technical-economic design has yet to develop an equivalent methodology. This paper discusses an alternative approach that uses the phenomenon of change and provides a way from professional knowledge, which can be seen as a purer kind of information, to a more dynamic computing model (a simulation model) that interprets change as a method. The validation of changes, as a result of and a condition for managerial decision making, can thus be improved.

  13. Multi-agent Based Hierarchy Simulation Models of Carrier-based Aircraft Catapult Launch

    Institute of Scientific and Technical Information of China (English)

    Wang Weijun; Qu Xiangju; Guo Linliang

    2008-01-01

    With the aid of a multi-agent based modeling approach to complex systems, the hierarchy simulation models of carrier-based aircraft catapult launch are developed. Ocean, carrier, aircraft, and atmosphere are treated as aggregation agents; detailed components like the catapult, landing gears, and disturbances are considered as meta-agents, which belong to their aggregation agent. Thus a model with two layers is formed, i.e. the aggregation agent layer and the meta-agent layer. The information communication among all agents is described. The meta-agents within one aggregation agent communicate with each other directly by information sharing, but meta-agents which belong to different aggregation agents exchange their information through the aggregation layer first, and then perceive it from the sharing environment, that is, the aggregation agent. Thus, not only is the hierarchy model built, but the environment perceived by each agent is also specified. Meanwhile, the problem of balancing the independency of agents against the resource consumption brought by real-time communication within a multi-agent system (MAS) is resolved. Each agent involved in carrier-based aircraft catapult launch is depicted, considering the interaction within the disturbed atmospheric environment and multiple motion bodies including the carrier, aircraft, and landing gears. The models of reactive agents among them are derived based on tensors, and the perceived messages and inner frameworks of each agent are characterized. Finally, some results of a simulation instance are given. The simulation and modeling of dynamic systems based on multi-agent systems is of benefit in expressing physical concepts and logical hierarchy clearly and precisely. The system model can easily draw in other kinds of agents to achieve a precise simulation of a more complex system. This modeling technique decomposes the complex integral dynamic equations of multibodies into parallel operations of single agents, and it is convenient to expand, maintain, and reuse.

  14. The emerging role of screen based simulators in the training and assessment of colonoscopists

    OpenAIRE

    Cunningham, Morven; Fernando, Bimbi; Berlingieri, Pasquale

    2010-01-01

    Incorporation of screen based simulators into medical training has recently gained momentum, as advances in technology have coincided with a government led drive to increase the use of medical simulation training to improve patient safety with progressive reductions in working hours available for junior doctors to train. High fidelity screen based simulators hold great appeal for endoscopy training. Potentially, their incorporation into endoscopy training curricula could enhance speed of acqu...

  15. Particle based plasma simulation for an ion engine discharge chamber

    Science.gov (United States)

    Mahalingam, Sudhakar

    Design of the next generation of ion engines can benefit from detailed computer simulations of the plasma in the discharge chamber. In this work a complete particle based approach has been taken to model the discharge chamber plasma. This is the first time that simplifying continuum assumptions on the particle motion have not been made in a discharge chamber model; because of the long mean free paths of the particles in the discharge chamber, continuum models are questionable. The PIC-MCC model developed in this work tracks the following particles: neutrals, singly charged ions, doubly charged ions, secondary electrons, and primary electrons. The trajectories of these particles are determined using the Newton-Lorentz equation of motion, including the effects of magnetic and electric fields. Particle collisions are determined using an MCC statistical technique. A large number of collision processes and particle-wall interactions are included in the model. The magnetic fields produced by the permanent magnets are determined using Maxwell's equations. The electric fields are determined using an approximate input electric field coupled with a dynamic determination of the electric fields caused by the charged particles. In this work, inclusion of the dynamic electric field calculation is made possible by using an inflated plasma permittivity value in the Poisson solver. This allows dynamic electric field calculation with minimal computational requirements in terms of both computer memory and run time. In addition, a number of other numerical procedures, such as parallel processing, have been implemented to shorten the computational time. The primary results are those modeling the discharge chamber of NASA's NSTAR ion engine at its full operating power. Convergence of numerical results such as the total number of particles inside the discharge chamber, average energy of the plasma particles, discharge current, beam current, and beam efficiency is obtained. Steady state results for
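
    The Newton-Lorentz particle push described above is commonly implemented in PIC codes with the Boris scheme. The sketch below is a generic illustration of that scheme, not code from the model in this record; `q_m` is the particle's charge-to-mass ratio.

```python
import numpy as np

def boris_push(x, v, E, B, q_m, dt):
    """Advance one particle by dt under the Newton-Lorentz force using
    the Boris scheme: half acceleration by E, a magnitude-preserving
    rotation about B, then the second half acceleration by E."""
    v_minus = v + 0.5 * q_m * E * dt
    t = 0.5 * q_m * B * dt
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * q_m * E * dt
    return x + v_new * dt, v_new

# Gyration test: with E = 0 the rotation step conserves kinetic energy
# exactly, one reason the scheme is popular for long plasma runs.
x, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    x, v = boris_push(x, v, E, B, q_m=1.0, dt=0.1)
```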

  16. Vehicle Crashworthiness Simulation Based on Virtual Design of Autobody

    Institute of Scientific and Technical Information of China (English)

    张晓云; 金先龙; 孙奕; 林忠钦; 周长英; 艾维全; 王仕达

    2004-01-01

    Vehicle crashworthiness simulation is the main component of virtual auto-body design. The crashworthiness of a commercial vehicle under development was simulated by the non-linear finite element method. Bumper crashworthiness at a speed of 8 km/h was analyzed and evaluated. On the other hand, the deformation of the auto-body, the movement of the steering wheel, and the dynamic responses of the occupant at an initial velocity of 50 km/h were studied. The results indicate that the design of the vehicle could be improved in structure and material. Finally, the frontal longitudinal beam, the main energy-absorbing part of the auto-body, was structurally optimized. Simulation results also show that applying new materials, such as high strength steel, and new manufacturing techniques, such as tailor-welded blanks, could greatly improve the crashworthiness of the vehicle.

  17. Fluent-based numerical simulation of flow centrifugal fan

    Institute of Scientific and Technical Information of China (English)

    LI Xian-zhang

    2011-01-01

    Testing a centrifugal fan's flow field in a physical laboratory is difficult because the testing system is complex, the workload is heavy, and results observed by the naked eye deviate far from the actual values. To address this problem, the computational fluid dynamics software FLUENT was applied to establish a three-dimensional model of the centrifugal fan. The numerical model was verified by comparing simulation data to experimental data. The pressure and velocity distributions in the centrifugal fan were then simulated with FLUENT. The simulation results show that the gas flow velocity in the impeller increases with increasing impeller radius, and that static pressure gradually increases as gas flows from the fan inlet through the impeller toward the outlet.

  18. Wandering crowd simulation based on space syntax theory

    Institute of Scientific and Technical Information of China (English)

    ZHENG Liping; SUN Chen; LIU Li; WANG Lin

    2012-01-01

    Space syntax has shown that there appears to be a fundamental process that informs human and social usage of an environment, and that the effects of spatial configuration on movement patterns are consistent with a model of individual decision behavior. Introducing space syntax to crowd simulation enables the space structure to guide the random movement of a crowd with no specific targets. This paper proposes a simulation method for a wandering crowd, which calculates the crowd distribution corresponding to the space through space syntax and uses a hybrid path planning algorithm to dynamically navigate the crowd to conform to that distribution. Experiments show the presented method can obtain reasonable and visually realistic simulation results.
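
    The coupling between a space-syntax-derived target distribution and random wandering can be illustrated with weighted sampling. The integration values below are invented for illustration; in the paper the distribution comes from a space syntax analysis of the actual environment.

```python
import random
from collections import Counter

# Hypothetical space-syntax integration values: more integrated spaces
# should attract a proportionally larger share of the wandering crowd.
weights = {"hall": 5.0, "corridor": 3.0, "alcove": 1.0}

def pick_next_space(weights, rng):
    """Choose a wanderer's next space with probability proportional
    to its integration value."""
    spaces, w = zip(*weights.items())
    return rng.choices(spaces, weights=w, k=1)[0]

rng = random.Random(0)
counts = Counter(pick_next_space(weights, rng) for _ in range(9000))
# Empirically the visit counts approach the 5:3:1 target ratio.
```

    A full simulator would then hand each chosen space to a path planner (the paper's hybrid algorithm) rather than teleporting agents, but the distribution-matching step is the part space syntax contributes.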

  19. EWS based visual and interactive simulator for plant engineering

    Energy Technology Data Exchange (ETDEWEB)

    Ohtsuka, Shiroh [Toshiba Corp. (Japan). Isogo Nuclear Engineering Center; Tanaka, Kazuma; Yoshikawa, Eiji [Toshiba Corp. (Japan). Nuclear Engineering Lab.

    1994-12-31

    The Plant Engineering Visual and Interactive Simulator (PLEVIS) is a real-time plant engineering simulator that runs on a general-purpose desk-top engineering workstation with a high-resolution bit-mapped display. PLEVIS is unique in that simulation models are integrated with a control/interlock model editor. PLEVIS can be used in a wide variety of applications, some of which are: (1) design and modification studies of a control and interlock system, (2) plant response evaluation for plant start-up testing and troubleshooting, (3) transient recognition and mitigation studies, and (4) familiarization with the plant process and the control/interlock system concept. The basic features of PLEVIS needed to realize the above applications are described in the presentation. (1 ref., 6 figs.).

  20. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist
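
    The pipeline the book describes, from raw pseudo-random numbers to the behavior of a stochastic model over time, can be sketched briefly. The generator below uses the classic "minimal standard" LCG constants, chosen here purely for illustration, and drives a Poisson arrival process through the inverse-transform method.

```python
import math

def lcg(seed, a=16807, m=2**31 - 1):
    """Yield uniforms on (0, 1) from a multiplicative linear
    congruential generator."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m

def poisson_arrivals(rate, horizon, uniforms):
    """Simulate arrival times on [0, horizon): each inter-arrival time
    -ln(U)/rate is Exponential(rate) by the inverse-transform method."""
    t, times = 0.0, []
    for u in uniforms:
        t += -math.log(u) / rate
        if t >= horizon:
            break
        times.append(t)
    return times

arrivals = poisson_arrivals(rate=2.0, horizon=100.0, uniforms=lcg(12345))
# With rate 2 per unit time over 100 units, roughly 200 arrivals occur.
```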

  1. Utilizing Simulation-Based Training of Video Clip Instruction for the Store Service Operations Practice Course

    Science.gov (United States)

    Lin, Che-Hung; Yen, Yu-Ren; Wu, Pai-Lu

    2015-01-01

    The aim of this study was to develop a store service operations practice course based on simulation-based training with video clip instruction. Action research employing problem-solving strategies was used, with teaching conducted through simulated store operations. Using the counter operations course unit as an example, this study developed 4 weeks of subunits for…

  2. Motivational Effect of Web-Based Simulation Game in Teaching Operations Management

    Science.gov (United States)

    Nguyen, Tung Nhu

    2015-01-01

    Motivational effects during a simulated educational game should be studied because a general concern of lecturers is motivating students and increasing their knowledge. Given advances in internet technology, traditional short in-class games are being substituted with long web-based games. To maximize the benefits of web-based simulation games, a…

  3. Teaching Business Strategy for an Emerging Economy: An Internet-Based Simulation.

    Science.gov (United States)

    Miller, Van V.

    2003-01-01

    Describes an Internet-based simulation used in a course about business strategy in an emerging economy. The simulation, when coupled with today's dominant strategy paradigm, the Resource Based View, appears to yield a course design that attracts students while emphasizing the actual substance which is crucial in such a course. (EV)

  4. Effectiveness of Simulation-Based Education on Childhood Fever Management by Taiwanese Parents

    Directory of Open Access Journals (Sweden)

    Li-Chuan Chang

    2016-12-01

    Conclusion: Simulation-based education, compared to using the brochure, was a better strategy for improving parental information, motivation, behavioral skills, and behaviors regarding fever management. We suggest that providing community-based education on fever with scenario simulation is needed to increase parental competence for child care.

  5. Developing Clinical Competency in Crisis Event Management: An Integrated Simulation Problem-Based Learning Activity

    Science.gov (United States)

    Liaw, S. Y.; Chen, F. G.; Klainin, P.; Brammer, J.; O'Brien, A.; Samarasekera, D. D.

    2010-01-01

    This study aimed to evaluate the integration of a simulation based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first year nursing students who participated in a simulated learning activity during the PBL session…

  6. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    Science.gov (United States)

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  8. Mechanism Modeling and Simulation Based on Dimensional Deviation

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    To analyze the effects of dimensional variations on the motion characteristics of mechanisms, a study of random dimensional deviation generation techniques for 3D models was carried out on the basis of present mechanical modeling software. It utilized the redevelopment interfaces provided by the modeling software to develop a random dimensional deviation generation system with specified probability distribution characteristics. This system has been used to perform modeling and simulation of a specific mechanical time-delayed mechanism under multiple deviation varieties. Simulation results indicate that the dynamic characteristics of the mechanism are influenced significantly by dimensional deviations within the tolerance distribution range, which should be emphasized in design.
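
    The idea of propagating random dimensional deviations with a given probability distribution through a model can be illustrated with a simple tolerance stack-up. The nominal dimensions and tolerances below are invented, and the normal-within-tolerance sampling is one common convention, not necessarily the one used in the paper.

```python
import random
import statistics

def sample_dim(nominal, tol, rng):
    """Draw a dimension from a normal distribution whose 3-sigma band
    matches +/-tol, rejecting the rare out-of-tolerance draws."""
    while True:
        d = rng.gauss(nominal, tol / 3.0)
        if abs(d - nominal) <= tol:
            return d

def clearance(rng):
    """Hypothetical stack-up: clearance between a slot and a pin."""
    slot = sample_dim(20.00, 0.05, rng)   # slot width, mm
    pin = sample_dim(19.90, 0.03, rng)    # pin diameter, mm
    return slot - pin

rng = random.Random(42)
samples = [clearance(rng) for _ in range(20000)]
mean_clearance = statistics.mean(samples)
spread = statistics.pstdev(samples)
```

    In a mechanism study, `clearance` would be replaced by a kinematic simulation of the 3D model built with the randomly perturbed dimensions, and the spread of the output characterizes the deviation sensitivity.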

  9. Sampling of general correlators in worm-algorithm based simulations

    Directory of Open Access Journals (Sweden)

    Tobias Rindlisbacher

    2016-08-01

    Full Text Available Using the complex ϕ4-model as a prototype for a system which is simulated by a worm algorithm, we show that not only the charged correlator 〈ϕ⁎(x)ϕ(y)〉, but also more general correlators such as 〈|ϕ(x)||ϕ(y)|〉 or 〈arg⁡(ϕ(x))arg⁡(ϕ(y))〉, as well as condensates like 〈|ϕ|〉, can be measured at every step of the Monte Carlo evolution of the worm instead of on closed-worm configurations only. The method generalizes straightforwardly to other systems simulated by worms, such as spin or sigma models.
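
    For orientation, the correlators in question can be estimated on a toy system with a plain Metropolis update; the paper's point is that a worm algorithm can measure them during worm evolution, and standard Metropolis is used here only because it fits in a few lines. The chain length and couplings are arbitrary illustration values.

```python
import math
import random

L, kappa, lam = 16, 0.3, 1.0   # illustration values, not from the paper
rng = random.Random(1)
phi = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(L)]

def local_action(x, val):
    """Part of the complex phi^4 action depending on site x (1D chain)."""
    a = abs(val) ** 2
    nb = phi[(x - 1) % L] + phi[(x + 1) % L]
    return a + lam * (a - 1.0) ** 2 - 2.0 * kappa * (val.conjugate() * nb).real

def sweep():
    """One Metropolis sweep over the chain."""
    for x in range(L):
        prop = phi[x] + complex(rng.gauss(0, 0.5), rng.gauss(0, 0.5))
        dS = local_action(x, prop) - local_action(x, phi[x])
        if dS < 0 or rng.random() < math.exp(-dS):
            phi[x] = prop

for _ in range(200):          # thermalization
    sweep()

r, n = 3, 0                   # measure both correlators at separation r
c_charged = c_abs = 0.0
for _ in range(500):          # measurement sweeps
    sweep()
    for x in range(L):
        c_charged += (phi[x].conjugate() * phi[(x + r) % L]).real
        c_abs += abs(phi[x]) * abs(phi[(x + r) % L])
        n += 1
c_charged /= n                # estimate of <phi*(x) phi(x+r)>
c_abs /= n                    # estimate of <|phi(x)||phi(x+r)|>
```

    Note that |ϕ(x)||ϕ(y)| ≥ Re(ϕ⁎(x)ϕ(y)) holds configuration by configuration, so the modulus correlator always dominates the charged one.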

  10. Simulation of concrete perforation based on a continuum damage model

    Energy Technology Data Exchange (ETDEWEB)

    Chen, E.P. [Sandia National Labs., Albuquerque, NM (United States). Solid and Structural Mechanics Dept.

    1994-10-01

    Numerical simulation of dynamic fracture of concrete slabs, impacted by steel projectiles, was carried out in this study. The concrete response was described by a continuum damage model. This continuum damage model was originally developed to study rock fragmentation and was modified in the present study with an emphasis on the post-limit structural response. The model was implemented into a transient dynamic explicit finite element code LS-DYNA2D and the code was then used for the numerical simulations. The specific impact configuration of this study follows the experiment series conducted by Hanchak et al. Comparisons between calculated results and measured data were made. Good agreements were found.

  11. Digital Simulation of Space Vector Modulation Based Induction Motor Drive

    Directory of Open Access Journals (Sweden)

    G.V. Siva Krishna Rao and T.S. Surendra

    2011-04-01

    Full Text Available This study deals with the simulation of a space-vector-modulated inverter fed induction motor drive. The drive system is modeled using MATLAB Simulink and the results are presented. This drive has advantages such as reduced harmonics and heating. Fixed AC is converted into DC, and this DC is converted into variable-voltage, variable-frequency AC using an SVM inverter. The output of the SVM inverter is applied to the stator of the induction motor. The simulation results are compared with analytical results. FFT analysis shows that the current spectrum has reduced harmonics compared to the conventional system.
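
    The core arithmetic of space vector modulation, locating the reference voltage vector's sector and splitting the switching period between the two adjacent active vectors and the zero vectors, can be sketched as follows. The normalisation of the modulation index `m` is one common convention, an assumption of this sketch rather than something taken from the study.

```python
import math

def svm_times(m, theta, Ts=1.0):
    """Sector index and dwell times for space vector modulation.

    m     : modulation index (linear range roughly 0..0.866 in this
            normalisation)
    theta : reference vector angle in radians
    Ts    : switching period
    Returns (sector, t1, t2, t0) with t1 + t2 + t0 == Ts.
    """
    sector = int(theta // (math.pi / 3)) % 6   # which 60-degree sector
    th = theta - sector * (math.pi / 3)        # angle inside the sector
    t1 = Ts * m * math.sin(math.pi / 3 - th) / math.sin(math.pi / 3)
    t2 = Ts * m * math.sin(th) / math.sin(math.pi / 3)
    t0 = Ts - t1 - t2                          # time left for zero vectors
    return sector, t1, t2, t0

# Mid-sector reference: the two active vectors get equal time by symmetry.
sector, t1, t2, t0 = svm_times(0.5, math.pi / 6)
```

    An inverter model applies the two active vectors of the computed sector for `t1` and `t2` and a zero vector for `t0` each switching period, which is what pushes harmonics to the switching frequency and above.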

  12. Simulation Model of Magnetic Levitation Based on NARX Neural Networks

    Directory of Open Access Journals (Sweden)

    Dragan Antić

    2013-04-01

    Full Text Available In this paper, we present an analysis of different training types for a nonlinear autoregressive neural network used for simulation of a magnetic levitation system. First, the model of this highly nonlinear system is described, and after that the Nonlinear AutoRegressive eXogenous (NARX) neural network model is given. Numerical optimization techniques for improved network training are also described. It is verified that a NARX neural network can successfully simulate the real magnetic levitation system if a suitable training procedure is chosen, and the best two training types, determined from experimental results, are described in detail.
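
    The NARX structure, predicting the next output from lagged outputs and inputs, can be illustrated without a full neural network. The sketch below fits a linear-in-parameters NARX model to a toy nonlinear plant (invented for illustration, not the maglev system from the paper) and then runs it in closed loop, the parallel-simulation mode in which such models are typically evaluated.

```python
import numpy as np

def plant(y_prev, u_prev):
    """Toy nonlinear plant used to generate identification data."""
    return 0.8 * y_prev + 0.5 * np.tanh(u_prev)

rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, 500)
y = np.zeros(500)
for k in range(1, 500):
    y[k] = plant(y[k - 1], u[k - 1])

# NARX regressors: lagged output, lagged input, and a nonlinear feature.
# A neural network would learn this nonlinearity; a linear-in-parameters
# model keeps the sketch short while preserving the NARX structure.
X = np.column_stack([y[:-1], u[:-1], np.tanh(u[:-1])])
theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)

# Parallel (closed-loop) simulation: feed the model's own outputs back.
y_sim = np.zeros(500)
for k in range(1, 500):
    y_sim[k] = np.array([y_sim[k - 1], u[k - 1], np.tanh(u[k - 1])]) @ theta

err = float(np.max(np.abs(y_sim - y)))   # tiny: the plant is in model class
```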

  13. Electromagnetic Simulations of Helical-Based Ion Acceleration Structures

    CERN Document Server

    Nelson, Scott D; Caporaso, George; Friedman, Alex; Poole, Brian R; Waldron, William

    2005-01-01

    Helix structures have been proposed* for accelerating low energy ion beams using MV/m fields in order to increase the coupling efficiency of the pulsed power system and to tailor the electromagnetic wave propagation speed to the particle beam speed as the beam gains energy. Calculations presented here show the electromagnetic field as it propagates along the helix structure, field stresses around the helix structure (for voltage breakdown determination), optimizations to the helix and the driving pulsed power waveform, and simulations showing test particles interacting with the simulated time-varying fields.

  14. Model for Analyzing Human Communication Network Based on Agent-Based Simulation

    Science.gov (United States)

    Matsuyama, Shinako; Terano, Takao

    This paper discusses dynamic properties of human communication networks, which appear as a result of information exchanges among people. We propose agent-based simulation (ABS) to examine implicit mechanisms behind the dynamics. The ABS enables us to reveal the characteristics and the differences of the networks regarding specific communication groups. We perform experiments on the ABS with activity data from a questionnaire survey and with virtual data which is different from the activity data. We compare the difference between them and show the effectiveness of the ABS through the experiments.

  15. Estimating genetic correlations based on phenotypic data: a simulation-based method

    Indian Academy of Sciences (India)

    Elias Zintzaras

    2011-04-01

    Knowledge of genetic correlations is essential to understand the joint evolution of traits through correlated responses to selection, a difficult and seldom very precise task even with easy-to-breed species. Here, a simulation-based method to estimate genetic correlations and genetic covariances that relies only on phenotypic measurements is proposed. The method does not require any degree of relatedness in the sampled individuals. Extensive numerical results suggest that the proposed method may provide relatively efficient estimates regardless of sample sizes and contributions from common environmental effects.

  16. Computer-based or human patient simulation-based case analysis: which works better for teaching diagnostic reasoning skills?

    Science.gov (United States)

    Wilson, Rebecca D; Klein, James D; Hagler, Debra

    2014-01-01

    The purpose of this study was to determine whether a difference exists in learner performance and the type and frequency of diagnostic reasoning skills used, based on the method of case presentation. Faculty can select from a variety of methods for presenting cases when teaching diagnostic reasoning, but little evidence exists with regard to how students use these skills while interacting with the cases. A total of 54 nursing students participated in two case analyses using human patient and computer-based simulations. Participant performance and diagnostic reasoning skills were analyzed. Performance was significantly better with the human patient simulation case. All diagnostic reasoning skills were used during both methods of case presentation, with greater performance variation in the computer-based simulation. Both human patient and computer-based simulations are beneficial for practicing diagnostic reasoning skills; however, these findings support the use of human patient simulations for improving student performance in case synthesis.

  17. Real-Time Density-Based Crowd Simulation

    NARCIS (Netherlands)

    van Toll, W.G.; Cook IV, A.F.; Geraerts, R.J.

    2012-01-01

    Virtual characters in games and simulations often need to plan visually convincing paths through a crowded environment. This paper describes how crowd density information can be used to guide a large number of characters through a crowded environment. Crowd density information helps characters avoid

  18. Realistic Crowd Simulation with Density-Based Path Planning

    NARCIS (Netherlands)

    van Toll, W.G.; Cook IV, A.F.; Geraerts, R.J.

    2012-01-01

    Virtual characters in games and simulations often need to plan visually convincing paths through a crowded environment. This paper describes how crowd density information can be used to guide a large number of characters through a crowded environment. Crowd density information helps characters avoid

  19. A process-based algorithm for simulating terraces in SWAT

    Science.gov (United States)

    Terraces in crop fields are one of the most important soil and water conservation measures that affect runoff and erosion processes in a watershed. In large hydrological programs such as the Soil and Water Assessment Tool (SWAT), terrace effects are simulated by adjusting the slope length and the US...

  20. A Simulation Analysis of Work Based Navy Manpower Requirements

    Science.gov (United States)

    2012-09-01

    Hill, L. Monch, O. Rose, T. Jefferson, J.W. Fowler (eds.), Proceedings of the 2008 Winter Simulation Conference (pp. 73–84). Miami, FL: Thomson Reuters. ...Conference (pp. 541–544). Atlanta, GA: Thomson Reuters. U.S. Government Accountability Office. (2010). Navy Needs to Reassess Its Metrics and Assumptions