WorldWideScience

Sample records for FLUKA-based simulation

  1. Hadron production simulation by FLUKA

    CERN Document Server

    Battistoni, G; Ferrari, A; Ranft, J; Roesler, S; Sala, P R

    2013-01-01

    For the purposes of accelerator based neutrino experiments, the simulation of parent hadron production plays a key role. In this paper a quick overview of the main ingredients of the PEANUT event generator implemented in the FLUKA Monte Carlo code is given, together with some benchmarking examples.

  2. Design of a 6 MeV linear accelerator based pulsed thermal neutron source: FLUKA simulation and experiment

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N. [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India)

    2012-01-15

    A 6 MeV LINAC-based pulsed thermal neutron source has been designed for bulk materials analysis. The design was optimized by varying the parameters of the target and the materials of each region using the FLUKA code. The optimized thermal neutron source gives a flux of 3 × 10⁶ n cm⁻² s⁻¹ with more than 80% thermal neutrons, and the neutron-to-gamma ratio was 1 × 10⁴ n cm⁻² mR⁻¹. The results of the prototype experiment and the simulation are found to be in good agreement. - Highlights: • The 6 MeV linear accelerator based thermal neutron source was optimized using FLUKA simulation. • Beryllium serves as photonuclear target and reflector, polyethylene as filter and shield, and graphite as moderator. • The optimized pulsed thermal neutron source gives a neutron flux of 3 × 10⁶ n cm⁻² s⁻¹. • Results of the prototype experiment were compared with simulations and found to be in good agreement. • This source can effectively be used for bulk material analysis and the study of activation products.

  3. Development of a space radiation Monte Carlo computer simulation based on the FLUKA and ROOT codes

    CERN Document Server

    Pinsky, L; Ferrari, A; Sala, P; Carminati, F; Brun, R

    2001-01-01

    This NASA-funded project is developing a Monte Carlo-based computer simulation of the radiation environment in space. With funding only in place since the end of May 2000, the study is still in the early stage of development. The general tasks have been identified and personnel have been selected. The code to be assembled will be based upon two major existing software packages: the radiation transport simulation will be accomplished by updating the FLUKA Monte Carlo program, and the user interface will employ the ROOT software being developed at CERN. The end product will be a Monte Carlo-based code which will complement the existing analytic codes such as BRYNTRN/HZETRN presently used by NASA to evaluate the effects of radiation shielding in space. The planned code will be able to evaluate the radiation environment for spacecraft and habitats in Earth orbit, in interplanetary space, on the lunar surface, or on a planetary surface such as Mars. Furthermore, it will be usef...

  4. Simulation of Experimental Background using FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Rokni, Sayed

    1999-05-11

    In November 1997, Experiment T423 began acquiring data with the intention of understanding the energy spectra of high-energy neutrons generated in the interaction of electrons with lead. The following describes a series of FLUKA simulations studying (1) particle yields in the absence of all background; (2) the background caused by scattering in the room; (3) the effects of the thick lead shielding which surrounded the detector; (4) the sources of neutron background created in this lead shielding; and (5) the ratio of the total background to the ideal yield. In each case, particular attention is paid to the neutron yield.

  5. An investigation of the neutron flux in bone-fluorine phantoms comparing accelerator based in vivo neutron activation analysis and FLUKA simulation data

    International Nuclear Information System (INIS)

    Mostafaei, F.; McNeill, F.E.; Chettle, D.R.; Matysiak, W.; Bhatia, C.; Prestwich, W.V.

    2015-01-01

    We have tested the Monte Carlo code FLUKA for its ability to assist in the development of a better system for the in vivo measurement of fluorine. We used it to create a neutron flux map of the inside of the in vivo neutron activation analysis irradiation cavity at the McMaster Accelerator Laboratory. The cavity is used in a system that has been developed for assessment of fluorine levels in the human hand. This study was undertaken to (i) assess the FLUKA code, (ii) find the optimal hand position inside the cavity and assess the effects on precision of a hand being in a non-optimal position and (iii) determine the best location for our γ-ray detection system within the accelerator beam hall. Simulation estimates were performed using FLUKA. Experimental measurements of the neutron flux were performed using Mn wires. The activation of the wires was measured inside (1) an empty bottle, (2) a bottle containing water, (3) a bottle covered with cadmium and (4) a dry powder-based fluorine phantom. FLUKA was used to simulate the irradiation cavity and to estimate the neutron flux in different positions both inside, and external to, the cavity. The experimental results were found to be consistent with the Monte Carlo simulated neutron flux. Both experiment and simulation showed that there is an optimal position in the cavity, but that the effect on the thermal flux of a hand being in a non-optimal position is less than 20%, which will result in a less than 10% effect on the measurement precision. FLUKA appears to be a code that can be useful for modeling of this type of experimental system.
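The Mn-wire technique above rests on standard activation analysis: the activity induced in a wire, together with its capture cross-section and a saturation correction for the irradiation time, yields the thermal flux at that position. A minimal sketch, with illustrative numbers only (the atom count, measured activity, and irradiation time below are assumptions, not values from the study):

```python
import math

def thermal_flux_from_activation(activity_bq, n_atoms, sigma_cm2, half_life_s, t_irr_s):
    """Infer a thermal-neutron flux (n/cm^2/s) from the activity induced in an
    activation wire, assuming a single capture reaction, a 1/v absorber, and
    no flux depression by the wire itself."""
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)  # fraction of saturation activity reached
    return activity_bq / (n_atoms * sigma_cm2 * saturation)

# Illustrative numbers: 55Mn(n,gamma)56Mn, sigma ~ 13.3 b, T1/2(56Mn) ~ 2.58 h.
sigma = 13.3e-24          # cm^2
half_life = 2.58 * 3600   # s
n_atoms = 1.0e20          # atoms of 55Mn in the wire (assumed)
activity = 5.0e3          # activity at end of irradiation, Bq (assumed)
flux = thermal_flux_from_activation(activity, n_atoms, sigma, half_life, 3600.0)
print(f"inferred flux: {flux:.3e} n/cm^2/s")
```

For the same end-of-irradiation activity, a longer irradiation implies a lower flux, since the wire is closer to saturation.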

  6. Energy deposition profile on ISOLDE Beam Dumps by FLUKA simulations

    CERN Document Server

    Vlachoudis, V

    2014-01-01

    This report presents an estimate of the energy deposited in the current ISOLDE beam dumps, obtained with the FLUKA simulation code, for both dumps, GPS and HRS. Estimates of the temperature rise are given, based on the assumption of an adiabatic increase from the energy deposited by the impinging protons. However, the temperature results obtained here are only a rough estimate; they are meant to be refined through thermomechanical simulations using the energy profiles obtained here.
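The adiabatic assumption reduces the temperature estimate to one line: all deposited energy goes into heat with no conduction or radiation losses, so ΔT = E_dep / (m · c_p). A sketch with hypothetical dump parameters (the mass, heat capacity, beam energy, and intensity below are assumptions for illustration, not ISOLDE values):

```python
def adiabatic_temperature_rise(e_dep_j, mass_kg, c_p_j_per_kg_k):
    """Upper-bound temperature rise if all deposited energy heats the dump
    instantaneously, with no heat transport (adiabatic assumption)."""
    return e_dep_j / (mass_kg * c_p_j_per_kg_k)

# Hypothetical pulse: 3e13 protons at 1.4 GeV, ~60% of the energy deposited.
protons_per_pulse = 3.0e13
e_per_proton_j = 1.4e9 * 1.602e-19          # 1.4 GeV converted to joules
e_dep = 0.6 * protons_per_pulse * e_per_proton_j
dT = adiabatic_temperature_rise(e_dep, mass_kg=50.0, c_p_j_per_kg_k=385.0)  # c_p ~ copper
print(f"adiabatic temperature rise per pulse: {dT:.2f} K")
```

Because losses are neglected, this is an upper bound per pulse; steady-state heating is what the thermomechanical simulations mentioned above would resolve.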

  7. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Takashina, Masaaki; Koizumi, Masahiko [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P., E-mail: vadim.p.moskvin@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States)

    2014-10-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA, and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. To achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed on simple systems such as a bare water phantom. Since particle beams undergo transport, nuclear-interaction, and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA result, and the optimal parameters were then determined. The PDD profile and proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with a simple system. This leads to the conclusion that the physical model, particle transport mechanics, and the different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  8. Optimization of GATE and PHITS Monte Carlo code parameters for uniform scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Takashina, Masaaki; Koizumi, Masahiko; Das, Indra J.; Moskvin, Vadim P.

    2014-01-01

    Although the three general-purpose Monte Carlo (MC) simulation tools Geant4, FLUKA, and PHITS have been used extensively, differences in their calculation results have been reported. The major causes are the implementation of the physical models, the preset value of the ionization potential, and the definition of the maximum step size. To achieve artifact-free MC simulation, an optimized parameter list for each simulation system is required. Several authors have already proposed optimized lists, but those studies were performed on simple systems such as a bare water phantom. Since particle beams undergo transport, nuclear-interaction, and electromagnetic processes during beam delivery, establishing an optimized parameter list for the whole beam delivery system is of major importance. The purpose of this study was to determine optimized parameter lists for GATE and PHITS using a computational model of a proton treatment nozzle. The simulation was performed with a broad scanning proton beam. The influence of the customizable parameters on the percentage depth dose (PDD) profile and the proton range was investigated by comparison with the FLUKA result, and the optimal parameters were then determined. The PDD profile and proton range obtained with our optimized parameter lists showed different characteristics from the results obtained with a simple system. This leads to the conclusion that the physical model, particle transport mechanics, and the different geometry-based descriptions need accurate customization when planning computational experiments for artifact-free MC simulation.

  9. Simulation of e-γ-n targets by FLUKA and measurement of neutron flux at various angles for an accelerator based neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N. [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India)

    2010-10-15

    A 6 MeV race-track microtron (an electron accelerator) based pulsed neutron source has been designed specifically for the elemental analysis of short-lived activation products, where a low neutron flux is desirable. The bremsstrahlung radiation emitted by 6 MeV electrons impinging on the e-γ primary target was made to fall on the γ-n secondary target to produce neutrons. The bremsstrahlung and neutron-producing targets were optimised, and their spectra estimated, using the FLUKA code. The neutron flux was measured by activation of vanadium; the measured fluxes were 1.1878 × 10⁵, 0.9403 × 10⁵, 0.7428 × 10⁵, 0.6274 × 10⁵, 0.5659 × 10⁵, and 0.5210 × 10⁵ n/cm²/s at 0°, 30°, 60°, 90°, 115°, and 140°, respectively. The results indicate that the neutron flux decreases with increasing angle and is in good agreement with the FLUKA simulation.

  10. A dedicated tool for PET scanner simulations using FLUKA

    International Nuclear Information System (INIS)

    Ortega, P.G.; Boehlen, T.T.; Cerutti, F.; Chin, M.P.W.; Ferrari, A.; Mancini, C.; Vlachoudis, V.; Mairani, A.; Sala, Paola R.

    2013-06-01

    Positron emission tomography (PET) is a well-established medical imaging technique. It is based on the detection of pairs of annihilation gamma rays from a β⁺-emitting radionuclide, usually introduced into the body via a biologically active molecule. Apart from its widespread use for clinical diagnosis, new applications are proposed, notably the use of PET for treatment monitoring of radiation therapy with protons and ions. PET is currently the only available technique for non-invasive monitoring of ion beam dose delivery, and it has been tested in several clinical pilot studies. For hadrontherapy, the distribution of positron emitters produced by the ion beam can be analyzed to verify correct treatment delivery. The adaptation of previous PET scanners to new environments and the need for more precise diagnostics through better image quality have triggered the development of new PET scanner designs. The use of Monte Carlo (MC) codes is essential in the early stages of scanner design to simulate the transport of particles and nuclear interactions from therapeutic ion beams or radioisotopes and to predict radiation fields in tissues and radiation emerging from the patient. In particular, range verification using PET is based on the comparison of detected and simulated activity distributions. The accuracy of the MC code for the relevant physics processes is obviously essential for such applications. In this work we present new developments of the physics models important for PET monitoring, together with integrated tools for PET scanner simulations, for FLUKA, a fully integrated MC particle-transport code widely used across an extended range of applications (accelerator shielding, detector and target design, calorimetry, activation, dosimetry, medical physics, radiobiology, ...). The developed tools include a PET scanner geometry builder and a dedicated scoring routine for coincident event determination. The geometry builder allows the efficient...
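The coincident-event scoring mentioned above can be illustrated in miniature: given time-stamped single-gamma hits, a coincidence sorter pairs hits that fall within a short time window on different detectors. A simplified sketch (greedy pairing on a sorted list; a real sorter also handles multiple coincidences and dead time), using invented hit data:

```python
def find_coincidences(hits, window_ns=4.0):
    """Group single gamma hits into coincidence pairs: two hits on different
    detectors whose time difference is within the coincidence window form one
    event. `hits` is a list of (time_ns, detector_id) tuples."""
    hits = sorted(hits)              # sort by arrival time
    pairs = []
    i = 0
    while i < len(hits) - 1:
        t1, d1 = hits[i]
        t2, d2 = hits[i + 1]
        if t2 - t1 <= window_ns and d1 != d2:
            pairs.append((hits[i], hits[i + 1]))
            i += 2                   # both singles consumed by this coincidence
        else:
            i += 1                   # lone single, discard
    return pairs

# Hypothetical singles stream: (time in ns, detector crystal id)
events = [(0.0, 3), (2.5, 17), (100.0, 5), (250.0, 8), (251.0, 21)]
print(find_coincidences(events))
```

Here the hits at 0.0/2.5 ns and 250.0/251.0 ns form coincidences, while the isolated hit at 100.0 ns is rejected as a single.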

  11. The FLUKA code: An accurate simulation tool for particle therapy

    CERN Document Server

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  12. Quantification of the validity of simulations based on Geant4 and FLUKA for photo-nuclear interactions in the high energy range

    Science.gov (United States)

    Quintieri, Lina; Pia, Maria Grazia; Augelli, Mauro; Saracco, Paolo; Capogni, Marco; Guarnieri, Guido

    2017-09-01

    Photo-nuclear interactions are relevant in many research fields of both fundamental and applied physics, and accurate Monte Carlo simulations of photo-nuclear interactions can therefore provide valuable and indispensable support in a wide range of applications (e.g., from the optimisation of photo-neutron source targets to dosimetric estimates at high-energy accelerators). Unfortunately, few experimental photo-nuclear data are available above 100 MeV, so that in the high energy range (from hundreds of MeV up to the GeV scale) the code predictions are based on physical models. The aim of this work is to compare the predictions of relevant observables involving photo-nuclear interaction modelling, obtained with Geant4 and FLUKA, to experimental data (where available), in order to assess the reliability of the code estimates over a wide energy range. In particular, the comparison of the estimated photo-neutron yields and energy spectra with the experimental results of the n@BTF experiment (carried out at the Beam Test Facility of the DaΦne collider in Frascati, Italy) is reported and discussed. Moreover, preliminary results of the comparison of the cross sections used in the codes with the "evaluated" data recommended by the IAEA are also presented for some selected cases (W, Pb, Zn).

  13. TU-AB-BRC-02: Accuracy Evaluation of GPU-Based OpenCL Carbon Monte Carlo Package (goCMC) in Biological Dose and Microdosimetry in Comparison to FLUKA Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Taleei, R; Peeler, C; Qin, N; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States)

    2016-06-15

    Purpose: One of the most accurate methods for radiation transport is Monte Carlo (MC) simulation, but long computation times prevent its wide application in the clinic. We have recently developed a fast MC code for carbon ion therapy called GPU-based OpenCL Carbon Monte Carlo (goCMC), and its accuracy in physical dose has been established. Since radiobiology is an indispensable aspect of carbon ion therapy, this study evaluates the accuracy of goCMC in biological dose and microdosimetry by benchmarking it against FLUKA. Methods: We performed simulations of carbon pencil beams at 150, 300 and 450 MeV/u in a homogeneous water phantom using goCMC and FLUKA. Dose and energy spectra for primary and secondary ions on the central beam axis were recorded. The repair-misrepair-fixation model was employed to calculate the relative biological effectiveness (RBE). The Monte Carlo Damage Simulation (MCDS) tool was used to calculate microdosimetry parameters. Results: Physical dose differences on the central axis were <1.6% of the maximum value. Before the Bragg peak, differences in RBE and RBE-weighted dose were <2% and <1%. At the Bragg peak, the differences reached 12.5%, caused by a small range discrepancy and the sensitivity of RBE to the beam spectra; consequently, the RBE-weighted dose difference was 11%. Beyond the peak, RBE differences were <20% and primarily caused by differences in the helium-4 spectrum. However, the RBE-weighted dose agreed within 1% due to the low physical dose. Differences in microdosimetric quantities were small except at the Bragg peak. The simulation time per source particle with FLUKA was 0.08 s, while goCMC was approximately 1000 times faster. Conclusion: Physical doses computed by FLUKA and goCMC were in good agreement. Although relatively large RBE differences were observed at and beyond the Bragg peak, the RBE-weighted dose differences were considered acceptable.
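RBE-weighted (biological) dose is physical dose scaled pointwise by RBE, which is why a modest RBE discrepancy at the Bragg peak can dominate a small physical-dose difference, while a large RBE discrepancy in the low-dose tail barely matters. A toy comparison (the profiles below are invented for illustration, not FLUKA or goCMC output):

```python
def rbe_weighted_dose(physical_dose, rbe):
    """Biological (RBE-weighted) dose: physical dose scaled pointwise by RBE."""
    return [d * r for d, r in zip(physical_dose, rbe)]

def max_percent_diff(a, b):
    """Maximum pointwise difference between two profiles, as a percentage of
    the larger profile's maximum value."""
    ref = max(max(a), max(b))
    return max(abs(x - y) for x, y in zip(a, b)) / ref * 100.0

# Invented depth profiles (arbitrary units): entrance, plateau, peak, tail.
dose_a = [1.0, 1.2, 2.8, 0.1]
dose_b = [1.0, 1.2, 2.6, 0.1]
rbe_a  = [1.1, 1.2, 2.5, 3.0]
rbe_b  = [1.1, 1.2, 2.8, 3.3]
bio_a = rbe_weighted_dose(dose_a, rbe_a)
bio_b = rbe_weighted_dose(dose_b, rbe_b)
print(f"physical diff: {max_percent_diff(dose_a, dose_b):.1f}%  "
      f"RBE-weighted diff: {max_percent_diff(bio_a, bio_b):.1f}%")
```

In the tail the RBE values differ by 10%, yet the biological-dose difference there is tiny because the physical dose is low, mirroring the behaviour reported above.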

  14. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, Keita [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, Osaka University Graduate School of Medicine, Suita, Osaka 565-0871 (Japan); Department of Radiology, Osaka University Hospital, Suita, Osaka 565-0871 (Japan); Das, Indra J. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Moskvin, Vadim P. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN 46202 (United States); Department of Radiation Oncology, St. Jude Children’s Research Hospital, Memphis, TN 38105 (United States)

    2016-01-15

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters inherent to the MC simulation codes GATE, PHITS, and FLUKA, as established for the uniform scanning proton beam, needs to be evaluated; the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and the optimal parameters are determined from the accuracy of the proton range, the suppressed dose deviation, and the minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique.

  15. SU-E-T-254: Optimization of GATE and PHITS Monte Carlo Code Parameters for Uniform Scanning Proton Beam Based On Simulation with FLUKA General-Purpose Code

    Energy Technology Data Exchange (ETDEWEB)

    Kurosu, K [Department of Radiation Oncology, Osaka University Graduate School of Medicine, Osaka (Japan); Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Takashina, M; Koizumi, M [Department of Medical Physics ' Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Das, I; Moskvin, V [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, IN (United States)

    2014-06-01

    Purpose: Monte Carlo codes are becoming important tools for proton beam dosimetry. However, the relationships between the customizable parameters and the percentage depth dose (PDD) of the GATE and PHITS codes have not been reported; they are studied here for PDD and proton range in comparison with the FLUKA code and experimental data. Methods: The beam delivery system of the Indiana University Health Proton Therapy Center was modeled for the uniform scanning beam in FLUKA and transferred identically into GATE and PHITS. This computational model was built from the blueprint and validated against the commissioning data. The three parameters evaluated are the maximum step size, the cut-off energy, and the physics and transport model. The dependence of the PDDs on the customizable parameters was compared with the published results of previous studies. Results: The optimal parameters for the simulation of the whole beam delivery system were defined by referring to the calculation results obtained with each parameter. Although the PDDs from FLUKA and the experimental data show good agreement, those of GATE and PHITS obtained with our optimal parameters show a minor discrepancy. The measured proton range R90 was 269.37 mm, compared to calculated ranges of 269.63 mm, 268.96 mm, and 270.85 mm with FLUKA, GATE, and PHITS, respectively. Conclusion: We evaluated the dependence of the PDD results obtained with the GATE and PHITS general-purpose Monte Carlo codes on the customizable parameters, using the whole computational model of the treatment nozzle, and then defined the optimal parameters by referring to the calculation results. The physical model, particle transport mechanics, and the different geometry-based descriptions need accurate customization in all three simulation codes to agree with experimental data for artifact-free Monte Carlo simulation. This study was supported by Grants-in-Aid for Cancer Research (H22-3rd Term Cancer Control-General-043) from the Ministry of Health
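The PDD and R90 quantities compared above are straightforward to extract from a scored depth-dose curve: normalize the curve to its maximum, then interpolate the distal depth beyond the Bragg peak where it crosses 90%. A sketch assuming a simple 1-D dose array (the toy curve below is not commissioning data):

```python
def pdd(dose_along_axis):
    """Percentage depth dose: the dose profile normalized to its maximum."""
    peak = max(dose_along_axis)
    return [100.0 * d / peak for d in dose_along_axis]

def r90(depths_mm, dose):
    """Distal R90: depth beyond the Bragg peak where the PDD first falls
    through 90%, linearly interpolated between scoring voxels."""
    p = pdd(dose)
    i_peak = p.index(max(p))
    for i in range(i_peak, len(p) - 1):
        if p[i] >= 90.0 > p[i + 1]:
            frac = (p[i] - 90.0) / (p[i] - p[i + 1])
            return depths_mm[i] + frac * (depths_mm[i + 1] - depths_mm[i])
    return None  # curve never falls through 90% in the scored region

# Toy depth-dose curve on 0.5 mm voxels: plateau, Bragg peak, distal falloff.
depths = [0.5 * i for i in range(8)]
dose   = [30, 32, 35, 45, 100, 80, 20, 2]
print(f"R90 = {r90(depths, dose):.2f} mm")
```

With 0.5 mm voxels, as used in the study, the interpolation keeps the sub-voxel precision needed to resolve range differences below 1 mm between codes.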

  16. Optimization of GATE and PHITS Monte Carlo code parameters for spot scanning proton beam based on simulation with FLUKA general-purpose code

    International Nuclear Information System (INIS)

    Kurosu, Keita; Das, Indra J.; Moskvin, Vadim P.

    2016-01-01

    Spot scanning, owing to its superior dose-shaping capability, provides unsurpassed dose conformity, in particular for complex targets. However, the robustness of the delivered dose distribution and prescription has to be verified. Monte Carlo (MC) simulation has the potential to generate significant advantages for high-precision particle therapy, especially for media containing inhomogeneities. However, the choice of computational parameters inherent to the MC simulation codes GATE, PHITS, and FLUKA, as established for the uniform scanning proton beam, needs to be evaluated; the relationship between the input parameters and the calculation results should be carefully scrutinized. The objective of this study was, therefore, to determine the optimal parameters for the spot scanning proton beam for both the GATE and PHITS codes, using data from a FLUKA simulation as a reference. The proton beam scanning system of the Indiana University Health Proton Therapy Center was modeled in FLUKA, and the geometry was subsequently and identically transferred to GATE and PHITS. Although the beam transport is managed by the spot scanning system, the spot location is always set at the center of a water phantom of 600 × 600 × 300 mm³, which is placed after the treatment nozzle. The percentage depth dose (PDD) is computed along the central axis using 0.5 × 0.5 × 0.5 mm³ voxels in the water phantom. The PDDs and the proton ranges obtained with several computational parameters are then compared to those of FLUKA, and the optimal parameters are determined from the accuracy of the proton range, the suppressed dose deviation, and the minimization of computational time. Our results indicate that the optimized parameters differ from those for uniform scanning, suggesting that a gold standard for setting computational parameters for any proton therapy application cannot be determined consistently, since the impact of the parameter settings depends on the proton irradiation technique.

  17. TVF-NMCRC-A powerful program for writing and executing simulation inputs for the FLUKA Monte Carlo Code system

    International Nuclear Information System (INIS)

    Mark, S.; Khomchenko, S.; Shifrin, M.; Haviv, Y.; Schwartz, J.R.; Orion, I.

    2007-01-01

    We at the Negev Monte Carlo Research Center (NMCRC) have developed a powerful new interface for writing and executing FLUKA input files: TVF-NMCRC. With the TVF tool, a FLUKA user can easily write an input file without any previous experience. The TVF-NMCRC tool is a LINUX program that has been verified on the most common LINUX-based operating systems and is suitable for the latest version of FLUKA (FLUKA 2006.3).

  18. Interactive fluka: a world wide web version for a simulation code in proton therapy

    International Nuclear Information System (INIS)

    Garelli, S.; Giordano, S.; Piemontese, G.; Squarcia, S.

    1998-01-01

    We considered the possibility of using the simulation code FLUKA in the framework of TERA. We provide a World Wide Web interface through which an interactive version of the code is available. The user can find installation instructions, an on-line FLUKA manual, and interactive windows for entering, in a very simple way, all the data required by the run configuration file. The database design allows more versatile use for data verification and update, recall of old simulations, and comparison with selected examples. A completely new tool for geometry drawing, written in Java, has also been developed. (authors)

  19. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    International Nuclear Information System (INIS)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-01-01

    One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damage due to unforeseen critical beam losses. In order to ensure the BLM's design quality, detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion in the final design phase of the LHC. In addition, benchmark measurements were carried out with LHC-type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies.

  20. FLUKA Monte Carlo simulations and benchmark measurements for the LHC beam loss monitors

    Science.gov (United States)

    Sarchiapone, L.; Brugger, M.; Dehning, B.; Kramer, D.; Stockner, M.; Vlachoudis, V.

    2007-10-01

    One of the crucial elements in terms of machine protection for CERN's Large Hadron Collider (LHC) is its beam loss monitoring (BLM) system. On-line loss measurements must prevent the superconducting magnets from quenching and protect the machine components from damages due to unforeseen critical beam losses. In order to ensure the BLM's design quality, in the final design phase of the LHC detailed FLUKA Monte Carlo simulations were performed for the betatron collimation insertion. In addition, benchmark measurements were carried out with LHC type BLMs installed at the CERN-EU high-energy Reference Field facility (CERF). This paper presents results of FLUKA calculations performed for BLMs installed in the collimation region, compares the results of the CERF measurement with FLUKA simulations and evaluates related uncertainties. This, together with the fact that the CERF source spectra at the respective BLM locations are comparable with those at the LHC, allows assessing the sensitivity of the performed LHC design studies.

  1. Energy deposition profile for modification proposal of ISOLDE’s HRS Beam Dump, from FLUKA simulations

    CERN Document Server

    Vlachoudis, V

    2014-01-01

    The current ISOLDE HRS beam dump has been found to be unsuitable in previous simulations, owing to thermomechanical stresses. In this paper a proposal for modifying the HRS dump is studied using FLUKA. The energy deposited in the modified beam dump and the amount of neutrons streaming into the tunnel area are scored and compared with a simulation of the current dump. Two versions of the modification have been assessed, determining which of them is preferable in terms of the influence of radiation on ISOLDE’s tunnel. Finally, a rough estimate of the temperature rise in the modified dump is given. Further conclusions on the adequacy of these modifications will require the thermomechanical calculations’ results, based on those presented here.
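The kind of rough temperature estimate mentioned above can be sketched from a scored energy deposition with a first-order adiabatic calculation; the deposited energy, dump mass and material below are illustrative placeholders, not values from the ISOLDE study.

```python
# First-order adiabatic temperature-rise estimate from a scored energy
# deposition, as one might do after a FLUKA run. All numbers below are
# illustrative placeholders, not values from the ISOLDE study.

def adiabatic_temperature_rise(energy_j, mass_kg, specific_heat_j_per_kg_k):
    """dT = dE / (m * c_p), assuming no heat conduction during the pulse."""
    return energy_j / (mass_kg * specific_heat_j_per_kg_k)

# Example: 1.5 kJ deposited in a 2.0 kg copper block (c_p ~ 385 J/(kg K)).
dT = adiabatic_temperature_rise(1.5e3, 2.0, 385.0)
print(f"Adiabatic temperature rise: {dT:.2f} K")
```

Because conduction is neglected, this gives an upper bound on the per-pulse temperature excursion; a thermomechanical calculation is still needed for stresses, as the abstract notes.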

  2. Using the FLUKA Monte Carlo Code to Simulate the Interactions of Ionizing Radiation with Matter to Assist and Aid Our Understanding of Ground Based Accelerator Testing, Space Hardware Design, and Secondary Space Radiation Environments

    Science.gov (United States)

    Reddell, Brandon

    2015-01-01

    Designing hardware to operate in the space radiation environment is a very difficult and costly activity. Ground based particle accelerators can be used to test for exposure to the radiation environment, one species at a time; however, the actual space environment cannot be duplicated because of the range of energies and the isotropic nature of space radiation. The FLUKA Monte Carlo code is an integrated physics package based at CERN that has been under development for more than 40 years and includes the most up-to-date fundamental physics theory and particle physics data. This work presents an overview of FLUKA and how it has been used in conjunction with ground based radiation testing for NASA to improve our understanding of secondary particle environments resulting from the interaction of space radiation with matter.

  3. Measurements and FLUKA simulations of bismuth and aluminium activation at the CERN Shielding Benchmark Facility (CSBF)

    Science.gov (United States)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.

    2018-03-01

    The CERN High Energy AcceleRator Mixed field facility (CHARM) is located in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7 × 10^10 p/s, which then impacts on the CHARM target. The shielding of the CHARM facility also includes the CERN Shielding Benchmark Facility (CSBF) situated laterally above the target. This facility consists of 80 cm of cast iron and 360 cm of concrete, with barite concrete in some places. Activation samples of bismuth and aluminium were placed in the CSBF and in the CHARM access corridor in July 2015. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields for these samples. The results estimated by the FLUKA Monte Carlo simulations are compared to activation measurements of these samples. The comparison between FLUKA simulations and the measured values from γ-spectrometry gives an agreement better than a factor of 2.

  4. Technical Note: Defining cyclotron-based clinical scanning proton machines in a FLUKA Monte Carlo system.

    Science.gov (United States)

    Fiorini, Francesca; Schreuder, Niek; Van den Heuvel, Frank

    2018-02-01

    Cyclotron-based pencil beam scanning (PBS) proton machines nowadays represent the majority and the most affordable choice for proton therapy facilities; however, their representation in Monte Carlo (MC) codes is more complex than for passively scattered proton systems or synchrotron-based PBS machines. This is because degraders are used to decrease the energy from the cyclotron maximum energy to the desired energy, resulting in a unique spot size, divergence, and energy spread depending on the amount of degradation. This manuscript outlines a generalized methodology to characterize a cyclotron-based PBS machine in a general-purpose MC code. The code can then be used to generate clinically relevant plans starting from commercial TPS plans. The described beam is produced at the Provision Proton Therapy Center (Knoxville, TN, USA) using cyclotron-based IBA Proteus Plus equipment. We characterized the Provision beam in the MC code FLUKA using the experimental commissioning data. The code was then validated using experimental data in water phantoms for single pencil beams and larger irregular fields. Comparisons with RayStation TPS plans are also presented. Comparisons of experimental, simulated, and planned dose depositions in water show that the same doses are calculated by both programs inside the target areas, while penumbra differences are found at the field edges. These differences are lower for the MC, with a γ(3%/3 mm) index never below 95%. Extensive explanations of how MC codes can be adapted to simulate cyclotron-based scanning proton machines are given, with the aim of using the MC as a TPS verification tool to check and improve clinical plans. For all the tested cases, we showed that dose differences with experimental data are lower for the MC than for the TPS, implying that the created FLUKA beam model is better able to describe the experimental beam. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
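A γ(3%/3 mm) comparison like the one quoted can be sketched for 1-D dose profiles as follows; the Gaussian profiles are synthetic stand-ins for measured and simulated data, and a brute-force global gamma is assumed rather than any particular TPS implementation.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index: dose tolerance relative to the max reference
    dose, distance tolerance in mm. Brute-force search over all points."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_ref, dtype=float)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dd = (d_eval - dr) / (dose_tol * d_max)   # dose difference term
        dx = (x_eval - xr) / dist_tol             # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

x = np.linspace(0.0, 100.0, 101)          # mm
ref = np.exp(-((x - 50.0) / 15.0) ** 2)   # toy "measured" profile
ev = np.exp(-((x - 50.5) / 15.0) ** 2)    # slightly shifted "simulated" one
g = gamma_1d(x, ref, x, ev)
pass_rate = 100.0 * (g <= 1.0).mean()
print(f"gamma(3%/3 mm) pass rate: {pass_rate:.1f}%")
```

For clinical 2-D/3-D distributions the same definition applies with the distance term taken over all spatial axes.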

  5. FLUKA simulations of a moderated reduced weight high energy neutron detection system

    Energy Technology Data Exchange (ETDEWEB)

    Biju, K., E-mail: bijusivolli@gmail.com [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India); Tripathy, S.P.; Sunil, C.; Sarkar, P.K. [Health Physics Division, Bhabha Atomic Research Centre, Mumbai 400085 (India)

    2012-08-01

    The neutron response of systems containing high density polyethylene (HDPE) spheres coupled with different external metallic converters has been studied using the FLUKA Monte Carlo simulation code. A diameter of 17.8 cm (7 in.) for the moderating sphere is found to be optimum to obtain the maximum response when used with neutron converter shells of W, Pb and Zr. Enhancement ratios of the neutron response due to the induced (n, xn) reactions in the outer converters made of W, Pb and Zr are analyzed. It is observed that the enhancement in the response by a 1 cm thick Zr shell is comparable to that of 1 cm thick Pb in the energy region of 10-50 MeV. An appreciable enhancement is observed in the case of the Zr converter for higher energy neutrons. Thus, by reducing the dimension of the moderating sphere and using a Zr converter shell, the weight of the system reduces to 10 kg, which is less than that of the presently available extended high energy neutron rem meters. The normalized energy dependent ambient dose equivalent response of the zirconium based rem counter (ZReC) at high energies is found to be in good agreement with the energy differential H{sup *}(10) values suggested by the International Commission on Radiological Protection (ICRP). Based on this study, it is proposed that a rem meter made of a 17.8 cm diameter HDPE sphere with 1 cm thick Zr can be used effectively and conveniently for routine monitoring in the accelerator environment.

  6. Simulation of ALTEA calibration data with PHITS, FLUKA and GEANT4

    International Nuclear Information System (INIS)

    La Tessa, C.; Di Fino, L.; Larosa, M.; Lee, K.; Mancusi, D.; Matthiae, D.; Narici, L.; Zaconte, V.

    2009-01-01

    The ALTEA-Space detector has been calibrated by testing its response to several monochromatic beams. These measurements provided energy-deposition spectra in silicon for 100, 600 and 1000 MeV/nucleon 12C and 200 and 600 MeV/nucleon 48Ti. The results have been compared to three Monte Carlo transport codes, namely PHITS, GEANT4 and FLUKA. The median, full width at half maximum (FWHM) and interquartile range (IQR) have been calculated for all datasets to characterize the location, width and asymmetry of the energy-deposition spectra. Particular attention has been devoted to the influence of δ rays on the shape of the energy-deposition spectrum, both with the help of analytical calculations and of Monte Carlo simulations. The two approaches proved that, in this range of projectile charge, projectile energy and detector size, the leakage of secondary electrons might introduce a difference between the energy-loss and energy-deposition spectra, in particular by changing the location, width and symmetry of the distribution. The overall agreement between the Monte Carlo predictions and the measurements is fair and makes PHITS, FLUKA and GEANT4 all possible candidates for simulating the ALTEA-Space experiment.
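The location, width and asymmetry statistics used above (median, FWHM, IQR) can be extracted from a binned energy-deposition spectrum in a few lines; the Gaussian test histogram below is synthetic, not ALTEA data.

```python
import numpy as np

def spectrum_stats(edges, counts):
    """Median, FWHM and interquartile range (IQR) of a binned spectrum.
    FWHM is taken as the span of bin centers above half the peak count."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    cdf = np.cumsum(counts) / counts.sum()
    median = np.interp(0.5, cdf, centers)
    q1, q3 = np.interp([0.25, 0.75], cdf, centers)
    above = centers[counts >= 0.5 * counts.max()]
    return median, above.max() - above.min(), q3 - q1

rng = np.random.default_rng(0)
deposits = rng.normal(100.0, 5.0, 100_000)          # toy spectrum, MeV
counts, edges = np.histogram(deposits, bins=200, range=(70.0, 130.0))
median, fwhm, iqr = spectrum_stats(edges, counts)
print(f"median={median:.1f}  FWHM={fwhm:.1f}  IQR={iqr:.1f}")
```

For a Gaussian the expected values are FWHM ≈ 2.355σ and IQR ≈ 1.349σ; comparing FWHM/IQR against these ratios is one quick way to quantify the asymmetry that δ-ray leakage introduces.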

  7. Monte Carlo FLUKA code simulation for study of {sup 68}Ga production by direct proton-induced reaction

    Energy Technology Data Exchange (ETDEWEB)

    Mokhtari Oranj, Leila; Kakavand, Tayeb [Physics Faculty, Zanjan University, P.O. Box 451-313, Zanjan (Iran, Islamic Republic of); Sadeghi, Mahdi, E-mail: msadeghi@nrcam.org [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of); Aboudzadeh Rovias, Mohammadreza [Agricultural, Medical and Industrial Research School, Nuclear Science and Technology Research Institute, P.O. Box 31485-498, Karaj (Iran, Islamic Republic of)

    2012-06-11

    {sup 68}Ga is an important radionuclide for positron emission tomography. {sup 68}Ga can be produced via the {sup 68}Zn(p,n){sup 68}Ga reaction in common biomedical cyclotrons. To facilitate optimization of the target design and to study activation of materials, a Monte Carlo code can be used to simulate the irradiation of the target materials with charged hadrons. In this paper, a FLUKA simulation was employed to prototype a Zn target for the production of {sup 68}Ga by proton irradiation. Furthermore, the experimental data were compared with the thick-target yields estimated with the FLUKA code for the given irradiation time. In conclusion, the FLUKA code can be used for estimation of the production yield.
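Hedged sketch of the thick-target yield integral behind estimates like the one above: the number of 68Ga nuclei produced per incident proton is n = (N_A ρ / M) ∫ σ(E) / (dE/dx) dE between the exit and entrance energies. The excitation function and stopping power below are crude illustrative stand-ins, not evaluated nuclear data for 68Zn(p,n)68Ga.

```python
import numpy as np

N_A = 6.022e23            # Avogadro's number, 1/mol
M = 68.0                  # g/mol, zinc-68
rho = 7.14                # g/cm^3, zinc

def sigma_cm2(E_mev):
    """Toy (p,n) excitation function peaking near 11 MeV, in cm^2."""
    return 800e-27 * np.exp(-((E_mev - 11.0) / 3.0) ** 2)   # 800 mb peak

def dedx_mev_per_cm(E_mev):
    """Toy stopping power with a rough 1/E energy dependence."""
    return 350.0 / E_mev

E = np.linspace(4.0, 14.0, 500)        # MeV: exit 4 MeV, entrance 14 MeV
integrand = sigma_cm2(E) / dedx_mev_per_cm(E)
integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E))
n_per_proton = (N_A * rho / M) * integral
print(f"~{n_per_proton:.1e} 68Ga nuclei per incident proton")
```

A full MC simulation goes beyond this 1-D slowing-down picture by including straggling, secondary particles and the real target geometry, which is exactly what FLUKA adds.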

  8. Assessment of the production of medical isotopes using the Monte Carlo code FLUKA: Simulations against experimental measurements

    Energy Technology Data Exchange (ETDEWEB)

    Infantino, Angelo, E-mail: angelo.infantino@unibo.it [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Oehlke, Elisabeth [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada); Department of Radiation Science & Technology, Delft University of Technology, Postbus 5, 2600 AA Delft (Netherlands); Mostacci, Domiziano [Department of Industrial Engineering, Montecuccolino Laboratory, University of Bologna, Via dei Colli 16, 40136 Bologna (Italy); Schaffer, Paul; Trinczek, Michael; Hoehr, Cornelia [TRIUMF, 4004 Wesbrook Mall, V6T 2A3 Vancouver, BC (Canada)

    2016-01-01

    The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, {sup 18}F, {sup 13}N, {sup 94}Tc, {sup 44}Sc, {sup 68}Ga, {sup 86}Y, {sup 89}Zr, {sup 52}Mn, {sup 61}Cu and {sup 55}Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of {sup 55}Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.

  9. Monte Carlo simulation of secondary neutron dose for scanning proton therapy using FLUKA.

    Directory of Open Access Journals (Sweden)

    Chaeyeong Lee

    Proton therapy is a rapidly progressing field of cancer treatment. Globally, many proton therapy facilities are being commissioned or are under construction. Secondary neutrons are an important issue during the commissioning process of a proton therapy facility. The purpose of this study is to model and validate the scanning nozzles of the proton therapy system at Samsung Medical Center (SMC) by Monte Carlo simulation for beam commissioning. After the commissioning, the secondary neutron ambient dose from a proton scanning nozzle (Gantry 1) was simulated and measured. The simulation was performed to evaluate beam properties such as the percent depth dose curve, Bragg peak, and distal fall-off, so that they could be verified against measured data. Using the validated beam nozzle, the secondary neutron ambient dose was simulated and then compared with the ambient dose measured at Gantry 1. We calculated the secondary neutron dose at several different points. We demonstrated the validity of modeling a proton scanning nozzle system to evaluate various parameters using FLUKA. The measured secondary neutron ambient dose showed a tendency similar to the simulation result. This work will increase the knowledge necessary for the development of radiation safety technology in medical particle accelerators.

  10. FLUKA and PENELOPE simulations of 10 keV to 10 MeV photons in LYSO and soft tissue

    CERN Document Server

    Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Sala, P R

    2014-01-01

    Monte Carlo simulations of electromagnetic particle interactions and transport by FLUKA and PENELOPE were compared. Incident photon beams of 10 keV to 10 MeV impinged on a LYSO crystal and a soft-tissue phantom. Central-axis as well as off-axis depth doses agreed within 1 s.d.; no systematic under- or overestimate of the pulse height spectra was observed from 100 keV to 10 MeV for either material, and agreement was within 5%. Simulation of photon and electron transport and interactions at this level of precision and reliability is of significant impact, for instance, on treatment monitoring in hadrontherapy, where a code like FLUKA is needed to simulate the full suite of particles and interactions (not just electromagnetic). At the interaction-by-interaction level, apart from known differences in condensed history techniques, two-quanta positron annihilation at rest was found to differ between the two codes. PENELOPE produced a sharp 511 keV line, whereas FLUKA produced visible acolinearity, a feature recently implemented...

  11. Estimation of neutron production from accelerator head assembly of 15 MV medical LINAC using FLUKA simulations

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T., E-mail: sharad@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Pethe, S.N., E-mail: sanjay@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Krishnan, R., E-mail: krishnan@sameer.gov.in [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Bhoraskar, V.N., E-mail: vnb@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ac.in [Department of Physics, University of Pune, Pune 411 007 (India)

    2011-12-15

    For the production of a clinical 15 MeV photon beam, the design of the accelerator head assembly has been optimized using the Monte Carlo based FLUKA code. The accelerator head assembly consists of an e-{gamma} target, a flattening filter, a primary collimator and an adjustable rectangular secondary collimator. The accelerators used for radiation therapy generate continuous-energy gamma rays, i.e. Bremsstrahlung (BR), by impinging high energy electrons on high-Z materials. Electron accelerators operating above 10 MeV can also produce neutrons, mainly through the photonuclear reaction ({gamma}, n) induced by high energy photons in the accelerator head materials. These neutrons contaminate the therapeutic beam and give a non-negligible contribution to the patient dose. The gamma dose and the neutron dose equivalent at the patient plane (SSD = 100 cm) were obtained for field sizes of 0 × 0, 10 × 10, 20 × 20, 30 × 30 and 40 × 40 cm{sup 2}. The maximum neutron dose equivalent is observed near the central axis for the 30 × 30 cm{sup 2} field size; it is 0.71% of the central axis photon dose rate of 0.34 Gy/min at 1 {mu}A electron beam current.
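The quoted numbers can be cross-checked directly; the sketch below treats the Gy-to-Sv conversion loosely, as a back-of-envelope estimate only.

```python
# Quick cross-check of the figures quoted in the abstract: the maximum
# neutron dose equivalent is 0.71% of a central-axis photon dose rate of
# 0.34 Gy/min (at 1 uA electron beam current).
photon_dose_rate = 0.34          # Gy/min
neutron_fraction = 0.0071        # 0.71%
neutron_rate = photon_dose_rate * neutron_fraction
print(f"neutron dose equivalent rate ~ {neutron_rate * 1e3:.2f} mSv/min")
```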

  12. Measurements and FLUKA Simulations of Bismuth, Aluminium and Indium Activation at the upgraded CERN Shielding Benchmark Facility (CSBF)

    Science.gov (United States)

    Iliopoulou, E.; Bamidis, P.; Brugger, M.; Froeschl, R.; Infantino, A.; Kajimoto, T.; Nakao, N.; Roesler, S.; Sanami, T.; Siountas, A.; Yashima, H.

    2018-06-01

    The CERN High energy AcceleRator Mixed field (CHARM) facility is situated in the CERN Proton Synchrotron (PS) East Experimental Area. The facility receives a pulsed proton beam from the CERN PS with a beam momentum of 24 GeV/c, 5 × 10^11 protons per pulse, a pulse length of 350 ms and a maximum average beam intensity of 6.7 × 10^10 protons per second. The extracted proton beam impacts on a cylindrical copper target. The shielding of the CHARM facility includes the CERN Shielding Benchmark Facility (CSBF), situated laterally above the target, which allows deep shielding penetration benchmark studies of various shielding materials. This facility was significantly upgraded during the extended technical stop at the beginning of 2016. It now consists of 40 cm of cast iron shielding, a 200 cm long removable sample holder concrete block with 3 inserts for activation samples, and a material test location that is used for the measurement of the attenuation length of different shielding materials as well as for sample activation at different thicknesses of the shielding materials. Activation samples of bismuth, aluminium and indium were placed in the CSBF in September 2016 to characterize the upgraded version of the CSBF. Monte Carlo simulations with the FLUKA code have been performed to estimate the specific production yields of bismuth isotopes (206Bi, 205Bi, 204Bi, 203Bi, 202Bi, 201Bi) from 209Bi, of 24Na from 27Al and of 115mIn from 115In for these samples. The production yields estimated by the FLUKA Monte Carlo simulations are compared to the production yields obtained from γ-spectroscopy measurements of the samples, taking the beam intensity profile into account. The agreement between FLUKA predictions and γ-spectroscopy measurements for the production yields is at the level of a factor of 2.
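Taking the beam intensity profile into account, as mentioned above, amounts to letting the activity produced by each pulse decay independently until the end of bombardment. A hedged sketch follows; the pulse structure, half-life and measured activity are illustrative, not the actual CHARM beam record.

```python
import math

def produced_atoms_per_unit_yield(pulse_times_s, pulse_protons, half_life_s,
                                  t_end_s):
    """Atoms present at t_end per unit production yield (atoms/proton):
    each pulse's contribution decays as exp(-lambda * (t_end - t_pulse))."""
    lam = math.log(2.0) / half_life_s
    return sum(protons * math.exp(-lam * (t_end_s - t))
               for t, protons in zip(pulse_times_s, pulse_protons))

# Illustrative: a 15 h half-life product (e.g. 24Na) made by ten pulses of
# 5e11 protons spaced 45 s apart.
times = [i * 45.0 for i in range(10)]
protons = [5e11] * 10
factor = produced_atoms_per_unit_yield(times, protons, 15.0 * 3600.0, times[-1])

# A measured end-of-bombardment activity A relates to the specific yield Y
# (atoms per proton) via A = lambda * Y * factor.
A_measured = 1.0e3                       # Bq, illustrative
lam = math.log(2.0) / (15.0 * 3600.0)
Y = A_measured / (lam * factor)
print(f"specific yield ~ {Y:.2e} atoms per proton")
```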

  13. R and D on automatic modeling methods for Monte Carlo codes FLUKA

    International Nuclear Information System (INIS)

    Wang Dianxi; Hu Liqin; Wang Guozhong; Zhao Zijia; Nie Fanzhi; Wu Yican; Long Pengcheng

    2013-01-01

    FLUKA is a fully integrated particle physics Monte Carlo simulation package. It is necessary to create geometry models before calculation; however, it is time-consuming and error-prone to describe geometry models manually. This study developed an automatic modeling method which can automatically convert computer-aided design (CAD) geometry models into FLUKA models. The conversion program was integrated into the CAD/image-based automatic modeling program for nuclear and radiation transport simulation (MCAM). Its correctness has been demonstrated. (authors)

  14. Neutron spectrometry using LNL bonner spheres and FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Sarchiapone, L.; Zafiropoulos, D. [INFN, Laboratori Nazionali di Legnaro (Italy)

    2013-07-18

    The characterization of neutron fields has been made with a system based on a scintillation detector and multiple moderating spheres. The system, together with the unfolding procedure, has been tested in quasi-monochromatic neutron energy fields and in complex, mixed, cyclotron-based environments. FLUKA simulations have been used to produce the response functions and reference energy spectra.
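The unfolding procedure itself is not specified in the abstract; one common choice for few-channel Bonner-sphere data is an iterative multiplicative (MLEM/SAND-II style) update, sketched here with a synthetic response matrix rather than the LNL one.

```python
import numpy as np

def unfold(R, m, n_iter=2000):
    """MLEM-style multiplicative unfolding: given response matrix R
    (spheres x energy groups) and measured counts m, iterate a non-negative
    group fluence phi so that R @ phi approaches m."""
    phi = np.full(R.shape[1], m.sum() / R.sum())   # flat starting guess
    for _ in range(n_iter):
        pred = R @ phi
        # back-project the measured/predicted ratio, sensitivity-normalized
        phi *= (R.T @ (m / pred)) / R.sum(axis=0)
    return phi

rng = np.random.default_rng(1)
R = rng.uniform(0.1, 1.0, size=(6, 4))     # 6 spheres, 4 energy groups (toy)
phi_true = np.array([3.0, 1.0, 0.5, 2.0])
m = R @ phi_true                           # noise-free "measurement"
phi = unfold(R, m)
print("unfolded group fluences:", np.round(phi, 3))
```

Real unfolding codes add regularization and a physically motivated guess spectrum, since the problem is strongly under-determined for realistic group structures.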

  15. Simulation of equivalent dose due to accidental electron beam loss in Indus-1 and Indus-2 synchrotron radiation sources using FLUKA code

    International Nuclear Information System (INIS)

    Sahani, P.K.; Dev, Vipin; Singh, Gurnam; Haridas, G.; Thakkar, K.K.; Sarkar, P.K.; Sharma, D.N.

    2008-01-01

    Indus-1 and Indus-2 are two synchrotron radiation sources at the Raja Ramanna Centre for Advanced Technology (RRCAT), India. The stored electron energies in Indus-1 and Indus-2 are 450 MeV and 2.5 GeV respectively. During operation of the storage ring, accidental electron beam loss may occur in addition to normal beam losses. The Bremsstrahlung radiation produced by these beam losses creates a major radiation hazard in these high energy electron accelerators. FLUKA, the Monte Carlo radiation transport code, is used to simulate the accidental beam loss. The simulation was carried out to estimate the equivalent dose likely to be received by a person trapped close to the storage ring. Depth dose profiles in a water phantom for 450 MeV and 2.5 GeV electron beams were generated, from which the percentage of energy absorbed in a 30 cm water phantom (analogous to the human body) was calculated. The simulation showed that the percentage energy deposition in the phantom is about 19% for 450 MeV electrons and 4.3% for 2.5 GeV electrons. The dose build-up factors in the 30 cm water phantom for 450 MeV and 2.5 GeV electron beams are found to be 1.85 and 2.94 respectively. Based on the depth dose profile, dose equivalent indices of 0.026 Sv and 1.08 Sv are likely to be received by a trapped person near the storage ring in Indus-1 and Indus-2 respectively. (author)
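Both quantities quoted above, the build-up factor and the energy absorbed in the phantom, are read off the scored depth-dose profile. A sketch with a toy shower-like profile (not the Indus simulation output) looks like:

```python
import numpy as np

z = np.linspace(0.0, 30.0, 301)                 # cm depth in water
# toy electromagnetic-shower-like depth-dose shape, arbitrary units
dose = (1.0 + z) ** 2 * np.exp(-z / 3.0)

# build-up factor: peak dose over entrance (surface) dose
build_up = dose.max() / dose[0]
print(f"build-up factor: {build_up:.2f}")

# integral of the profile over the 30 cm phantom (trapezoidal rule);
# dividing by the incident beam energy would give the absorbed fraction
deposited = np.sum(0.5 * (dose[1:] + dose[:-1]) * np.diff(z))
print(f"integrated depth dose: {deposited:.1f} (arb. units x cm)")
```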

  16. Use experience of FLUKA

    Energy Technology Data Exchange (ETDEWEB)

    Nakashima, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-03-01

    In order to carry out shield design calculations for the Large Hadron Collider (LHC) currently planned at CERN, the CERN radiation group uses FLUKA (a Monte Carlo high energy radiation transport code). An outline of FLUKA and experience of its use in the LHC-B detector shield design calculation for the LHC project are presented here. FLUKA can be regarded as the world's leading high energy radiation transport code in terms of its physics models, its Monte Carlo calculation techniques and its ease of use. The Japan Atomic Energy Research Institute (JAERI) has obtained the right to use FLUKA for target neutronics and facility shielding design at the neutron science research center, where it is expected to be an effective design tool. However, because FLUKA is only partially open and cannot be independently verified, validating design results obtained with it is expected to remain a significant problem. (K.G.)

  17. FLUKA shielding calculations for the FAIR project

    International Nuclear Information System (INIS)

    Fehrenbacher, Georg; Kozlova, Ekaterina; Radon, Torsten; Sokolov, Alexey

    2015-01-01

    FAIR is an international accelerator project under construction at the GSI Helmholtz Centre for heavy ion research in Darmstadt. The Monte Carlo program FLUKA is used to study radiation protection problems. This contribution deals with the general application possibilities of FLUKA and with its use in radiation protection planning for FAIR. The necessity of simulating radiation transport through shielding of several metres thickness, and of determining the equivalent doses outside the shielding with sufficient accuracy, is demonstrated using two examples employing variance reduction. Results of simulation calculations for activation estimates in accelerator facilities are also presented.

  18. A Simple Ripple Filter for FLUKA

    DEFF Research Database (Denmark)

    Bassler, Niels; Herrmann, Rochus

    In heavy ion radiotherapy, pristine C-12 beams are usually widened by a few mm (FWHM) along the beam axis before the actual spread-out Bragg peak (SOBP) is built. The pristine beam widening is commonly performed with a ripple filter, known from the facility at GSI (Darmstadt) and at HIT (Heidelberg). The ripple filter at GSI and HIT consists of several wedge-like structures, which widen the Bragg peak up to e.g. 3 mm. For Monte Carlo simulations of C-12 therapy, the exact setup, including the ripple filter, needs to be simulated. In the Monte Carlo particle transport program FLUKA, the ripple filter can… Since the ripple filter is a periodic geometry, one could use the LATTICE card with advantage, but here we shall take a Monte Carlo based approach instead. The advantage of this method is that our input file merely contains one body as the ripple filter, which can be a flat slab (or any other arbitrary…
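The "Monte Carlo based approach" hinted at in the truncated abstract can be mimicked outside FLUKA as well: instead of modelling every wedge explicitly, each particle samples the filter thickness it traverses from the wedge profile. For an ideal triangular groove of peak thickness t_max, that thickness is uniform in [0, t_max]; the 3 mm value below is illustrative.

```python
import random

def sample_filter_thickness(t_max_mm, rng=random.random):
    """Thickness of ripple-filter material traversed by one particle.
    For a triangular wedge profile the traversed thickness is uniformly
    distributed between 0 and the peak thickness t_max."""
    return t_max_mm * rng()

random.seed(42)
samples = [sample_filter_thickness(3.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"mean traversed thickness ~ {mean:.3f} mm (expect ~1.5 mm)")
```

The spread of sampled thicknesses translates into a spread of residual ranges, which is exactly the Bragg-peak widening the physical filter produces.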

  19. Shielding calculations using FLUKA

    International Nuclear Information System (INIS)

    Yamaguchi, Chiri; Tesch, K.; Dinter, H.

    1988-06-01

    The dose equivalent on the surface of concrete shielding has been calculated using the Monte Carlo code FLUKA86 for incident proton energies from 10 to 800 GeV. The results have been compared with some simple equations. The value of the angular-dependent parameter in Moyer's equation has been calculated from the locations at which the maximum dose equivalent occurs. (author)
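Moyer's semi-empirical model referenced above is commonly written as H = (H0 Ep^0.8 / r^2) exp(-beta*theta) exp(-d / (lambda sin(theta))), with beta the angular-dependent parameter. The sketch below uses placeholder parameter values (source term, beta, attenuation length) purely for illustration, not the values fitted in the paper.

```python
import math

def moyer_dose(Ep_gev, r_m, d_cm, theta_rad,
               H0=1.0e-14, beta=2.3, lam_cm=50.0):
    """Dose equivalent per proton outside lateral concrete shielding in the
    Moyer model. H0 (Sv*m^2), beta (1/rad) and lam_cm (effective attenuation
    length in concrete, cm) are illustrative placeholders."""
    slant = d_cm / math.sin(theta_rad)        # slant thickness through shield
    return (H0 * Ep_gev ** 0.8 / r_m ** 2) \
        * math.exp(-beta * theta_rad) \
        * math.exp(-slant / lam_cm)

h90 = moyer_dose(100.0, 5.0, 200.0, math.pi / 2)
h60 = moyer_dose(100.0, 5.0, 200.0, math.pi / 3)
print(f"dose ratio H(90 deg)/H(60 deg): {h90 / h60:.2f}")
```

Locating the angle at which the product of the angular and slant-thickness factors peaks is what allows beta to be extracted from the position of the maximum dose equivalent, as done in the abstract.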

  20. Simulation of electron, positron and Bremsstrahlung spectrum generated due to electromagnetic cascade by 2.5 GeV electron hitting lead target using FLUKA code

    International Nuclear Information System (INIS)

    Sahani, P.K.; Dev, Vipin; Haridas, G.; Thakkar, K.K.; Singh, Gurnam; Sarkar, P.K.; Sharma, D.N.

    2009-01-01

    INDUS-2 is a high energy electron accelerator facility in which electrons are accelerated in a circular ring up to a maximum energy of 2.5 GeV to generate synchrotron radiation. During normal operation of the machine, a fraction of these electrons is lost; these interact with accelerator structures and components such as the vacuum chamber, and with residual gases in the cavity, and hence generate a significant amount of Bremsstrahlung radiation. The Bremsstrahlung radiation depends strongly on the incident electron energy and on the target material and its thickness, and it dominates the radiation environment in such electron storage rings. Because of its broad spectrum, extending up to the incident electron energy, and its pulsed nature, it is very difficult to separate the Bremsstrahlung component from the mixed field environment in accelerators. With the help of the FLUKA Monte Carlo code, the Bremsstrahlung spectrum generated by 2.5 GeV electrons bombarding a high-Z lead target was simulated. To study the variation of the Bremsstrahlung spectrum with target thickness, lead targets of 3, 6, 9, 12, 15 and 18 mm thickness were used. The energy spectra of the emerging electrons and positrons were also simulated. The study suggests that as the target thickness increases, the emergent Bremsstrahlung photon fluence increases, with the spectrum becoming increasingly dominated by its low energy part and depleted at high energies. The electron and positron spectra also extend up to the incident electron energy. (author)

  1. CERN Technical Training 2008: Learning for the LHC! FLUKA Workshop 2008: 23-27 June 2008

    CERN Multimedia

    2008-01-01

    http://www.cern.ch/Fluka2008 FLUKA is a fully integrated particle physics Monte-Carlo simulation package. It has many applications in high energy experimental physics and engineering, shielding, detector and telescope design, cosmic ray studies, dosimetry, medical physics and radio-biology. More information, as well as related publications, can be found on the official FLUKA website (http://www.fluka.org). This year, the CERN FLUKA Team, in collaboration with INFN and SC/RP, is organizing a FLUKA beginners course, held at CERN for the first time. Previous one-week courses were given in Frascati (Italy), twice in Houston (Texas, US), Pavia (Italy), as well as in Legnaro (Italy). At CERN, continuous lectures are provided in the framework of locally scheduled ‘FLUKA User Meetings’ (http://www.cern.ch/info-fluka-discussion). This new dedicated one-week CERN training course will be an opportunity for new users to learn the basics of FLUKA, as well as offering the possibility to broaden their knowledge about t...

  3. Flukacad/Pipsicad: three-dimensional interfaces between Fluka and Autocad

    International Nuclear Information System (INIS)

    Helmut Vincke

    2001-01-01

    FLUKA is a widely used 3-D particle transport program. Up to now there has been no possibility to display the simulation geometry or the calculated tracks in three dimensions; FLUKA itself offers only an option to picture two-dimensional cuts through the geometry used. This paper describes two interface programs between the particle transport code FLUKA and the CAD program AutoCAD. These programs provide a three-dimensional facility not only for illustrating the simulated FLUKA geometry (FLUKACAD), but also for picturing simulated particle tracks (PIPSICAD) in a three-dimensional set-up. Additionally, the programming strategy for connecting FLUKA with AutoCAD is shown. A number of useful features of the programs themselves, and also of AutoCAD in the context of FLUKACAD and PIPSICAD, are explained. (authors)

  4. SU-E-T-590: Optimizing Magnetic Field Strengths with Matlab for An Ion-Optic System in Particle Therapy Consisting of Two Quadrupole Magnets for Subsequent Simulations with the Monte-Carlo Code FLUKA

    International Nuclear Information System (INIS)

    Baumann, K; Weber, U; Simeonov, Y; Zink, K

    2015-01-01

    Purpose: The aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility consisting of the beam tube, two quadrupole magnets and a beam monitor system was calculated with Matlab, using matrices that solve the equation of motion of a charged particle in a magnetic field and in a field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte Carlo code FLUKA, and the transport of 80 MeV/u 12C ions through this ion-optic system was calculated using a user routine to implement the magnetic fields. The fluence along the beam axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized with Matlab and transferred to the Monte Carlo code FLUKA. The implementation via a user routine was successful. Analyzing the fluence pattern along the beam axis, the characteristic focusing and defocusing effects of the quadrupole magnets could be reproduced. Furthermore, the beam spot at the iso-center was circular and significantly thinner than an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system.
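The matrix formalism described above can be sketched in a few lines: a drift of length L and a thick-lens quadrupole act on the transverse phase-space vector (x, x') as 2×2 matrices. All lengths and strengths below are illustrative, not the optimized values from the study.

```python
import numpy as np

def drift(L):
    """Field-free drift of length L (m)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def quad_focusing(k, L):
    """Thick-lens focusing quadrupole, strength k (1/m^2), length L (m)."""
    s = np.sqrt(k)
    return np.array([[np.cos(s * L), np.sin(s * L) / s],
                     [-s * np.sin(s * L), np.cos(s * L)]])

def quad_defocusing(k, L):
    """Same quadrupole seen in the defocusing plane."""
    s = np.sqrt(k)
    return np.array([[np.cosh(s * L), np.sinh(s * L) / s],
                     [s * np.sinh(s * L), np.cosh(s * L)]])

# beamline: drift, QF, drift, QD, drift to the iso-center
# (rightmost matrix acts first on the incoming ray)
M = drift(1.0) @ quad_defocusing(2.0, 0.3) @ drift(0.5) \
    @ quad_focusing(2.0, 0.3) @ drift(1.0)

x0 = np.array([1e-3, 0.0])          # 1 mm offset, initially parallel ray
print("final (x, x'):", M @ x0)
```

Optimizing the quadrupole strengths then amounts to tuning k (per magnet) so that the composite matrix maps the incoming beam ellipse onto a small, round spot at the iso-center.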

  5. TU-EF-304-10: Efficient Multiscale Simulation of the Proton Relative Biological Effectiveness (RBE) for DNA Double Strand Break (DSB) Induction and Bio-Effective Dose in the FLUKA Monte Carlo Radiation Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Moskvin, V; Tsiamas, P; Axente, M; Farr, J [St. Jude Children’s Research Hospital, Memphis, TN (United States); Stewart, R [University of Washington, Seattle, WA. (United States)

    2015-06-15

    Purpose: One of the more critical initiating events for reproductive cell death is the creation of a DNA double strand break (DSB). In this study, we present a computationally efficient way to determine spatial variations in the relative biological effectiveness (RBE) of proton therapy beams within the FLUKA Monte Carlo (MC) code. Methods: We used the independently tested Monte Carlo Damage Simulation (MCDS) developed by Stewart and colleagues (Radiat. Res. 176, 587–602, 2011) to estimate the RBE for DSB induction of monoenergetic protons, tritium, deuterium, helium-3 and helium-4 ions, and delta-electrons. The dose-weighted RBE coefficients were incorporated into FLUKA to determine the equivalent {sup 60}Co γ-ray dose for representative proton beams incident on cells in an aerobic and anoxic environment. Results: We found that the proton beam RBE for DSB induction at the tip of the Bragg peak, including primary and secondary particles, is close to 1.2. Furthermore, the RBE increases lateral to the beam axis in the region of the Bragg peak. At the distal edge, the RBE is in the range 1.3–1.4 for cells irradiated under aerobic conditions and may be as large as 1.5–1.8 for cells irradiated under anoxic conditions. Across the plateau region, the recorded RBE for DSB induction is 1.02 for aerobic cells and 1.05 for cells irradiated under anoxic conditions. The contribution to total effective dose from secondary heavy ions decreases with depth and is higher at shallow depths (e.g., at the surface of the skin). Conclusion: Multiscale simulation of the RBE for DSB induction provides useful insights into spatial variations in proton RBE within pristine Bragg peaks. This methodology is potentially useful for the biological optimization of proton therapy for the treatment of cancer. The study highlights the need to incorporate spatial variations in proton RBE into proton therapy treatment plans.
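The dose-weighting step this abstract describes reduces, per voxel, to an RBE-weighted sum of the dose contributions from each charged-particle species. The RBE coefficients and dose values in this sketch are invented for illustration and are not MCDS output:

```python
# Bio-effective (Co-60 equivalent) dose for one voxel as the RBE-weighted
# sum over contributing particle species: D_eq = sum_i RBE_i * D_i.

def equivalent_dose(dose_by_species, rbe_by_species):
    """Return the RBE-weighted dose (Gy-equivalent) for one voxel."""
    return sum(rbe_by_species[s] * d for s, d in dose_by_species.items())

# Doses (Gy) deposited in a voxel near the distal edge, by species (hypothetical).
dose = {"proton": 1.80, "alpha": 0.05, "electron": 0.15}
# Hypothetical RBE-for-DSB-induction coefficients under aerobic conditions.
rbe = {"proton": 1.35, "alpha": 2.0, "electron": 1.0}

d_eq = equivalent_dose(dose, rbe)  # RBE-weighted dose in Gy
```

Scoring this quantity voxel by voxel during transport is what yields the spatial RBE maps discussed above.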

  6. FLUKA: A Multi-Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan; Fasso, A.; /SLAC; Ranft, J.; /Siegen U.

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  7. FLUKA-LIVE-an embedded framework, for enabling a computer to execute FLUKA under the control of a Linux OS

    International Nuclear Information System (INIS)

    Cohen, A.; Battistoni, G.; Mark, S.

    2008-01-01

    This paper describes a Linux-based OS framework for integrating the FLUKA Monte Carlo software (currently distributed only for Linux) into a CD-ROM, resulting in a complete environment for a scientist to edit, link and run FLUKA routines-without the need to install a UNIX/Linux operating system. The building process includes generating from scratch a complete operating system distribution which will, when operative, build all necessary components for successful operation of FLUKA software and libraries. Various source packages, as well as the latest kernel sources, are freely available from the Internet. These sources are used to create a functioning Linux system that integrates several core utilities in line with the main idea-enabling FLUKA to act as if it was running under a popular Linux distribution or even a proprietary UNIX workstation. On boot-up a file system will be created and the contents from the CD will be uncompressed and completely loaded into RAM-after which the presence of the CD is no longer necessary, and could be removed for use on a second computer. The system can operate on any i386 PC as long as it can boot from a CD

  8. Use of the FLUKA Monte Carlo code for 3D patient-specific dosimetry on PET-CT and SPECT-CT images*

    Science.gov (United States)

    Botta, F; Mairani, A; Hobbs, R F; Vergara Gil, A; Pacilio, M; Parodi, K; Cremonesi, M; Coca Pérez, M A; Di Dia, A; Ferrari, M; Guerriero, F; Battistoni, G; Pedroli, G; Paganelli, G; Torres Aroche, L A; Sgouros, G

    2014-01-01

    Patient-specific absorbed dose calculation for nuclear medicine therapy is a topic of increasing interest. 3D dosimetry at the voxel level is one of the major improvements for the development of more accurate calculation techniques, as compared to the standard dosimetry at the organ level. This study aims to use the FLUKA Monte Carlo code to perform patient-specific 3D dosimetry through direct Monte Carlo simulation on PET-CT and SPECT-CT images. To this aim, dedicated routines were developed in the FLUKA environment. Two sets of simulations were performed on model and phantom images. Firstly, the correct handling of PET and SPECT images was tested under the assumption of homogeneous water medium by comparing FLUKA results with those obtained with the voxel kernel convolution method and with other Monte Carlo-based tools developed for the same purpose (the EGS-based 3D-RD software and the MCNP5-based MCID). Afterwards, the correct integration of the PET/SPECT and CT information was tested, performing direct simulations on PET/CT images for both homogeneous (water) and non-homogeneous (water with air, lung and bone inserts) phantoms. Comparison was performed with the other Monte Carlo tools performing direct simulation as well. The absorbed dose maps were compared at the voxel level. In the case of homogeneous water, by simulating 10^8 primary particles a 2% average difference with respect to the kernel convolution method was achieved; such difference was lower than the statistical uncertainty affecting the FLUKA results. The agreement with the other tools was within 3–4%, partially ascribable to the differences among the simulation algorithms. Including the CT-based density map, the average difference was always within 4% irrespective of the medium (water, air, bone), except for a maximum 6% value when comparing FLUKA and 3D-RD in air. 
The results confirmed that the routines were properly developed, opening the way for the use of FLUKA for patient-specific, image-based
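The voxel-level comparison used above can be sketched with a simple figure of merit: the average relative difference between two absorbed dose maps, restricted to voxels above a dose threshold in the reference map. The metric and threshold here are assumptions for illustration, not the authors' exact analysis:

```python
import numpy as np

def mean_percent_difference(dose_ref, dose_test, threshold=0.0):
    """Average relative difference (%) between two voxel dose maps,
    evaluated only on voxels above `threshold` in the reference map."""
    a = np.asarray(dose_ref, dtype=float)
    b = np.asarray(dose_test, dtype=float)
    mask = a > threshold
    return 100.0 * np.mean(np.abs(a[mask] - b[mask]) / a[mask])

# Tiny 2x2 dose "maps" (Gy) standing in for full 3D voxel grids.
ref = np.array([[1.00, 0.50], [0.25, 0.0]])
test = np.array([[1.02, 0.49], [0.26, 0.0]])
diff = mean_percent_difference(ref, test, threshold=0.1)
```

The threshold avoids dividing by near-zero doses outside the irradiated volume, which would otherwise dominate the average.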

  9. Minimizing the background radiation in the new neutron time-of-flight facility at CERN FLUKA Monte Carlo simulations for the optimization of the n_TOF second experimental line

    CERN Document Server

    Bergström, Ida; Elfgren, Erik

    2013-06-11

    At the particle physics laboratory CERN in Geneva, Switzerland, the Neutron Time-of-Flight facility has recently started the construction of a second experimental line. The new neutron beam line will unavoidably induce radiation in both the experimental area and in nearby accessible areas. Computer simulations for the minimization of the background were carried out using the FLUKA Monte Carlo simulation package. The background radiation in the new experimental area needs to be kept to a minimum during measurements. This was studied with a focus on the contributions from backscattering in the beam dump. The beam dump was originally designed for shielding the outside area using a block of iron covered in concrete. However, the backscattering was never studied in detail. In this thesis, the fluences (i.e. the flux integrated over time) of neutrons and photons were studied in the experimental area while the beam dump design was modified. An optimized design was obtained by stopping the fast neutrons in a high Z mat...

  10. Modelling plastic scintillator response to gamma rays using light transport incorporated FLUKA code

    Energy Technology Data Exchange (ETDEWEB)

    Ranjbar Kohan, M. [Physics Department, Tafresh University, Tafresh (Iran, Islamic Republic of); Etaati, G.R. [Department of Nuclear Engineering and Physics, Amir Kabir University of Technology, Tehran (Iran, Islamic Republic of); Ghal-Eh, N., E-mail: ghal-eh@du.ac.ir [School of Physics, Damghan University, Damghan (Iran, Islamic Republic of); Safari, M.J. [Department of Energy Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of); Afarideh, H. [Department of Nuclear Engineering and Physics, Amir Kabir University of Technology, Tehran (Iran, Islamic Republic of); Asadi, E. [Department of Physics, Payam-e-Noor University, Tehran (Iran, Islamic Republic of)

    2012-05-15

    The response function of NE102 plastic scintillator to gamma rays has been simulated using a joint FLUKA+PHOTRACK Monte Carlo code. The multi-purpose particle transport code, FLUKA, has been responsible for gamma transport, whilst the light transport code, PHOTRACK, has simulated the transport of scintillation photons through the scintillator and lightguide. The simulation results for the plastic scintillator with/without light guides of different surface coverings have been successfully verified against experiments. - Highlights: ► A multi-purpose code (FLUKA) and a light transport code (PHOTRACK) have been linked. ► The hybrid code has been used to generate the response function of an NE102 scintillator. ► The simulated response functions exhibit a good agreement with experimental data.

  11. Testing FLUKA on neutron activation of Si and Ge at nuclear research reactor using gamma spectroscopy

    Science.gov (United States)

    Bazo, J.; Rojas, J. M.; Best, S.; Bruna, R.; Endress, E.; Mendoza, P.; Poma, V.; Gago, A. M.

    2018-03-01

    Samples of two characteristic semiconductor sensor materials, silicon and germanium, have been irradiated with neutrons produced at the RP-10 Nuclear Research Reactor at 4.5 MW. Their radionuclide photon spectra have been measured with high resolution gamma spectroscopy, quantifying four radioisotopes (28Al and 29Al for Si; 75Ge and 77Ge for Ge). We have compared the radionuclide production and the emission spectrum data with Monte Carlo simulation results from FLUKA. Thus we have tested FLUKA's low energy neutron library (ENDF/B-VIIR) and decay photon scoring with respect to the activation of these semiconductors. We conclude that FLUKA is capable of predicting relative photon peak amplitudes, with gamma intensities greater than 1%, of produced radionuclides with an average uncertainty of 13%. This work allows us to estimate the corresponding systematic error on neutron activation simulation studies of these sensor materials.

  12. Experiments and FLUKA simulations of $^{12}C$ and $^{16}O$ beams for therapy monitoring by means of in-beam Positron Emission Tomography

    CERN Document Server

    Sommerer, F; Ferrari, A

    2007-01-01

    Since 1997, more than 350 patients have been treated at the experimental C-12 ion therapy facility at the Gesellschaft fuer Schwerionenforschung (GSI), Darmstadt, Germany. The therapy is monitored with a dedicated positron emission tomograph, fully integrated into the treatment site. The measured beta+ activity arises from inelastic nuclear interactions between the beam particles and the nuclei of the patient's tissue. Because the monitoring is done during the irradiation, the method is called in-beam PET. The underlying principle of this monitoring is a comparison between the measured activity and a simulated one. The simulations are presently done by the PETSIM code, which is dedicated to C-12 beams. In future ion therapy centers like the Heidelberger Ionenstrahl-Therapiezentrum (HIT), Heidelberg, Germany, besides C-12 also proton, $^3$He and O-16 beams will be used for treatment, and the therapy will be monitored by means of in-beam PET. Because PETSIM is not extendable to other ions in an easy way, a code capable ...

  13. FLUKA Calculation of the Neutron Albedo Encountered at Low Earth Orbits

    CERN Document Server

    Claret, Arnaud; Combier, Natacha; Ferrari, Alfredo; Laurent, Philippe

    2014-01-01

    This paper presents Monte-Carlo simulations based on the FLUKA code aiming to calculate the contribution of the neutron albedo at a given date and altitude above the Earth chosen by the user. The main input parameters of our model are the solar modulation affecting the spectra of cosmic rays, and the date of the Earth's geomagnetic field. The results consist of a two-parameter distribution, the neutron energy and the angle to the tangent plane of the sphere containing the orbit of interest, and are provided by geographical position above the Earth at the chosen altitude. This model can be used to predict the temporal variation of the neutron flux encountered along the orbit, and thus constrain the determination of the instrumental background noise of space experiments in low Earth orbit.

  14. FLUKA studies of hadron-irradiated scintillating crystals for calorimetry at the High-Luminosity LHC

    CERN Document Server

    Quittnat, Milena Eleonore

    2015-01-01

    Calorimetry at the High-Luminosity LHC (HL-LHC) will be performed in a harsh radiation environment with high hadron fluences. The upgraded CMS electromagnetic calorimeter design and suitable scintillating materials are a focus of current research. In this paper, first results using the Monte Carlo simulation program FLUKA are compared to measurements performed with proton-irradiated LYSO, YSO and cerium fluoride crystals. Based on these results, an extrapolation to the behavior of an electromagnetic sampling calorimeter, using one of the inorganic scintillators above as an active medium, is performed for the upgraded CMS experiment at the HL-LHC. Characteristic parameters such as the induced ambient dose, fluence spectra for different particle types and the residual nuclei are studied, and the suitability of these materials for a future calorimeter is surveyed. Particular attention is given to the creation of isotopes in an LYSO-tungsten calorimeter that might contribute a prohibitive background to the measu...

  15. The FLUKA atmospheric neutrino flux calculation

    CERN Document Server

    Battistoni, G.; Montaruli, T.; Sala, P.R.

    2003-01-01

    The 3-dimensional (3-D) calculation of the atmospheric neutrino flux by means of the FLUKA Monte Carlo model is here described in all details, starting from the latest data on primary cosmic ray spectra. The importance of a 3-D calculation and of its consequences have been already debated in a previous paper. Here instead the focus is on the absolute flux. We stress the relevant aspects of the hadronic interaction model of FLUKA in the atmospheric neutrino flux calculation. This model is constructed and maintained so to provide a high degree of accuracy in the description of particle production. The accuracy achieved in the comparison with data from accelerators and cross checked with data on particle production in atmosphere certifies the reliability of shower calculation in atmosphere. The results presented here can be already used for analysis by current experiments on atmospheric neutrinos. However they represent an intermediate step towards a final release, since this calculation does not yet include the...

  16. Nuclear model developments in FLUKA for present and future applications

    Science.gov (United States)

    Cerutti, Francesco; Empl, Anton; Fedynitch, Anatoli; Ferrari, Alfredo; Garcia Alia, Ruben; Sala, Paola R.; Smirnov, George; Vlachoudis, Vasilis

    2017-09-01

    The FLUKA code [1-3] is used in research laboratories all around the world for challenging applications spanning a very wide range of energies, projectiles and targets. FLUKA is also extensively used in hadrontherapy research studies and clinical planning systems. In this paper some of the recent developments in the FLUKA nuclear physics models of relevance for very different application fields, including medical physics, are presented. A few examples are shown demonstrating the effectiveness of the upgraded code.

  17. East Area Irradiation Test Facility: Preliminary FLUKA calculations

    CERN Document Server

    Lebbos, E; Calviani, M; Gatignon, L; Glaser, M; Moll, M; CERN. Geneva. ATS Department

    2011-01-01

    In the framework of the Radiation to Electronics (R2E) mitigation project, the testing of electronic equipment in a radiation field similar to the one occurring in the LHC tunnel and shielded areas, in order to study its sensitivity to single event upsets (SEUs), is one of the main topics. Adequate irradiation test facilities are therefore required, and one installation is under consideration in the framework of the PS East Area renovation activity. FLUKA Monte Carlo calculations were performed in order to estimate the radiation field which could be obtained in a mixed-field facility using the slowly extracted 24 GeV/c proton beam from the PS. The prompt ambient dose equivalent as well as the residual ambient dose equivalent rate after operation were also studied, and the results of the simulations are presented in this report.

  18. FLUKA A multi-particle transport code (program version 2005)

    CERN Document Server

    Ferrari, A; Fassò, A; Ranft, Johannes

    2005-01-01

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner’s guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  19. FLUKA Monte Carlo Modelling of the CHARM Facility’s Test Area: Update of the Radiation Field Assessment

    CERN Document Server

    Infantino, Angelo

    2017-01-01

    The present Accelerator Note is a follow-up of the previous report CERN-ACC-NOTE-2016-12345. In the present work, the FLUKA Monte Carlo model of CERN’s CHARM facility has been improved to the most up-to-date configuration of the facility, including: new test positions, a global refinement of the FLUKA geometry, a careful review of the transport and physics parameters. Several configurations of the facility, in terms of target material and movable shielding configuration, have been simulated. The full set of results is reported in the following and can act as a reference guide to any potential user of the facility.

  20. Inter-comparison of MARS and FLUKA: Predictions on Energy Deposition in LHC IR Quadrupoles

    CERN Document Server

    Hoa, C; Cerutti, F; Ferrari, A

    2008-01-01

    Detailed modellings of the LHC insertion regions (IR) have earlier been performed to evaluate energy deposition in the IR superconducting magnets [1-4]. Proton-proton collisions at 14 TeV in the centre of mass lead to debris, depositing energy in the IR components. To evaluate uncertainties in those simulations and gain further confidence in the tools and approaches used, inter-comparison calculations have been performed with the latest versions of the FLUKA (2006.3b) [5, 6] and MARS15 [7, 8] Monte Carlo codes. These two codes, used worldwide for multi particle interaction and transport in accelerator, detector and shielding components, have been thoroughly benchmarked by the code authors and the user community (see, for example, recent [9, 10]). In the study described below, a better than 5% agreement was obtained for energy deposition calculated with these two codes - based on different independent physics models - for the identical geometry and initial conditions of a simple model representing the IR5 and ...

  1. Inter-comparison of MARS and FLUKA: Predictions on energy deposition in LHC IR quadrupoles

    International Nuclear Information System (INIS)

    Hoa, Christine; Cerutti, F.; Ferrari, A.; Mokhov, N.V.

    2008-01-01

    Detailed modelings of the LHC insertion regions (IR) have earlier been performed to evaluate energy deposition in the IR superconducting magnets [1-4]. Proton-proton collisions at 14 TeV in the centre of mass lead to debris, depositing energy in the IR components. To evaluate uncertainties in those simulations and gain further confidence in the tools and approaches used, inter-comparison calculations have been performed with the latest versions of the FLUKA (2006.3b) [5, 6] and MARS15 [7, 8] Monte Carlo codes. These two codes, used worldwide for multi particle interaction and transport in accelerator, detector and shielding components, have been thoroughly benchmarked by the code authors and the user community (see, for example, recent [9, 10]). In the study described below, a better than 5% agreement was obtained for energy deposition calculated with these two codes--based on different independent physics models--for the identical geometry and initial conditions of a simple model representing the IR5 and its first quadrupole

  2. Flair: A powerful but user friendly graphical interface for FLUKA

    International Nuclear Information System (INIS)

    Vlachoudis, V.

    2009-01-01

    FLAIR is an advanced graphical user interface for FLUKA that enables the user to start and control FLUKA jobs entirely from a GUI environment, without the need for command-line interactions. It is written entirely in Python with Tkinter, allowing easy portability across various operating systems and great programming flexibility, with a focus on serving as an Application Programming Interface (API) for FLUKA. FLAIR is an integrated development environment (IDE) for FLUKA: it not only provides means for post-processing the output, but also places strong emphasis on the creation and checking of error-free input files. It contains a fully featured editor for editing the input files in a human-readable way with syntax highlighting, without hiding the inner functionality of FLUKA from the users. It also provides means for building the executable, debugging the geometry, running the code, monitoring the status of one or many runs, inspecting the output files, post-processing the binary files (data merging) and interfacing to plotting utilities like gnuplot and PovRay for high-quality plots or photo-realistic images. The program also includes a database of selected properties of all known nuclides and their known isotopic compositions, as well as a reference database of ∼300 predefined materials together with their Sternheimer parameters. (authors)

  3. A new calculation of atmospheric neutrino flux: the FLUKA approach

    International Nuclear Information System (INIS)

    Battistoni, G.; Bloise, C.; Cavalli, D.; Ferrari, A.; Montaruli, T.; Rancati, T.; Resconi, S.; Ronga, F.; Sala, P.R.

    1999-01-01

    Preliminary results from a full 3-D calculation of atmospheric neutrino fluxes using the FLUKA interaction model are presented and compared to previous existing calculations. This effort is motivated mainly by the 3-D capability and the satisfactory degree of accuracy of the hadron-nucleus models embedded in the FLUKA code. Here we show examples of benchmarking tests of the model with cosmic ray experiment results. A comparison of our calculation of the atmospheric neutrino flux with that of the Bartol group, for E ν > 1 GeV, is presented

  4. The FLUKA study of the secondary particles fluence in the AD-Antiproton Decelerator target area

    CERN Document Server

    Calviani, M

    2014-01-01

    In this paper we present Monte Carlo FLUKA simulations [1, 2] carried out to investigate the secondary particles fluence emerging from the antiproton production target and their spatial distribution in the AD target area. The detailed quantitative analysis has been performed for different positions along the magnet dog-leg as well as after the main collimator. These results allow tuning the position of the new beam current transformers (BCT) in the target area, in order to have a precise pulse-by-pulse evaluation of the intensity of negative particles injected in the AD-ring before the deceleration phase.

  5. Fluka and thermo-mechanical studies for the CLIC main dump

    CERN Document Server

    Mereghetti, Alessio; Vlachoudis, Vasilis

    2011-01-01

    In order to best cope with the challenge of absorbing the multi-MW beam, a water beam dump at the end of the CLIC post-collision line has been proposed. The design of the dump for the Conceptual Design Report (CDR) was checked with a set of FLUKA Monte Carlo simulations, for the estimation of the peak and total power absorbed by the water and the vessel. Fluence spectra of escaping particles and activation rates of radio-nuclides were computed as well. Finally, the thermal transient behavior of the water bath and a thermo-mechanical analysis of the preliminary design of the window were carried out.

  6. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy.

    Science.gov (United States)

    Botta, F; Mairani, A; Battistoni, G; Cremonesi, M; Di Dia, A; Fassò, A; Ferrari, A; Ferrari, M; Paganelli, G; Pedroli, G; Valente, M

    2011-07-01

    The calculation of patient-specific dose distribution can be achieved by Monte Carlo simulations or by analytical methods. In this study, the FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by means of calculation of a representative parameter and comparison with reference data. Dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the one. FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. FLUKA outcomes have been compared to PENELOPE v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, comparison with data from the literature (ETRAN, GEANT4, MCNPX) has been done. Maximum percentage differences within 0.8·RCSDA and 0.9·RCSDA for monoenergetic electrons (RCSDA being the continuous slowing down approximation range) and within 0.8·X90 and 0.9·X90 for isotopes (X90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·RCSDA and 0.9·X90 for electrons and isotopes, respectively. Concerning monoenergetic electrons, within 0.8·RCSDA (where 90%-97% of the particle energy is deposited), FLUKA and PENELOPE agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The
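The DPK scoring geometry described above (energy tallied in concentric spherical shells around a point isotropic source) can be sketched as follows. The deposition events here are randomly generated stand-ins, not a real electron transport model:

```python
import numpy as np

rng = np.random.default_rng(0)

def tally_dpk(radii, energies, shell_edges):
    """Sum the deposited energy per spherical shell, given the radius of
    each energy-deposition event."""
    deposit, _ = np.histogram(radii, bins=shell_edges, weights=energies)
    return deposit

# Fake deposition events: radius of each deposition step (cm) and the
# energy deposited there (MeV). Both distributions are illustrative only.
r = rng.exponential(scale=0.05, size=10_000)
e = rng.uniform(0.0, 1e-3, size=10_000)

edges = np.linspace(0.0, 0.3, 31)  # 30 concentric shells, 0.01 cm thick
dpk = tally_dpk(r, e, edges)       # MeV deposited per shell
```

In the actual comparison, each shell tally would be normalized (e.g. per emitted energy and shell mass) before being matched against PENELOPE or literature kernels.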

  7. An integral test of FLUKA nuclear models with 160 MeV proton beams in multi-layer Faraday cups

    International Nuclear Information System (INIS)

    Rinaldi, I; Ferrari, A; Mairani, A; Parodi, K; Paganetti, H; Sala, P

    2011-01-01

    Monte Carlo (MC) codes are useful tools to simulate the complex processes of proton beam interactions with matter. In proton therapy, nuclear reactions influence the dose distribution. Therefore, the validation of nuclear models adopted in MC codes is a critical requisite for their use in this field. A simple integral test can be performed using a multi-layer Faraday cup (MLFC). This method allows separation of the nuclear and atomic interaction processes, which are responsible for secondary particle emission and the finite primary proton range, respectively. In this work, the propagation of 160 MeV protons stopping in two MLFCs made of polyethylene and copper has been simulated by the FLUKA MC code. The calculations have been performed with and without secondary electron emission and transport, as well as charge sharing in the dielectric layers. Previous results with other codes neglected those two effects. The impact of this approximation has been investigated and found to be relevant only in the proximity of the Bragg peak. Longitudinal charge distributions computed with FLUKA with both approaches have been compared with experimental data from the literature. Moreover, the contribution of different processes to the measurable signal has been addressed. A thorough analysis of the results has demonstrated that the nuclear and electromagnetic models of FLUKA reproduce the two sets of experimental data reasonably well.
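The MLFC principle this abstract relies on (protons removed by nuclear reactions deposit their charge upstream, while surviving primaries stop near the end of range) can be illustrated with a toy model. The range, straggling and reaction fraction below are rough illustrative numbers, and the uniform upstream deposition is a crude stand-in for real nuclear secondaries:

```python
import numpy as np

rng = np.random.default_rng(1)

n_protons = 100_000
range_cm, straggling_cm = 17.0, 0.15  # illustrative values for ~160 MeV protons
nuclear_fraction = 0.2                # illustrative fraction lost to nuclear reactions

# Surviving primaries stop near the range, smeared by straggling.
stop_depth = rng.normal(range_cm, straggling_cm, n_protons)
reacted = rng.random(n_protons) < nuclear_fraction
# Crude stand-in: reacted protons deposit their charge uniformly upstream.
stop_depth[reacted] = rng.uniform(0.0, range_cm, reacted.sum())

layers = np.linspace(0.0, 20.0, 41)   # 40 collecting layers, 0.5 cm each
charge_per_layer, _ = np.histogram(stop_depth, bins=layers)
```

The resulting longitudinal profile shows a low, flat "nuclear" plateau upstream and a sharp electromagnetic peak at the range, which is exactly the separation the MLFC measurement exploits.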

  8. The application of the Monte Carlo code FLUKA in radiation protection studies for the large hadron collider

    International Nuclear Information System (INIS)

    Battistoni, Giuseppe; Broggi, Francesco; Brugger, Markus

    2010-01-01

    The multi-purpose particle interaction and transport code FLUKA is an integral part of all radiation protection studies for the design and operation of the Large Hadron Collider (LHC) at CERN. It is one of the very few codes available for this type of calculation that is capable of calculating, in one and the same simulation, proton-proton and heavy ion collisions at LHC energies as well as the entire hadronic and electromagnetic particle cascade initiated by secondary particles in detectors and beam-line components, from TeV energies down to energies of thermal neutrons. The present paper reviews these capabilities of FLUKA, giving details of relevant physics models along with examples of radiation protection studies for the LHC, such as shielding studies for underground areas occupied by personnel during LHC operation and the simulation of induced radioactivity around beam loss points. An integral part of the FLUKA development is a careful benchmarking of specific models as well as of the code performance in complex, real-life applications, which is demonstrated with examples of studies relevant to radiation protection at the LHC. (author)

  9. Fluka studies of the Asynchronous Beam Dump Effects on LHC Point 6

    CERN Document Server

    Versaci, R; Goddard, B; Mereghetti, A; Schmidt, R; Vlachoudis, V; CERN. Geneva. ATS Department

    2011-01-01

    The LHC is a record-breaking machine for beam energy and intensity. An intense effort has therefore been deployed in simulating critical operational scenarios of energy deposition. Using FLUKA Monte Carlo simulations, we have investigated the effects of an asynchronous beam dump at the LHC Point 6 where beams, with a stored energy of 360 MJ, can instantaneously release up to a few J cm^-3 in the cryogenic magnets which have a quench limit of the order of the mJ cm^-3. In the present paper we will describe the simulation approach, and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump. We will then analyze the shielding provided by collimators installed in the area and discuss safety limits for the operation of the LHC.

  10. Calculation of electron and isotopes dose point kernels with FLUKA Monte Carlo code for dosimetry in nuclear medicine therapy

    CERN Document Server

    Mairani, A; Valente, M; Battistoni, G; Botta, F; Pedroli, G; Ferrari, A; Cremonesi, M; Di Dia, A; Ferrari, M; Fasso, A

    2011-01-01

    Purpose: The calculation of patient-specific dose distribution can be achieved by Monte Carlo simulations or by analytical methods. In this study, FLUKA Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, FLUKA has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by means of calculation of a representative parameter and comparison with reference data. Dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the one. Methods: FLUKA DPKs have been calculated in both water and compact bone for monoenergetic electrons (10 keV-3 MeV) and for beta emitting isotopes commonly used for therapy ((89)Sr, (90)Y, (131)I, (153)Sm, (177)Lu, (186)Re, and (188)Re). Point isotropic...

  11. Benchmarking nuclear models of FLUKA and GEANT4 for carbon ion therapy

    CERN Document Server

    Bohlen, TT; Quesada, J M; Bohlen, T T; Cerutti, F; Gudowska, I; Ferrari, A; Mairani, A

    2010-01-01

As carbon ions, at therapeutic energies, penetrate tissue, they undergo inelastic nuclear reactions and give rise to significant fluences of secondary fragments. An accurate prediction of these fluences in the patient's body is therefore necessary in order to precisely simulate the spatial dose distribution and the resulting biological effect. In this paper, the performance of the nuclear fragmentation models of the Monte Carlo transport codes FLUKA and GEANT4 is investigated in tissue-like media and for an energy regime relevant for therapeutic carbon ions. The ability of these Monte Carlo codes to reproduce experimental data of charge-changing cross sections and integral and differential yields of secondary charged fragments is evaluated. For the fragment yields, the main focus is on the consideration of experimental approximations and uncertainties such as the energy measurement by time-of-flight. For GEANT4, the hadronic models G4BinaryLightIonReaction a...

  12. Applications of FLUKA Monte Carlo code for nuclear and accelerator physics

    CERN Document Server

    Battistoni, Giuseppe; Brugger, Markus; Campanella, Mauro; Carboni, Massimo; Empl, Anton; Fasso, Alberto; Gadioli, Ettore; Cerutti, Francesco; Ferrari, Alfredo; Ferrari, Anna; Lantz, Matthias; Mairani, Andrea; Margiotta, M; Morone, Christina; Muraro, Silvia; Parodi, Katerina; Patera, Vincenzo; Pelliccioni, Maurizio; Pinsky, Lawrence; Ranft, Johannes; Roesler, Stefan; Rollet, Sofia; Sala, Paola R; Santana, Mario; Sarchiapone, Lucia; Sioli, Maximiliano; Smirnov, George; Sommerer, Florian; Theis, Christian; Trovati, Stefania; Villari, R; Vincke, Heinz; Vincke, Helmut; Vlachoudis, Vasilis; Vollaire, Joachim; Zapp, Neil

    2011-01-01

FLUKA is a general purpose Monte Carlo code capable of handling all radiation components from thermal energies (for neutrons) or 1 keV (for all other particles) to cosmic ray energies and can be applied in many different fields. Presently the code is maintained on Linux. The validity of the physical models implemented in FLUKA has been benchmarked against a variety of experimental data over a wide energy range, from accelerator data to cosmic ray showers in the Earth atmosphere. FLUKA is widely used for studies related both to basic research and to applications in particle accelerators, radiation protection and dosimetry, including the specific issue of radiation damage in space missions, radiobiology (including radiotherapy) and cosmic ray calculations. After a short description of the main features that make FLUKA valuable for these topics, the present paper summarizes some of the recent applications of the FLUKA Monte Carlo code in nuclear as well as high-energy physics. In particular it addresses such top...

  13. The FLUKA code for space applications Recent developments

    CERN Document Server

    Andersen, V; Battistoni, G; Campanella, M; Carboni, M; Cerutti, F; Empl, A; Fassò, A; Ferrari, A; Gadioli, E; Garzelli, M V; Lee, K; Ottolenghi, A; Pelliccioni, M; Pinsky, L S; Ranft, J; Roesler, S; Sala, P R; Wilson, T L

    2004-01-01

The FLUKA Monte Carlo transport code is widely used for fundamental research, radioprotection and dosimetry, hybrid nuclear energy systems and cosmic ray calculations. The validity of its physical models has been benchmarked against a variety of experimental data over a wide range of energies, ranging from accelerator data to cosmic ray showers in the earth atmosphere. The code is presently undergoing several developments in order to better fit the needs of space applications. The generation of particle spectra according to up-to-date cosmic ray data as well as the effect of the solar and geomagnetic modulation have been implemented and already successfully applied to a variety of problems. The implementation of suitable models for heavy ion nuclear interactions has reached an operational stage. At medium/high energy FLUKA is using the DPMJET model. The major task of incorporating heavy ion interactions from a few GeV/n down to the threshold for inelastic collisions is also progressing and promising results h...

  14. Procedures used during the verification of shielding and access-ways at CERN's Large Hadron Collider (LHC) using the FLUKA code

    International Nuclear Information System (INIS)

    Ferrari, A.; Huhtinen, M.; Rollet, S.; Stevenson, G.R.

    1997-01-01

Several examples will be given which illustrate the special features of the Monte-Carlo cascade simulation program FLUKA, used in the verification studies of shielding for the LHC. These include the use of different estimators for dose equivalent, region importance weighting with particle splitting, Russian Roulette and weight windows both at region boundaries and in secondary production at inelastic reactions, and decay-length biasing in order to favour secondary particle production. (author)
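The variance-reduction devices named above (Russian roulette and splitting, governed by a weight window) can be illustrated with a toy helper. The window bounds and survival rule below are generic textbook choices, not FLUKA's actual implementation.

```python
import random

def apply_weight_window(weight, w_low, w_high, rng=random.random):
    """Toy weight-window check for one particle.

    Returns a list of surviving particle weights (possibly empty).
    - Below the window: Russian roulette, surviving with probability
      weight / w_mid and carrying weight w_mid, so the expected total
      weight is conserved.
    - Above the window: splitting into n copies with equal weight shares.
    """
    w_mid = 0.5 * (w_low + w_high)
    if weight < w_low:
        return [w_mid] if rng() < weight / w_mid else []
    if weight > w_high:
        n = int(weight / w_mid) + 1
        return [weight / n] * n
    return [weight]
```

In a transport loop this would be applied at region boundaries, with per-region window bounds tuned so that most histories end up inside the window.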

  15. FLUKA Studies of the Asynchronous Beam Dump Effects on LHC Point 6

    CERN Document Server

    Versaci, R; Goddard, B; Schmidt, R; Vlachoudis, V; Mereghetti, A

    2011-01-01

The LHC is a record-breaking machine for beam energy and intensity. An intense effort has therefore been deployed in simulating critical operational scenarios of energy deposition. FLUKA is the most widely used code for this kind of simulation at CERN because of the high reliability of its results and the ease of customizing detailed simulations along hundreds of metres of beam line. We have investigated the effects of an asynchronous beam dump at LHC Point 6, where beams with a stored energy of 360 MJ can instantaneously release up to a few J cm−3 in the cryogenic magnets, which have a quench limit of the order of the mJ cm−3. In the present paper we will describe the simulation approach, and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump. We will then analyse the shielding provided by collimators installed in the area and discuss safety limits for the operation of the LHC.

  16. Benchmarking Heavy Ion Transport Codes FLUKA, HETC-HEDS, MARS15, MCNPX, and PHITS

    Energy Technology Data Exchange (ETDEWEB)

    Ronningen, Reginald Martin [Michigan State University; Remec, Igor [Oak Ridge National Laboratory; Heilbronn, Lawrence H. [University of Tennessee-Knoxville

    2013-06-07

Powerful accelerators such as spallation neutron sources, muon-collider/neutrino facilities, and rare isotope beam facilities must be designed with the consideration that they handle the beam power reliably and safely, and they must be optimized to yield maximum performance relative to their design requirements. The simulation codes used for design purposes must produce reliable results. If not, component and facility designs can become costly, have limited lifetime and usefulness, and could even be unsafe. The objective of this proposal is to assess the performance of the currently available codes PHITS, FLUKA, MARS15, MCNPX, and HETC-HEDS that could be used for design simulations involving heavy ion transport. We plan to assess their performance by performing simulations and comparing results against experimental data of benchmark quality. Quantitative knowledge of the biases and the uncertainties of the simulations is essential, as this potentially impacts the safe, reliable and cost effective design of any future radioactive ion beam facility. Further benchmarking of heavy-ion transport codes was one of the actions recommended in the Report of the 2003 RIA R&D Workshop.

  17. Benchmark of the FLUKA model of crystal channeling against the UA9-H8 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schoofs, P.; Cerutti, F.; Ferrari, A. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Smirnov, G. [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Joint Institute for Nuclear Research (JINR), Dubna (Russian Federation)

    2015-07-15

Channeling in bent crystals is increasingly considered as an option for the collimation of high-energy particle beams. The installation of crystals in the LHC has taken place during this past year and aims at demonstrating the feasibility of crystal collimation and a possible cleaning efficiency improvement. The performance of CERN collimation insertions is evaluated with the Monte Carlo code FLUKA, which is capable of simulating energy deposition in collimators as well as beam loss monitor signals. A new model of crystal channeling was developed specifically so that similar simulations can be conducted in the case of crystal-assisted collimation. In this paper, the most recent results of this model are brought forward in the framework of a joint activity inside the UA9 collaboration to benchmark the different simulation tools available. The performance of crystal STF 45, produced at INFN Ferrara, was measured at the H8 beamline at CERN in 2010 and serves as the basis for the comparison. Distributions of deflected particles are shown to be in very good agreement with experimental data. Calculated dechanneling lengths and crystal performance in the transition region between amorphous regime and volume reflection are also close to the measured ones.

  18. The FLUKA code: developments and challenges for high energy and medical applications

    Czech Academy of Sciences Publication Activity Database

    Böhlen, T.T.; Cerutti, F.; Chin, M.P.W.; Fasso, Alberto; Ferrari, A.; Ortega, P.G.; Mairani, A.; Sala, P.R.; Smirnov, G.; Vlachoudis, V.

    2014-01-01

Vol. 120, Jul (2014), pp. 211-214. ISSN 0090-3752. Institutional support: RVO:68378271. Keywords: FLUKA * radioprotection * beams * ions. Subject RIV: BL - Plasma and Gas Discharge Physics. Impact factor: 4.571, year: 2014

  19. Fluka Studies of the Asynchronous Beam Dump Effects on LHC Point 6 for a 7 TeV beam

    CERN Document Server

    VERSACI, R; GODDARD, B; MEREGHETTI, A; SCHMIDT, R; VLACHOUDIS, V

    2012-01-01

    The LHC is a record-breaking machine for beam energy and intensity. An intense effort has therefore been deployed in simulating critical operational scenarios of energy deposition. Using FLUKA Monte Carlo simulations, we have investigated the effects of an asynchronous beam dump at the LHC Point 6 where beams, with a stored energy of 360 MJ, can instantaneously release up to a few J cm^{-3} in the cryogenic magnets which have a quench limit of the order of the mJ cm^{-3}. In the present paper we will describe the simulation approach, and discuss the evaluated maximum energy release onto the superconducting magnets during an asynchronous beam dump of a 7 TeV beam. We will then analyze the shielding provided by collimators installed in the area and discuss safety limits for the operation of the LHC.

  20. The FLUKA Monte Carlo code coupled with the local effect model for biological calculations in carbon ion therapy

    Energy Technology Data Exchange (ETDEWEB)

    Mairani, A [University of Pavia, Department of Nuclear and Theoretical Physics, and INFN, via Bassi 6, 27100 Pavia (Italy); Brons, S; Parodi, K [Heidelberg Ion Beam Therapy Center and Department of Radiation Oncology, Im Neuenheimer Feld 450, 69120 Heidelberg (Germany); Cerutti, F; Ferrari, A; Sommerer, F [CERN, 1211 Geneva 23 (Switzerland); Fasso, A [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Kraemer, M; Scholz, M, E-mail: Andrea.Mairani@mi.infn.i [GSI Biophysik, Planck-Str. 1, D-64291 Darmstadt (Germany)

    2010-08-07

Clinical Monte Carlo (MC) calculations for carbon ion therapy have to provide absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum fuer Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superimposition of two opposed {sup 12}C ion fields as applied in therapeutic irradiations. The obtained results are compared with the available experimental data of CHO (Chinese hamster ovary) cell survival and the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-dose distributions in water used as input basic data in TRiP98 and the FLUKA recalculated ones. On the other hand, taking into account the differences in the physical beam modeling, the FLUKA-based biological calculations of the CHO cell survival profiles are found to be in good agreement with the experimental data as well as with the TRiP98 predictions. The developed approach that combines the MC transport/interaction capability with the same biological model as in the treatment planning system (TPS) will be used at HIT to support validation/improvement of both dose and RBE-weighted dose calculations performed by the analytical TPS.

  1. Measurement of angular distribution of neutron flux for the 6 MeV race-track microtron based pulsed neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Patil, B.J., E-mail: bjp@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India); Chavan, S.T.; Pethe, S.N.; Krishnan, R. [SAMEER, IIT Powai Campus, Mumbai 400 076 (India); Dhole, S.D., E-mail: sanjay@physics.unipune.ernet.i [Department of Physics, University of Pune, Pune 411 007 (India)

    2010-09-15

The 6 MeV race track microtron based pulsed neutron source has been designed specifically for the elemental analysis of short lived activation products, where a low neutron flux is desirable. Electrons impinge on an e-{gamma} target to generate bremsstrahlung radiation, which in turn produces neutrons via photonuclear reactions in the {gamma}-n target. These targets were optimized, and their spectra estimated, using the FLUKA code. The neutron flux was measured by activation of vanadium at different scattering angles. The angular distribution indicates that the neutron flux decreases with increasing angle and is in good agreement with the FLUKA simulation.

  2. Simulation-based surgical education.

    Science.gov (United States)

    Evgeniou, Evgenios; Loizou, Peter

    2013-09-01

The reduction in time for training at the workplace has created a challenge for the traditional apprenticeship model of training. Simulation offers the opportunity for repeated practice in a safe and controlled environment, focusing on trainees and tailored to their needs. Recent technological advances have led to the development of various simulators, which have already been introduced in surgical training. The complexity and fidelity of the available simulators vary; therefore, depending on our resources, we should select the appropriate simulator for the task or skill we want to teach. Educational theory informs us about the importance of context in professional learning. Simulation should therefore recreate the clinical environment and its complexity. Contemporary approaches to simulation have introduced novel ideas for teaching teamwork, communication skills and professionalism. In order for simulation-based training to be successful, simulators have to be validated appropriately and integrated in a training curriculum. Within a surgical curriculum, trainees should have protected time for simulation-based training, under appropriate supervision. Simulation-based surgical education should allow the appropriate practice of technical skills without ignoring the clinical context and must strike an adequate balance between the simulation environment and simulators. © 2012 The Authors. ANZ Journal of Surgery © 2012 Royal Australasian College of Surgeons.

  3. An integral test of FLUKA nuclear models with 160 MeV proton beams in multi-layer Faraday cups

    CERN Document Server

    Rinaldi, I; Parodi, K; Ferrari, A; Sala, P; Mairani, A

    2011-01-01

    Monte Carlo (MC) codes are useful tools to simulate the complex processes of proton beam interactions with matter. In proton therapy, nuclear reactions influence the dose distribution. Therefore, the validation of nuclear models adopted in MC codes is a critical requisite for their use in this field. A simple integral test can be performed using a multi-layer Faraday cup (MLFC). This method allows separation of the nuclear and atomic interaction processes, which are responsible for secondary particle emission and the finite primary proton range, respectively. In this work, the propagation of 160 MeV protons stopping in two MLFCs made of polyethylene and copper has been simulated by the FLUKA MC code. The calculations have been performed with and without secondary electron emission and transport, as well as charge sharing in the dielectric layers. Previous results with other codes neglected those two effects. The impact of this approximation has been investigated and found to be relevant only in the proximity ...

  4. Concrete shielding of neutron radiations of plasma focus and dose examination by FLUKA

    Science.gov (United States)

    Nemati, M. J.; Amrollahi, R.; Habibi, M.

    2013-07-01

Plasma Focus (PF) is among those devices which are used in plasma investigations, but this device produces some dangerous radiation after each shot, which generates a hazardous area for its operators; therefore, it is better for the operators to stay as far as possible from the area where the plasma focus is placed. In this paper FLUKA Monte Carlo simulation has been used to calculate the radiation produced by a 4 kJ Amirkabir plasma focus device through different concrete shielding concepts with various thicknesses (square, labyrinth and cave concepts). The neutron yield of the Amirkabir plasma focus at varying deuterium pressure (3-9 torr) and two charging voltages (11.5 and 13.5 kV) is (2.25 ± 0.2) × 10^8 neutrons/shot and (2.88 ± 0.29) × 10^8 neutrons/shot of 2.45 MeV, respectively. The most effective shield for the plasma focus device among these geometries is the labyrinth concept on four sides and the top with 20 cm concrete.

  5. Radiation protection studies for medical particle accelerators using FLUKA Monte Carlo code

    International Nuclear Information System (INIS)

    Infantino, Angelo; Mostacci, Domiziano; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Marengo, Mario

    2017-01-01

Radiation protection (RP) in the use of medical cyclotrons involves many aspects both in the routine use and for the decommissioning of a site. Guidelines for site planning and installation, as well as for RP assessment, are given in international documents; however, the latter typically offer analytic methods of calculation of shielding and materials activation, in approximate or idealised geometry set-ups. The availability of Monte Carlo (MC) codes with accurate up-to-date libraries for transport and interaction of neutrons and charged particles at energies below 250 MeV, together with the continuously increasing power of modern computers, makes the systematic use of simulations with realistic geometries possible, yielding equipment and site-specific evaluation of the source terms, shielding requirements and all quantities relevant to RP at the same time. In this work, the well-known FLUKA MC code was used to simulate different aspects of RP in the use of biomedical accelerators, particularly for the production of medical radioisotopes. In the context of the Young Professionals Award, held at the IRPA 14 conference, only a part of the complete work is presented. In particular, the simulation of the GE PETtrace cyclotron (16.5 MeV) installed at S. Orsola-Malpighi University Hospital evaluated the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and the vault walls; the activation of the ambient air, in particular the production of ⁴¹Ar. The simulations were validated, in terms of physical and transport parameters to be used at the energy range of interest, through an extensive measurement campaign of the neutron environmental dose equivalent using a rem-counter and TLD dosemeters. The validated model was then used in the design and the licensing request of a new Positron Emission Tomography facility. (authors)

  6. The FLUKA Code: Developments and Challenges for High Energy and Medical Applications

    CERN Document Server

    Böhlen, T T; Chin, M P W; Fassò, A; Ferrari, A; Ortega, P G; Mairani, A; Sala, P R; Smirnov, G; Vlachoudis, V

    2014-01-01

    The FLUKA Monte Carlo code is used extensively at CERN for all beam-machine interactions, radioprotection calculations and facility design of forthcoming projects. Such needs require the code to be consistently reliable over the entire energy range (from MeV to TeV) for all projectiles (full suite of elementary particles and heavy ions). Outside CERN, among various applications worldwide, FLUKA serves as a core tool for the HIT and CNAO hadron-therapy facilities in Europe. Therefore, medical applications further impose stringent requirements in terms of reliability and predictive power, which demands constant refinement of sophisticated nuclear models and continuous code improvement. Some of the latest developments implemented in FLUKA are presented in this paper, with particular emphasis on issues and concerns pertaining to CERN and medical applications.

  7. Applications of the LAHET simulation code to relativistic heavy ion detectors

    International Nuclear Information System (INIS)

    Waters, L.S.; Gavron, A.

    1991-01-01

    The Los Alamos High Energy Transport (LAHET) simulation code has been applied to test beam data from the lead/scintillator Participant Calorimeter of BNL AGS experiment E814. The LAHET code treats hadronic interactions with the LANL version of the Oak Ridge code HETC. LAHET has now been expanded to handle hadrons with kinetic energies greater than 5 GeV with the FLUKA code, while HETC is used exclusively below 2.0 GeV. FLUKA is phased in linearly between 2.0 and 5.0 GeV. Transport of electrons and photons is done with EGS4, and an interface to the Los Alamos HMCNP3B library based code is provided to analyze neutrons with kinetic energies less than 20 MeV. Excellent agreement is found between the test data and simulation, and results for 2.46 GeV/c protons and pions are illustrated in this article

  8. Applications of the lahet simulation code to relativistic heavy ion detectors

    Energy Technology Data Exchange (ETDEWEB)

    Waters, L.; Gavron, A. [Los Alamos National Lab., NM (United States)

    1991-12-31

    The Los Alamos High Energy Transport (LAHET) simulation code has been applied to test beam data from the lead/scintillator Participant Calorimeter of BNL AGS experiment E814. The LAHET code treats hadronic interactions with the LANL version of the Oak Ridge code HETC. LAHET has now been expanded to handle hadrons with kinetic energies greater than 5 GeV with the FLUKA code, while HETC is used exclusively below 2.0 GeV. FLUKA is phased in linearly between 2.0 and 5.0 GeV. Transport of electrons and photons is done with EGS4, and an interface to the Los Alamos HMCNP3B library based code is provided to analyze neutrons with kinetic energies less than 20 MeV. Excellent agreement is found between the test data and simulation, and results for 2.46 GeV/c protons and pions are illustrated in this article.
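The energy-dependent hand-over between HETC and FLUKA described in this record amounts to a linear ramp of the routing probability between the two generators. A minimal sketch (the function name and event-routing rule are illustrative, not LAHET's actual interface):

```python
import random

def pick_generator(e_kin_gev, rng=random.random):
    """Route a hadron to an event generator by kinetic energy:
    HETC exclusively below 2 GeV, FLUKA exclusively above 5 GeV,
    and a linear mix in between (the fraction of events sent to
    FLUKA rises linearly from 0 at 2 GeV to 1 at 5 GeV)."""
    if e_kin_gev <= 2.0:
        return "HETC"
    if e_kin_gev >= 5.0:
        return "FLUKA"
    p_fluka = (e_kin_gev - 2.0) / 3.0
    return "FLUKA" if rng() < p_fluka else "HETC"
```

Sampling the choice per event, rather than switching abruptly at one threshold, avoids a discontinuity in simulated observables at the hand-over energy.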

  9. Simulation Package based on Placet

    CERN Document Server

    D'Amico, T E; Leros, Nicolas; Schulte, Daniel

    2001-01-01

The program PLACET is used to simulate transverse and longitudinal beam effects in the main linac, the drive-beam accelerator and the drive-beam decelerators of CLIC, as well as in the linac of CTF3. It provides different models of accelerating and decelerating structures, linear optics and thin multipoles. Several methods of beam-based alignment, including emittance tuning bumps and feedback, and different failure modes can be simulated. An interface to the beam-beam simulation code GUINEA-PIG exists. Currently, interfaces to MAD and TRANSPORT are under development and an extension to transfer lines and bunch compressors is also being made. In the future, the simulations will need to be performed by many users, which requires a simplified user interface. The paper describes the status of PLACET and plans for the future.

  10. The Fluka Linebuilder and Element Database: Tools for Building Complex Models of Accelerators Beam Lines

    CERN Document Server

    Mereghetti, A; Cerutti, F; Versaci, R; Vlachoudis, V

    2012-01-01

Extended FLUKA models of accelerator beam lines can be extremely complex: cumbersome to manipulate, poorly versatile and prone to mismatched positioning. We developed a framework capable of creating the FLUKA model of an arbitrary portion of a given accelerator, starting from the optics configuration and a few other pieces of information provided by the user. The framework includes a builder (LineBuilder), an element database and a series of configuration and analysis scripts. The LineBuilder is a Python program aimed at dynamically assembling complex FLUKA models of accelerator beam lines: positions, magnetic fields and scorings are automatically set up, and geometry details such as apertures of collimators, tilting and misalignment of elements, beam pipes and tunnel geometries can be entered at the user's will. The element database (FEDB) is a collection of detailed FLUKA geometry models of machine elements. This framework has been widely used for recent LHC and SPS beam-machine interaction studies at CERN, and led to a dra...
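The core idea of such a line builder can be sketched as a walk over an optics sequence that places prototype elements from a database at cumulative longitudinal positions. All names, lengths and fields below are illustrative assumptions, not the actual LineBuilder/FEDB API.

```python
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    length_m: float   # longitudinal extent
    kind: str         # e.g. "dipole", "quadrupole", "collimator"

# Hypothetical element database keyed by optics name.
FEDB = {
    "MB":  Element("MB", 14.3, "dipole"),
    "MQ":  Element("MQ", 3.1, "quadrupole"),
    "TCP": Element("TCP", 1.0, "collimator"),
}

def build_line(sequence):
    """Place each element of an optics sequence end to end.

    Returns (name, kind, s_start, s_end) tuples, with s the cumulative
    longitudinal coordinate along the beam line."""
    placements, s = [], 0.0
    for key in sequence:
        el = FEDB[key]
        placements.append((el.name, el.kind, s, s + el.length_m))
        s += el.length_m
    return placements

line = build_line(["MQ", "MB", "TCP", "MB"])
```

A real builder would additionally emit geometry bodies, magnetic-field assignments and scoring cards for each placement; the bookkeeping of positions is the part sketched here.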

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

Resonance – Journal of Science Education; Volume 6; Issue 3. Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article Volume 6 Issue 3 March 2001 pp 46-54.

  12. Inversion based on computational simulations

    International Nuclear Information System (INIS)

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-01-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal
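Adjoint differentiation as described here can be demonstrated on a toy time-stepped simulation: one forward sweep stores the states, one backward sweep accumulates the gradient, at a cost independent of the number of parameters. The decay model and least-squares objective below are invented purely for illustration.

```python
def simulate(p, x0=1.0, dt=0.1, steps=50):
    """Forward Euler for dx/dt = -p*x; returns the state trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] * (1.0 - p * dt))
    return xs

def objective_and_gradient(p, target=0.5, x0=1.0, dt=0.1, steps=50):
    """Adjoint (reverse-mode) gradient of J = sum_t (x_t - target)^2
    with respect to the parameter p."""
    xs = simulate(p, x0, dt, steps)
    J = sum((x - target) ** 2 for x in xs)
    lam = 0.0     # adjoint variable: dJ/dx_t, accumulated backwards
    dJdp = 0.0
    for t in range(steps, 0, -1):
        lam += 2.0 * (xs[t] - target)     # direct dependence of J on x_t
        dJdp += lam * (-dt * xs[t - 1])   # direct dx_t/dp = -dt * x_{t-1}
        lam *= (1.0 - p * dt)             # chain rule: dx_t/dx_{t-1}
    return J, dJdp
```

With a vector of parameters the backward sweep still costs one pass, which is what makes the approach viable for the large-scale optimization problems discussed in the abstract.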

  13. Comparison of inclusive particle production in 14.6 GeV/c proton-nucleus collisions with simulation

    International Nuclear Information System (INIS)

    Jaffe, D.E.; Lo, K.H.; Comfort, J.R.; Sivertz, M.

    2006-01-01

    Inclusive charged pion, kaon, proton and deuteron production in 14.6 GeV/c proton-nucleus collisions measured by BNL experiment E802 is compared with results from the GEANT3, GEANT4 and FLUKA simulation packages. The FLUKA package is found to have the best overall agreement

  14. The FLUKA Monte Carlo, Non-Perturbative QCD and Cosmic Ray Cascades

    International Nuclear Information System (INIS)

    Battistoni, G.

    2005-01-01

The FLUKA Monte Carlo code, presently used in cosmic ray physics, contains packages to sample soft hadronic processes which are built according to the Dual Parton Model. This is a phenomenological model capable of reproducing many of the features of hadronic collisions in the non-perturbative QCD regime. The basic principles of the model are summarized and, as an example, the associated Lambda-K production is discussed. This is a process which has some relevance for the calculation of atmospheric neutrino fluxes.

  15. Space Applications of the FLUKA Monte-Carlo Code: Lunar and Planetary Exploration

    International Nuclear Information System (INIS)

    Lee, Kerry; Wilson, Thomas; Zapp, Neal; Pinsky, Lawrence

    2007-01-01

    NASA has recognized the need for making additional heavy-ion collision measurements at the U.S. Brookhaven National Laboratory in order to support further improvement of several particle physics transport-code models for space exploration applications. FLUKA has been identified as one of these codes and we will review the nature and status of this investigation as it relates to high-energy heavy-ion physics

  16. SU-E-T-323: The FLUKA Monte Carlo Code in Ion Beam Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, I [Heidelberg University Hospital (Germany); Ludwig-Maximilian University Munich (Germany)

    2014-06-01

Purpose: Monte Carlo (MC) codes are increasingly used in the ion beam therapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code demands accurate and reliable physical models for the transport and the interaction of all components of the mixed radiation field. This contribution will give an overview of the recent developments in the FLUKA code oriented to its application in ion beam therapy. Methods: FLUKA is a general purpose MC code which allows the calculations of particle transport and interactions with matter, covering an extended range of applications. The user can manage the code through a graphical interface (FLAIR) developed using the Python programming language. Results: This contribution will present recent refinements in the description of the ionization processes and comparisons between FLUKA results and experimental data of ion beam therapy facilities. Moreover, several validations of the largely improved FLUKA nuclear models for imaging application to treatment monitoring will be shown. The complex calculation of prompt gamma ray emission compares favorably with experimental data and can be considered adequate for the intended applications. New features in the modeling of proton induced nuclear interactions also provide reliable cross section predictions for the production of radionuclides. Of great interest for the community are the developments introduced in FLAIR. The most recent efforts concern the capability of importing computed-tomography images in order to build automatically patient geometries and the implementation of different types of existing positron-emission-tomography scanner devices for imaging applications. Conclusion: The FLUKA code has already been chosen as the reference MC code in many ion beam therapy centers, and is being continuously improved in order to match the needs of ion beam therapy applications.
Parts of this work have been supported by the European

  17. Design and spectrum calculation of 4H-SiC thermal neutron detectors using FLUKA and TCAD

    Science.gov (United States)

    Huang, Haili; Tang, Xiaoyan; Guo, Hui; Zhang, Yimen; Zhang, Yimeng; Zhang, Yuming

    2016-10-01

    SiC is a promising material for neutron detection in a harsh environment due to its wide band gap, high displacement threshold energy and high thermal conductivity. To increase the detection efficiency of SiC, a converter such as 6LiF or 10B is introduced. In this paper, pulse-height spectra of a PIN diode with a 6LiF conversion layer exposed to thermal neutrons (0.026 eV) are calculated using TCAD and Monte Carlo simulations. First, the conversion efficiency of thermal neutrons with respect to the thickness of 6LiF was calculated using the FLUKA code, and a maximal efficiency of approximately 5% was achieved. Next, the energy distributions of both 3H and α particles induced by the 6LiF reaction are analyzed according to different ranges of emission angle. Subsequently, transient pulses generated by the bombardment of single 3H or α-particles are calculated. Finally, pulse-height spectra are obtained with a detector efficiency of 4.53%. Comparisons of the simulated result with the experimental data are also presented, and the calculated spectrum shows an acceptable similarity to the experimental data. This work would be useful for radiation-sensing applications, especially for SiC detector design.
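
    The thickness trade-off behind the ~5% optimum described above (a thicker 6LiF layer captures more neutrons, but the charged reaction products then fail to escape into the SiC) can be sketched with a simple 1-D analytical model. The cross section, density, and triton range below are illustrative textbook-style values, not numbers taken from the paper:

```python
import math

# Assumed constants (illustrative values, not from the paper)
SIGMA_LI6 = 940e-24        # 6Li(n,a)3H thermal cross section, cm^2
RHO = 2.54                 # 6LiF density, g/cm^3
M = 25.94                  # 6LiF molar mass, g/mol (enriched in 6Li)
N_A = 6.022e23
N_LI = RHO * N_A / M       # 6Li atoms per cm^3 (one Li per formula unit)
SIGMA = N_LI * SIGMA_LI6   # macroscopic capture cross section, 1/cm
R_TRITON = 33e-4           # ~33 um triton range in LiF, cm (assumed)

def efficiency(t_cm, steps=2000):
    """Fraction of normally incident thermal neutrons that produce a
    triton which escapes the back face of a 6LiF layer of thickness t.
    1-D model: capture at depth x, isotropic emission, straight-line range."""
    dx = t_cm / steps
    eff = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx
        capture = SIGMA * math.exp(-SIGMA * x) * dx  # capture in [x, x+dx)
        d = t_cm - x                                  # distance to back face
        escape = 0.5 * (1.0 - d / R_TRITON) if d < R_TRITON else 0.0
        eff += capture * escape
    return eff

# Scan thickness (in um) to find the optimum of the trade-off
best = max((efficiency(t * 1e-4), t) for t in range(1, 101))
```

Even this crude model reproduces an optimum of a few percent near a thickness comparable to the triton range, consistent with the order of magnitude the abstract reports.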

  18. Production of secondary particles and nuclei in cosmic rays collisions with the interstellar gas using the FLUKA code

    CERN Document Server

    Mazziotta, M N; Ferrari, A; Gaggero, D; Loparco, F; Sala, P R

    2016-01-01

    The measured fluxes of secondary particles produced by the interactions of Cosmic Rays (CRs) with the astronomical environment play a crucial role in understanding the physics of CR transport. In this work we present a comprehensive calculation of the secondary hadron, lepton, gamma-ray and neutrino yields produced by the inelastic interactions between several species of stable or long-lived cosmic-ray projectiles (p, D, T, 3He, 4He, 6Li, 7Li, 9Be, 10Be, 10B, 11B, 12C, 13C, 14C, 14N, 15N, 16O, 17O, 18O, 20Ne, 24Mg and 28Si) and different target gas nuclei (p, 4He, 12C, 14N, 16O, 20Ne, 24Mg, 28Si and 40Ar). The yields are calculated using FLUKA, a simulation package designed to compute the energy distributions of secondary products with high accuracy over a wide energy range. The present results provide, for the first time, a complete and self-consistent set of all the relevant inclusive cross sections regarding the whole spectrum of secondary products in nuclear collisions. We cover, for the projectiles, a ki...

  19. Measurements and simulations of the radiation exposure to aircraft crew workplaces due to cosmic radiation in the atmosphere

    International Nuclear Information System (INIS)

    Beck, P.; Latocha, M.; Dorman, L.; Pelliccioni, M.; Rollet, S.

    2007-01-01

    As required by the European Directive 96/29/Euratom, radiation exposure due to natural ionizing radiation has to be taken into account at workplaces if the effective dose could become more than 1 mSv per year. An example of workers concerned by this directive is aircraft crew, due to cosmic radiation exposure in the atmosphere. Extensive measurement campaigns on board aircraft have been carried out to assess ambient dose equivalent. A consortium of European dosimetry institutes within EURADOS WG5 summarized experimental data and results of calculations, together with detailed descriptions of the methods for measurements and calculations. The radiation protection quantity of interest is the effective dose, E (ISO). The comparison of results by measurements and calculations is done in terms of the operational quantity ambient dose equivalent, H*(10). This paper gives an overview of the EURADOS Aircraft Crew In-Flight Database and it presents a new empirical model describing fitting functions for these data. Furthermore, it describes numerical simulations performed with the Monte Carlo code FLUKA-2005 using an updated version of the cosmic radiation primary spectra. The ratio between ambient dose equivalent and effective dose at commercial flight altitudes, calculated with FLUKA-2005, is discussed. Finally, it presents the aviation dosimetry model AVIDOS based on FLUKA-2005 simulations for routine dose assessment. The code has been developed by Austrian Research Centers (ARC) for public usage (http://avidos.healthphysics.at). (authors)

  20. RELAP5 based engineering simulator

    International Nuclear Information System (INIS)

    Charlton, T.R.; Laats, E.T.; Burtt, J.D.

    1990-01-01

    The INEL Engineering Simulation Center was established in 1988 to provide a modern, flexible, state-of-the-art simulation facility. This facility and two of the major projects which are part of the simulation center, the Advanced Test Reactor (ATR) engineering simulator project and the Experimental Breeder Reactor II (EBR-II) advanced reactor control system, have been the subject of several papers in the past few years. Two components of the ATR engineering simulator project, RELAP5 and the Nuclear Plant Analyzer (NPA), have recently been improved significantly. This paper will present an overview of the INEL Engineering Simulation Center, and discuss the RELAP5/MOD3 and NPA/MOD1 codes, specifically how they are being used at the INEL Engineering Simulation Center. It will provide an update on the modifications to these two codes and their application to the ATR engineering simulator project, as well as a discussion on the reactor system representation, control system modeling, two-phase flow and heat transfer modeling. It will also discuss how these two codes are providing desktop, stand-alone reactor simulation. 12 refs., 2 figs

  2. Polarized positrons for the ILC. Update on simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushakov, A.; Adeyemi, O.S.; Moortgat-Pick, G. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Staufenbiel, F.; Riemann, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2012-02-15

    To achieve the extremely high luminosity for colliding electron-positron beams at the future International Linear Collider [1] (ILC), an undulator-based source with an approximately 230-meter helical undulator and a thin titanium-alloy target rim rotating with a tangential velocity of about 100 meters per second is foreseen. The very high density of heat deposited in the target has to be analyzed carefully. The energy deposited by the photon beam in the target has been calculated with FLUKA. The resulting stress in the target material after one bunch train has been simulated with ANSYS. (orig.)

  3. Inter-comparison of Dose Distributions Calculated by FLUKA, GEANT4, MCNP, and PHITS for Proton Therapy

    Science.gov (United States)

    Yang, Zi-Yi; Tsai, Pi-En; Lee, Shao-Chun; Liu, Yen-Chiang; Chen, Chin-Cheng; Sato, Tatsuhiko; Sheu, Rong-Jiun

    2017-09-01

    The dose distributions from proton pencil beam scanning were calculated by FLUKA, GEANT4, MCNP, and PHITS, in order to investigate their applicability in proton radiotherapy. The first studied case was the integrated depth dose curves (IDDCs) from a 100-MeV and a 226-MeV proton pencil beam impinging on a water phantom. The calculated IDDCs agree with each other as long as each code employs 75 eV for the ionization potential of water. The second case considered a condition similar to the first case but with proton energies in a Gaussian distribution. The comparison to the measurement indicates that the inter-code differences might be due not only to different stopping powers but also to the nuclear physics models. How the physics parameter settings affect the computation time is also discussed. In the third case, the applicability of each code for pencil beam scanning was confirmed by delivering a uniform volumetric dose distribution based on the treatment plan, and the results showed general agreement among the codes, the treatment plan, and the measurement, except that some deviations were found in the penumbra region. This study has demonstrated that the selected codes are all capable of performing dose calculations for therapeutic scanning proton beams with proper physics settings.
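
    The sensitivity to the 75 eV ionization potential noted above can be illustrated with the plain Bethe stopping-power formula (no shell or density corrections) and a CSDA range integration. This is a sketch, not a reimplementation of any of the benchmarked codes; the constants are standard values for water:

```python
import math

ME_C2 = 0.511       # electron rest energy, MeV
MP_C2 = 938.272     # proton rest energy, MeV
K = 0.307075        # Bethe constant 4*pi*N_A*re^2*me*c^2, MeV cm^2/g
Z_A = 0.5551        # <Z/A> for water
RHO = 1.0           # density of water, g/cm^3

def stopping_power(E, I_eV):
    """Electronic stopping power of water for a proton of kinetic energy E
    (MeV), plain Bethe formula, in MeV/cm."""
    gamma = 1.0 + E / MP_C2
    beta2 = 1.0 - 1.0 / gamma**2
    arg = 2.0 * ME_C2 * 1e6 * beta2 * gamma**2 / I_eV   # both sides in eV
    return K * Z_A * RHO / beta2 * (math.log(arg) - beta2)

def csda_range(E0, I_eV, n=5000):
    """CSDA range in cm, integrating dE/S from 1 MeV up to E0
    (the sub-MeV residual range is ~20 um and is neglected)."""
    E_min = 1.0
    dE = (E0 - E_min) / n
    return sum(dE / stopping_power(E_min + (i + 0.5) * dE, I_eV)
               for i in range(n))

r75 = csda_range(100.0, 75.0)   # ~7.7 cm, close to tabulated values
r78 = csda_range(100.0, 78.0)   # a larger I lowers S and stretches the range
```

A 3 eV change in I shifts the 100 MeV proton range by roughly half a percent, i.e. a fraction of a millimeter at the Bragg peak, which is why the abstract singles out the I-value as the prerequisite for inter-code agreement.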

  4. Simulation-based medical teaching and learning

    Directory of Open Access Journals (Sweden)

    Abdulmohsen H Al-Elq

    2010-01-01

    One of the most important steps in curriculum development is the introduction of simulation-based medical teaching and learning. Simulation is a generic term that refers to an artificial representation of a real-world process used to achieve educational goals through experiential learning. Simulation-based medical education is defined as any educational activity that utilizes simulation aides to replicate clinical scenarios. Although medical simulation is relatively new, simulation has been used for a long time in other high-risk professions such as aviation. Medical simulation allows the acquisition of clinical skills through deliberate practice rather than an apprentice style of learning. Simulation tools serve as an alternative to real patients. A trainee can make mistakes and learn from them without the fear of harming the patient. There are different types and classifications of simulators, and their costs vary according to the degree of their resemblance to reality, or 'fidelity'. Simulation-based learning is expensive. However, it is cost-effective if utilized properly. Medical simulation has been found to enhance clinical competence at the undergraduate and postgraduate levels. It has also been found to have many advantages that can improve patient safety and reduce health care costs through the improvement of the medical provider's competencies. The objective of this narrative review article is to highlight the importance of simulation as a new teaching method in undergraduate and postgraduate education.

  5. Component-based framework for subsurface simulations

    International Nuclear Information System (INIS)

    Palmer, B J; Fang, Yilin; Hammond, Glenn; Gurumoorthi, Vidhya

    2007-01-01

    Simulations in the subsurface environment represent a broad range of phenomena covering an equally broad range of scales. Developing modelling capabilities that can integrate models representing different phenomena acting at different scales presents formidable challenges from both the algorithmic and computer science perspectives. This paper will describe the development of an integrated framework that will be used to combine different models into a single simulation. Initial work has focused on creating two frameworks, one for performing smoothed particle hydrodynamics (SPH) simulations of fluid systems, the other for performing grid-based continuum simulations of reactive subsurface flow. The SPH framework is based on a parallel code developed for doing pore-scale simulations; the continuum grid-based framework is based on the STOMP (Subsurface Transport Over Multiple Phases) code developed at PNNL. Future work will focus on combining the frameworks together to perform multiscale, multiphysics simulations of reactive subsurface flow
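
    As a sketch of what the SPH side of such a framework computes, the standard cubic-spline smoothing kernel and the density summation look like this. This is a minimal serial illustration of textbook SPH, unrelated to the actual parallel PNNL code:

```python
import math

def cubic_spline_W(r, h):
    """Standard cubic-spline SPH smoothing kernel in 3-D, with compact
    support of radius 2h and unit normalization."""
    q = r / h
    sigma = 1.0 / (math.pi * h**3)   # 3-D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def density(i, positions, masses, h):
    """SPH density estimate at particle i:
    rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    xi, yi, zi = positions[i]
    rho = 0.0
    for (x, y, z), m in zip(positions, masses):
        r = math.sqrt((x - xi)**2 + (y - yi)**2 + (z - zi)**2)
        rho += m * cubic_spline_W(r, h)
    return rho
```

Pressure and viscosity forces follow the same pattern, replacing the kernel by its gradient; the compact support is what makes neighbour-list parallelization natural.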

  6. Comparison of Radiation Transport Codes, HZETRN, HETC and FLUKA, Using the 1956 Webber SPE Spectrum

    Science.gov (United States)

    Heinbockel, John H.; Slaba, Tony C.; Blattnig, Steve R.; Tripathi, Ram K.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Clowdsley, Martha S.; hide

    2009-01-01

    Protection of astronauts and instrumentation from galactic cosmic rays (GCR) and solar particle events (SPE) in the harsh environment of space is of prime importance in the design of personal shielding, spacecraft, and mission planning. Early entry of radiation constraints into the design process enables optimal shielding strategies, but demands efficient and accurate tools that can be used by design engineers in every phase of an evolving space project. The radiation transport code, HZETRN, is an efficient tool for analyzing the shielding effectiveness of materials exposed to space radiation. In this paper, HZETRN is compared to the Monte Carlo codes HETC-HEDS and FLUKA, for a shield/target configuration comprised of a 20 g/cm^2 aluminum slab in front of a 30 g/cm^2 slab of water exposed to the February 1956 SPE, as modeled by the Webber spectrum. Neutron and proton fluence spectra, as well as dose and dose equivalent values, are compared at various depths in the water target. This study shows that there are many regions where HZETRN agrees with both HETC-HEDS and FLUKA for this shield/target configuration and the SPE environment. However, there are also regions where there are appreciable differences between the three computer codes.
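
    The dose and dose-equivalent values compared above are, in essence, fluence spectra folded with a mass stopping power and a quality factor. A minimal sketch of that folding follows; the one-bin numbers are illustrative (roughly a 100 MeV proton fluence), not values from the paper:

```python
MEV_PER_G_TO_GY = 1.602e-10   # 1 MeV deposited per gram = 1.602e-10 Gy

def dose_and_dose_equivalent(fluence, stop_power, quality):
    """Thin-slab estimate at one depth: per energy bin, fluence
    [particles/cm^2], mass stopping power S/rho [MeV cm^2/g], and an
    (assumed) quality factor Q.  D = sum phi*S ; H = sum phi*S*Q."""
    d = sum(p * s for p, s in zip(fluence, stop_power)) * MEV_PER_G_TO_GY
    h = sum(p * s * q
            for p, s, q in zip(fluence, stop_power, quality)) * MEV_PER_G_TO_GY
    return d, h   # gray, sievert

# One-bin example: 1e9 protons/cm^2 with S/rho ~ 7.3 MeV cm^2/g and Q ~ 1
d, h = dose_and_dose_equivalent([1e9], [7.3], [1.0])
```

The transport codes differ precisely in how they propagate the fluence spectra to each depth; once the spectra are in hand, the dosimetric folding above is common to all of them.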

  7. A heterogeneous graph-based recommendation simulator

    Energy Technology Data Exchange (ETDEWEB)

    Yeonchan, Ahn [Seoul National University; Sungchan, Park [Seoul National University; Lee, Matt Sangkeun [ORNL; Sang-goo, Lee [Seoul National University

    2013-01-01

    Heterogeneous graph-based recommendation frameworks have flexibility in that they can incorporate various recommendation algorithms and various kinds of information to produce better results. In this demonstration, we present a heterogeneous graph-based recommendation simulator which enables participants to experience the flexibility of a heterogeneous graph-based recommendation method. With our system, participants can simulate various recommendation semantics by expressing the semantics via meaningful paths like User-Movie-User-Movie. The simulator then returns the recommendation results on the fly, based on the user-customized semantics, using a fast Monte Carlo algorithm.
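
    A Monte Carlo evaluation of such a meta-path can be sketched in a few lines: random walks follow the path one edge type at a time, and visit counts of the terminal nodes rank the recommendations. The toy graph and function names below are invented for illustration and are not part of the demonstrated system:

```python
import random
from collections import Counter

def recommend(start_user, user_movies, movie_users, walks=2000, seed=0):
    """Monte Carlo estimate of a User-Movie-User-Movie meta-path:
    each walk follows watched edges, and the visit counts of the final
    movies rank the recommendations (already-seen movies are dropped)."""
    rng = random.Random(seed)
    seen = set(user_movies[start_user])
    hits = Counter()
    for _ in range(walks):
        m1 = rng.choice(user_movies[start_user])   # User  -> Movie
        u2 = rng.choice(movie_users[m1])           # Movie -> User
        m2 = rng.choice(user_movies[u2])           # User  -> Movie
        if m2 not in seen:
            hits[m2] += 1
    return [m for m, _ in hits.most_common()]

# Toy graph: alice and bob share m1, and bob also watched m2
user_movies = {"alice": ["m1"], "bob": ["m1", "m2"], "carol": ["m3"]}
movie_users = {"m1": ["alice", "bob"], "m2": ["bob"], "m3": ["carol"]}
recs = recommend("alice", user_movies, movie_users)
```

Changing the meta-path (say, to User-Movie-Genre-Movie) only changes which edge lists each hop samples from, which is exactly the flexibility the abstract highlights.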

  8. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  9. Scenario-based table top simulations

    DEFF Research Database (Denmark)

    Broberg, Ole; Edwards, Kasper; Nielsen, J.

    2012-01-01

    This study developed and tested a scenario-based table top simulation method in a user-driven innovation setting. A team of researchers worked together with a user group of five medical staff members from the existing clinic. Table top simulations of a new clinic were carried out in a simple model...

  10. Simulation-based certification for cataract surgery

    DEFF Research Database (Denmark)

    Thomsen, Ann Sofia Skou; Kiilgaard, Jens Folke; Kjaerbo, Hadi

    2015-01-01

    PURPOSE: To evaluate the EyeSi(™) simulator in regard to assessing competence in cataract surgery. The primary objective was to explore all simulator metrics to establish a proficiency-based test with solid evidence. The secondary objective was to evaluate whether the skill assessment was specific...

  11. The FLUKA code for application of Monte Carlo methods to promote high precision ion beam therapy

    CERN Document Server

    Parodi, K; Cerutti, F; Ferrari, A; Mairani, A; Paganetti, H; Sommerer, F

    2010-01-01

    Monte Carlo (MC) methods are increasingly being utilized to support several aspects of commissioning and clinical operation of ion beam therapy facilities. In this contribution two emerging areas of MC applications are outlined. The value of MC modeling to promote accurate treatment planning is addressed via examples of application of the FLUKA code to proton and carbon ion therapy at the Heidelberg Ion Beam Therapy Center in Heidelberg, Germany, and at the Proton Therapy Center of Massachusetts General Hospital (MGH) Boston, USA. These include generation of basic data for input into the treatment planning system (TPS) and validation of the TPS analytical pencil-beam dose computations. Moreover, we review the implementation of PET/CT (Positron-Emission-Tomography/Computed-Tomography) imaging for in-vivo verification of proton therapy at MGH. Here, MC is used to calculate irradiation-induced positron-emitter production in tissue for comparison with the β+-activity measurement in order to infer indirect infor...

  12. Water simulation for cell based sandbox games

    OpenAIRE

    Lundell, Christian

    2014-01-01

    This thesis work presents a new algorithm for simulating fluid based on the Navier-Stokes equations. The algorithm is designed for cell based sandbox games where interactivity and performance are the main priorities. The algorithm enforces mass conservation conservatively instead of enforcing a divergence free velocity field. A global scale pressure model that simulates hydrostatic pressure is used where the pressure propagates between neighboring cells. A prefix sum algorithm is used to only...
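
    The conservative mass-transfer idea described above (moving water between neighbouring cells from a snapshot of the grid, instead of solving for a divergence-free velocity field) can be sketched in 1-D. This toy scheme is a heavily simplified stand-in for the thesis' algorithm; the flow coefficient is arbitrary:

```python
def step(cells, flow=0.5):
    """One update of a toy 1-D cellular water model. Each cell pushes a
    fraction of its surplus over the next cell; transfers are computed
    from a snapshot and applied in matched pairs, so total mass is
    conserved exactly rather than enforcing incompressibility."""
    out = cells[:]
    for i in range(len(cells) - 1):
        surplus = cells[i] - cells[i + 1]
        if surplus > 0:
            dm = min(flow * surplus, cells[i])  # never drain below zero
            out[i] -= dm
            out[i + 1] += dm
    return out

cells = [4.0, 0.0, 0.0, 0.0]     # a column of water dumped in one cell
for _ in range(50):
    cells = step(cells)           # spreads out until levels equalize
```

Because every transfer subtracts and adds the same quantity, the invariant `sum(cells)` holds by construction, which is the interactivity-friendly property the abstract emphasizes.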

  13. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    GENERAL I ARTICLE. Computer Based ... universities, and later did system analysis, ... sonal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  14. Simulation-based medical education in pediatrics.

    Science.gov (United States)

    Lopreiato, Joseph O; Sawyer, Taylor

    2015-01-01

    The use of simulation-based medical education (SBME) in pediatrics has grown rapidly over the past 2 decades and is expected to continue to grow. Similar to other instructional formats used in medical education, SBME is an instructional methodology that facilitates learning. Successful use of SBME in pediatrics requires attention to basic educational principles, including the incorporation of clear learning objectives. To facilitate learning during simulation the psychological safety of the participants must be ensured, and when done correctly, SBME is a powerful tool to enhance patient safety in pediatrics. Here we provide an overview of SBME in pediatrics and review key topics in the field. We first review the tools of the trade and examine various types of simulators used in pediatric SBME, including human patient simulators, task trainers, standardized patients, and virtual reality simulation. Then we explore several uses of simulation that have been shown to lead to effective learning, including curriculum integration, feedback and debriefing, deliberate practice, mastery learning, and range of difficulty and clinical variation. Examples of how these practices have been successfully used in pediatrics are provided. Finally, we discuss the future of pediatric SBME. As a community, pediatric simulation educators and researchers have been a leading force in the advancement of simulation in medicine. As the use of SBME in pediatrics expands, we hope this perspective will serve as a guide for those interested in improving the state of pediatric SBME. Published by Elsevier Inc.

  15. Computational steering of GEM based detector simulations

    Science.gov (United States)

    Sheharyar, Ali; Bouhali, Othmane

    2017-10-01

    Gas based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This may result in inefficient resource utilization and an increase in the turnaround time for the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (or live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.

  16. Simulation-Based Training for Thoracoscopy

    DEFF Research Database (Denmark)

    Bjurström, Johanna Margareta; Konge, Lars; Lehnert, Per

    2013-01-01

    An increasing proportion of thoracic procedures are performed using video-assisted thoracic surgery. This minimally invasive technique places special demands on the surgeons. Using simulation-based training on artificial models or animals has been proposed to overcome the initial part of the learning curve. This study aimed to investigate the effect of simulation-based training and to compare self-guided and educator-guided training.

  17. Hockey lines for simulation-based learning.

    Science.gov (United States)

    Topps, David; Ellaway, Rachel; Kupsh, Christine

    2015-06-01

    Simulation-based health professional education is often limited in accommodating large numbers of students. Most organisations do not have enough simulation suites or staff to support growing demands. We needed to find ways to make simulation sessions more accommodating for larger groups of learners, so that more than a few individuals could be active in a simulation scenario at any one time. Moreover, we needed to make the experience meaningful for all participating learners. We used the metaphor of (ice) hockey lines and substitution 'on the fly' to effectively double the numbers of learners that can be actively engaged at once. Team players must communicate clearly, and observe keenly, so that currently playing members understand what is happening from moment to moment and incoming substitutes can take over their roles seamlessly. We found that this hockey lines approach to simulation-based team scenarios will raise learners' levels of engagement, reinforce good crew resource management (CRM) practices, enhance closed-loop communication, and help learners to understand their cognitive biases and limitations when working in high-pressure situations. During our continuing refinement of the hockey-lines approach, we developed a number of variations on the basic activity model, with various benefits and applications. Both students and teachers have been enthusiastically positive about this approach when it was introduced at our various courses and participating institutions. © 2015 John Wiley & Sons Ltd.

  18. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new product simulation method is described that also employs a real SAR input image for image simulation. This can be denoted as 'image-based simulation'. Different methods to perform this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  19. Using FLUKA to Study Concrete Square Shield Performance in Attenuation of Neutron Radiation Produced by APF Plasma Focus Neutron Source

    Science.gov (United States)

    Nemati, M. J.; Habibi, M.; Amrollahi, R.

    2013-04-01

    In 2010, representatives from the Nuclear Engineering and Physics Department of Amirkabir University of Technology (AUT) requested development of a project with the objective of determining the performance of a concrete shield for their plasma focus used as a neutron source. The project team in the laboratory of the Nuclear Engineering and Physics Department of Amirkabir University of Technology chose several shield shapes in order to study their performance with a Monte Carlo code. In the present work, the capability of the Monte Carlo code FLUKA is explored to model the APF plasma focus and to investigate the neutron fluence on the square concrete shield in each region of the problem. The physical models embedded in FLUKA are mentioned, as well as examples of benchmarking against future experimental data. As a result of this study, a suitable thickness of concrete for shielding the APF is determined.
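
    Before running a full Monte Carlo model of the kind described, a one-group removal cross-section estimate gives a feel for the required concrete thickness. The value of the macroscopic removal cross section below is an assumed handbook-style figure for fast (MeV-range) neutrons in ordinary concrete, not a FLUKA result:

```python
import math

# Assumed macroscopic removal cross section of ordinary concrete for
# fast neutrons, 1/cm (illustrative handbook-style value)
SIGMA_R = 0.089

def transmitted_fraction(thickness_cm):
    """Uncollided fast-neutron fraction after a concrete slab
    (simple exponential removal, no buildup factor)."""
    return math.exp(-SIGMA_R * thickness_cm)

def thickness_for_attenuation(factor):
    """Slab thickness that attenuates the fluence by the given factor."""
    return math.log(factor) / SIGMA_R

t10 = thickness_for_attenuation(10.0)   # one decade of attenuation
```

With this Σ_R, each decade of attenuation costs roughly 26 cm of concrete; a transport code like FLUKA is then needed to account for scattered neutrons, geometry, and the shield's shape, which this point estimate ignores.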

  20. Simulation-based Testing of Control Software

    Energy Technology Data Exchange (ETDEWEB)

    Ozmen, Ozgur [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nutaro, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanyal, Jibonananda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olama, Mohammed M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-10

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model based testing environment; specifically, we show that a complete software stack, including operating system and application software, can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed with the MODELICA programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.
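
    The central idea (exercising control software against a fast virtual prototype instead of hardware, so thousands of scenarios can be asserted on) can be miniaturized as follows. The bang-bang thermostat and one-room thermal model are invented stand-ins for the actual ADEVS/QEMU/FMU setup, with arbitrary coefficients:

```python
def thermostat(temp, cooling_on, setpoint=24.0, band=1.0):
    """Controller under test: bang-bang A/C control with hysteresis."""
    if temp > setpoint + band:
        return True
    if temp < setpoint - band:
        return False
    return cooling_on

def simulate(hours, outdoor=32.0, dt=0.1):
    """Virtual prototype: a first-order thermal model of one room stands
    in for the physical building, so closed-loop behaviour can be
    checked in simulation time, far faster than real time."""
    temp, cooling = 28.0, False
    trace = []
    for _ in range(round(hours / dt)):
        cooling = thermostat(temp, cooling)
        leak = 0.3 * (outdoor - temp)      # heat gain from outdoors, K/h
        cool = -3.0 if cooling else 0.0    # A/C heat extraction, K/h
        temp += dt * (leak + cool)
        trace.append(temp)
    return trace

trace = simulate(24)   # 24 simulated hours in a fraction of a second
```

A test suite would assert properties over the trace (temperature bounds, cycling behaviour, response to altered outdoor profiles) for many parameterizations, which is exactly what is impractical against physical equipment.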

  1. Bridging the gap: simulations meet knowledge bases

    Science.gov (United States)

    King, Gary W.; Morrison, Clayton T.; Westbrook, David L.; Cohen, Paul R.

    2003-09-01

    Tapir and Krill are declarative languages for specifying actions and agents, respectively, that can be executed in simulation. As such, they bridge the gap between strictly declarative knowledge bases and strictly executable code. Tapir and Krill components can be combined to produce models of activity which can answer questions about mechanisms and processes using conventional inference methods and simulation. Tapir was used in DARPA's Rapid Knowledge Formation (RKF) project to construct models of military tactics from the Army Field Manual FM3-90. These were then used to build Courses of Actions (COAs) which could be critiqued by declarative reasoning or via Monte Carlo simulation. Tapir and Krill can be read and written by non-knowledge engineers making it an excellent vehicle for Subject Matter Experts to build and critique knowledge bases.

  2. Agent-based simulation of animal behaviour

    NARCIS (Netherlands)

    C.M. Jonker (Catholijn); J. Treur

    1998-01-01

    In this paper it is shown how animal behaviour can be simulated in an agent-based manner. Different models are shown for different types of behaviour, varying from purely reactive behaviour to pro-active, social and adaptive behaviour. The compositional development method for

  3. Simulation-based summative assessments in surgery.

    Science.gov (United States)

    Szasz, Peter; Grantcharov, Teodor P; Sweet, Robert M; Korndorffer, James R; Pedowitz, Robert A; Roberts, Patricia L; Sachdeva, Ajit K

    2016-09-01

    The American College of Surgeons-Accredited Education Institutes (ACS-AEI) Consortium aims to enhance patient safety and advance surgical education through the use of cutting-edge simulation-based training and assessment methods. The annual ACS-AEI Consortium meeting provides a forum to discuss the latest simulation-based training and assessment methods and includes special panel presentations on key topics. During the 8th annual Consortium, there was a panel presentation on simulation-based summative assessments, during which experiences from across surgical disciplines were presented. The formal presentations were followed by a robust discussion between the conference attendees and the panelists. This report summarizes the panelists' presentations and their ensuing discussion with attendees. The focus of this report is on the basis for and advances in simulation-based summative assessments, the current practices employed across various surgical disciplines, and future directions that may be pursued by the ACS-AEI Consortium. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  5. 2D PIM Simulation Based on COMSOL

    DEFF Research Database (Denmark)

    Wang, Xinbo; Cui, Wanzhao; Wang, Jingyu

    2011-01-01

    Passive intermodulation (PIM) is a problematic type of nonlinear distortion encountered in many communication systems. To analyze the PIM distortion resulting from material nonlinearity, a 2D PIM simulation method based on COMSOL is proposed in this paper. As an example, a rectangular wavegui...

  6. Benchmark of the SixTrack-Fluka Active Coupling Against the SPS Scrapers Burst Test

    CERN Multimedia

    Mereghetti, A; Cerutti, F

    2014-01-01

    The SPS scrapers are a key ingredient for the clean injection into the LHC: they cut off halo particles quite close to the beam core (e.g. 3.5 sigma) just before extraction, to minimise the risk for quenches. The improved beam parameters as envisaged by the LHC Injectors Upgrade (LIU) Project required a revision of the present system, to assess its suitability and robustness. In particular, a burst (i.e. endurance) test of the scraper blades has been carried out, with the whole bunch train being scraped at the centre (worst working conditions). In order to take into account the effect of betatron and longitudinal beam dynamics on energy deposition patterns, and nuclear and Coulomb scattering in the absorbing medium onto loss patterns, the SixTrack and Fluka codes have been coupled, profiting from the best of the refined physical models they respectively embed. The coupling envisages an active exchange of tracked particles between the two codes at each turn, and an on-line aperture check in SixTrack, in order ...

  7. Residual activity evaluation: a benchmark between ANITA, FISPACT, FLUKA and PHITS codes

    Science.gov (United States)

    Firpo, Gabriele; Viberti, Carlo Maria; Ferrari, Anna; Frisoni, Manuela

    2017-09-01

    The activity of residual nuclides dictates the radiation fields in periodic inspections/repairs (maintenance periods) and dismantling operations (decommissioning phase) of accelerator facilities (i.e., medical, industrial, research) and nuclear reactors. Therefore, the correct prediction of the material activation allows for a more accurate planning of the activities, in line with the ALARA (As Low As Reasonably Achievable) principles. The scope of the present work is to show the results of a comparison between residual total specific activity versus a set of cooling time instants (from zero up to 10 years after irradiation) as obtained by two analytical (FISPACT and ANITA) and two Monte Carlo (FLUKA and PHITS) codes, making use of their default nuclear data libraries. A set of 40 irradiating scenarios is considered, i.e. neutron and proton particles of different energies, ranging from zero to many hundreds of MeV, impinging on pure elements or materials of standard composition typically used in industrial applications (namely, AISI SS316 and Portland concrete). In some cases, experimental results were also available for a more thorough benchmark.

  8. Influence of commercial (Fluka) naphthenic acids on acid volatile sulfide (AVS) production and divalent metal precipitation.

    Science.gov (United States)

    McQueen, Andrew D; Kinley, Ciera M; Rodgers, John H; Friesen, Vanessa; Bergsveinson, Jordyn; Haakensen, Monique C

    2016-12-01

    Energy-derived waters containing naphthenic acids (NAs) are complex mixtures often comprising a suite of potentially problematic constituents (e.g. organics, metals, and metalloids) that need treatment prior to beneficial use, including release to receiving aquatic systems. It has previously been suggested that NAs can have biostatic or biocidal properties that could inhibit microbially driven processes (e.g. dissimilatory sulfate reduction) used to transfer or transform metals in passive treatment systems (i.e. constructed wetlands). The overall objective of this study was to measure the effects of a commercially available (Fluka) NA on sulfate-reducing bacteria (SRB), production of sulfides (as acid-volatile sulfides [AVS]), and precipitation of divalent metals (i.e. Cu, Ni, Zn). These endpoints were assessed following 21-d aqueous exposures to NAs using bench-scale reactors. After 21 days, AVS molar concentrations were not statistically different between treatments, and AVS production was sufficient in all NA treatments to achieve ∑SEM:AVS ratios indicating that sulfate reduction (and the resulting AVS) could be used to treat metals occurring in NA-affected waters. Copyright © 2016 Elsevier Inc. All rights reserved.
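    The ∑SEM:AVS criterion referred to in the abstract is a simple molar ratio. A worked example with invented concentrations (not data from the study):

```python
# Illustrative SEM:AVS calculation. Concentrations below are made-up
# example values, not measurements from the study.

# Simultaneously extracted metals (umol per g dry sediment)
sem = {"Cu": 0.8, "Ni": 1.2, "Zn": 2.5}
# Acid-volatile sulfide (umol per g dry sediment)
avs = 9.0

sem_total = sum(sem.values())   # sum of SEM molar concentrations
ratio = sem_total / avs         # the ∑SEM:AVS ratio

# A ratio below 1 indicates enough sulfide to precipitate the
# extracted divalent metals (e.g. as CuS, NiS, ZnS), which is the
# condition assessed after the 21-day exposures.
print(sem_total, round(ratio, 2))
```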

  9. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation

    DEFF Research Database (Denmark)

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki

    2017-01-01

    BACKGROUND: Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities … simulations. DISCUSSION: Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting for simulations does not seem to influence individual and team learning. Department-based local simulation, such as simulation in-house and especially in situ simulation, leads to gains in organisational learning. The overall objectives of simulation-based education and factors …

  10. A web-based virtual lighting simulator

    Energy Technology Data Exchange (ETDEWEB)

    Papamichael, Konstantinos; Lai, Judy; Fuller, Daniel; Tariq, Tara

    2002-05-06

    This paper is about a web-based "virtual lighting simulator," which is intended to allow architects and lighting designers to quickly assess the effect of key parameters on the daylighting and lighting performance in various space types. The virtual lighting simulator consists of a web-based interface that allows navigation through a large database of images and data, which were generated through parametric lighting simulations. In its current form, the virtual lighting simulator has two main modules, one for daylighting and one for electric lighting. The daylighting module includes images and data for a small office space, varying most key daylighting parameters, such as window size and orientation, glazing type, surface reflectance, sky conditions, time of the year, etc. The electric lighting module includes images and data for five space types (classroom, small office, large open office, warehouse and small retail), varying key lighting parameters, such as the electric lighting system, surface reflectance, dimming/switching, etc. The computed images include perspectives and plans and are displayed in various formats to support qualitative as well as quantitative assessment. The quantitative information is in the form of iso-contour lines superimposed on the images, as well as false color images and statistical information on work plane illuminance. The qualitative information includes images that are adjusted to account for the sensitivity and adaptation of the human eye. The paper also includes a section on the major technical issues and their resolution.

  11. Physics-Based Simulations of Natural Hazards

    Science.gov (United States)

    Schultz, Kasey William

    Earthquakes and tsunamis are some of the most damaging natural disasters that we face. Just two recent events, the 2004 Indian Ocean earthquake and tsunami and the 2010 Haiti earthquake, claimed more than 400,000 lives. Despite their catastrophic impacts on society, our ability to predict these natural disasters is still very limited. The main challenge in studying the earthquake cycle is the non-linear and multi-scale properties of fault networks. Earthquakes are governed by physics across many orders of magnitude of spatial and temporal scales: from the scale of tectonic plates and their evolution over millions of years, down to the scale of rock fracturing over milliseconds to minutes at the sub-centimeter scale during an earthquake. Despite these challenges, there are useful patterns in earthquake occurrence. One such pattern, the frequency-magnitude relation, relates the number of large earthquakes to small earthquakes and forms the basis for assessing earthquake hazard. However, the utility of these relations is proportional to the length of our earthquake records, and typical records span at most a few hundred years. Utilizing physics-based interactions and techniques from statistical physics, earthquake simulations provide rich earthquake catalogs, allowing us to measure otherwise unobservable statistics. In this dissertation I will discuss five applications of physics-based simulations of natural hazards, utilizing an earthquake simulator called Virtual Quake. The first is an overview of computing earthquake probabilities from simulations, focusing on the California fault system. The second uses simulations to help guide satellite-based earthquake monitoring methods. The third presents a new friction model for Virtual Quake and describes how we tune simulations to match reality. The fourth describes the process of turning Virtual Quake into an open source research tool. This section then focuses on a resulting collaboration using Virtual Quake for a detailed
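    The frequency-magnitude relation mentioned above is the Gutenberg-Richter law, log10 N(>=M) = a - b*M. A sketch of estimating its b-value from a synthetic catalog using the standard Aki maximum-likelihood estimator; this is a textbook illustration, not code from Virtual Quake.

```python
import math
import random

def synthetic_catalog(n, b=1.0, m_c=2.0, seed=42):
    """Draw magnitudes from the exponential form of Gutenberg-Richter
    above a completeness magnitude m_c (synthetic data only)."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    return [m_c + rng.expovariate(beta) for _ in range(n)]

def b_value(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value:
    b = log10(e) / (mean(M) - m_c)."""
    mean_m = sum(magnitudes) / len(magnitudes)
    return math.log10(math.e) / (mean_m - m_c)

mags = synthetic_catalog(20000, b=1.0, m_c=2.0)
b_hat = b_value(mags, 2.0)
```

    The catalog-length limitation noted in the abstract shows up directly here: the estimator's scatter shrinks only as the square root of the number of recorded events.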

  12. Simulation and case-based learning

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Guralnick, David

    2008-01-01

    This paper has its origin in the authors' reflection on years of practical experiences combined with literature readings in preparation for a workshop on learn-by-doing simulation and case-based learning held at the ICELW 2008 conference (the International Conference on E-Learning in the Workplace). The purpose of this paper is to describe the two online learning methodologies and to raise questions for future discussion. In the workshop, the organizers and participants work with and discuss differences and similarities within the two pedagogical methodologies, focusing on how they are applied in workplace-related and e-learning contexts. In addition to the organizers, a small number of invited presenters will attend, giving demonstrations of their work within learn-by-doing simulation and case-based learning, but still leaving ample time for discussion among all participants.

  13. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation

    NARCIS (Netherlands)

    Sorensen, J.L.; Ostergaard, D.; Leblanc, V.; Ottesen, B.; Konge, L.; Dieckmann, P.; Vleuten, C. van der

    2017-01-01

    BACKGROUND: Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities

  14. Agent Based Modelling for Social Simulation

    OpenAIRE

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course of this project two workshops were organized. At these workshops, a wide range of experts, both ABM experts and domain experts, worked on several potential applications of ABM. The results and ins...

  15. Interactive physically-based sound simulation

    Science.gov (United States)

    Raghuvanshi, Nikunj

    The realization of interactive, immersive virtual worlds requires the ability to present a realistic audio experience that convincingly complements their visual rendering. Physical simulation is a natural way to achieve such realism, enabling deeply immersive virtual worlds. However, physically-based sound simulation is very computationally expensive owing to the high-frequency, transient oscillations underlying audible sounds. The increasing computational power of desktop computers has served to reduce the gap between required and available computation, and it has become possible to bridge this gap further by using a combination of algorithmic improvements that exploit the physical, as well as perceptual properties of audible sounds. My thesis is a step in this direction. My dissertation concentrates on developing real-time techniques for both sub-problems of sound simulation: synthesis and propagation. Sound synthesis is concerned with generating the sounds produced by objects due to elastic surface vibrations upon interaction with the environment, such as collisions. I present novel techniques that exploit human auditory perception to simulate scenes with hundreds of sounding objects undergoing impact and rolling in real time. Sound propagation is the complementary problem of modeling the high-order scattering and diffraction of sound in an environment as it travels from source to listener. I discuss my work on a novel numerical acoustic simulator (ARD) that is a hundred times faster and consumes ten times less memory than a high-accuracy finite-difference technique, allowing acoustic simulations on previously-intractable spaces, such as a cathedral, on a desktop computer. Lastly, I present my work on interactive sound propagation that leverages my ARD simulator to render the acoustics of arbitrary static scenes for multiple moving sources and listener in real time, while accounting for scene-dependent effects such as low-pass filtering and smooth attenuation.
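    A common physically based technique for the impact sounds described above is modal synthesis: an impact excites a bank of exponentially damped sinusoidal modes, and the output is their sum. The sketch below uses invented mode frequencies, dampings and gains, not values from the dissertation.

```python
import math

def modal_impact(modes, sample_rate=44100, duration=0.5):
    """Synthesize an impact as a sum of damped sinusoids.
    modes: list of (frequency_hz, damping_per_s, amplitude)."""
    n = int(sample_rate * duration)
    out = [0.0] * n
    for f, d, a in modes:
        w = 2.0 * math.pi * f
        for i in range(n):
            t = i / sample_rate
            out[i] += a * math.exp(-d * t) * math.sin(w * t)
    return out

# Three modes of a small struck object (illustrative values).
samples = modal_impact([(440.0, 8.0, 1.0), (990.0, 12.0, 0.5),
                        (1760.0, 20.0, 0.25)])
```

    The perceptual optimizations mentioned in the abstract would prune or merge modes that listeners cannot distinguish, which is what makes hundreds of sounding objects tractable in real time.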

  16. Physiological Based Simulator Fidelity Design Guidance

    Science.gov (United States)

    Schnell, Thomas; Hamel, Nancy; Postnikov, Alex; Hoke, Jaclyn; McLean, Angus L. M. Thom, III

    2012-01-01

    The evolution of the role of flight simulation has reinforced assumptions in aviation that the degree of realism in a simulation system directly correlates to the training benefit, i.e., more fidelity is always better. The construct of fidelity has several dimensions, including physical fidelity, functional fidelity, and cognitive fidelity. Interaction of different fidelity dimensions has an impact on trainee immersion, presence, and transfer of training. This paper discusses research results of a recent study that investigated if physiological-based methods could be used to determine the required level of simulator fidelity. Pilots performed a relatively complex flight task consisting of mission task elements of various levels of difficulty in a fixed base flight simulator and a real fighter jet trainer aircraft. Flight runs were performed using one forward visual channel of 40 deg. field of view for the lowest level of fidelity, 120 deg. field of view for the middle level of fidelity, and unrestricted field of view and full dynamic acceleration in the real airplane. Neuro-cognitive and physiological measures were collected under these conditions using the Cognitive Avionics Tool Set (CATS) and nonlinear closed form models for workload prediction were generated based on these data for the various mission task elements. One finding of the work described herein is that simple heart rate is a relatively good predictor of cognitive workload, even for short tasks with dynamic changes in cognitive loading. Additionally, we found that models that used a wide range of physiological and neuro-cognitive measures can further boost the accuracy of the workload prediction.

  17. Simulation-based instruction of technical skills

    Science.gov (United States)

    Towne, Douglas M.; Munro, Allen

    1991-01-01

    A rapid intelligent tutoring development system (RAPIDS) was developed to facilitate the production of interactive, real-time graphical device models for use in instructing the operation and maintenance of complex systems. The tools allowed subject matter experts to produce device models by creating instances of previously defined objects and positioning them in the emerging device model. These simulation authoring functions, as well as those associated with demonstrating procedures and functional effects on the completed model, required no previous programming experience or use of frame-based instructional languages. Three large simulations were developed in RAPIDS, each involving more than a dozen screen-sized sections. Seven small, single-view applications were developed to explore the range of applicability. Three workshops were conducted to train others in the use of the authoring tools. Participants learned to employ the authoring tools in three to four days and were able to produce small working device models on the fifth day.

  18. Optimizing a Water Simulation based on Wavefront Parameter Optimization

    OpenAIRE

    Lundgren, Martin

    2017-01-01

    DICE, a Swedish game company, wanted a more realistic water simulation. Currently, most large scale water simulations used in games are based upon ocean simulation technology. These techniques falter when used in other scenarios, such as coastlines. In order to produce a more realistic simulation, a new one was created based upon the water simulation technique "Wavefront Parameter Interpolation". This technique involves a rather extensive preprocess that enables ocean simulations to have inte...

  19. Nuclear Power Reactor simulator - based training program

    International Nuclear Information System (INIS)

    Abdelwahab, S.A.S.

    2009-01-01

    Nuclear power stations will continue to play a major role as an energy source for electricity generation and heat production in the world. In this paper, a nuclear power reactor simulator-based training program is presented. This program is designed to aid in training reactor operators in the principles of operation of the plant. It could also help researchers and designers to analyse and estimate the performance of nuclear reactors, and facilitate further studies on selection of the proper controller and its optimization, as it is difficult and time consuming to do all experiments in a real nuclear environment. The program is written in MATLAB, as MATLAB provides sophisticated tools, comparable to those in other software such as Visual Basic, for the creation of graphical user interfaces (GUIs); moreover, MATLAB is available for all major operating systems. The SIMULINK reactor model can be used to model different reactor types by adopting appropriate parameters. The model of each component of the reactor is based on physical laws rather than on look-up tables or curve fitting. This simulation-based training program will improve knowledge acquisition and retention; trainees will also learn faster and have a better attitude.

  20. Simulation-based disassembly systems design

    Science.gov (United States)

    Ohlendorf, Martin; Herrmann, Christoph; Hesselbach, Juergen

    2004-02-01

    Recycling of Waste Electrical and Electronic Equipment (WEEE) is a matter of current concern, driven by economic, ecological and legislative reasons. Here, disassembly as the first step of the treatment process plays a key role. To achieve sustainable progress in WEEE disassembly, the key is not to limit analysis and planning to disassembly processes in a narrow sense, but to consider entire disassembly plants, including additional aspects such as internal logistics, storage, sorting etc. as well. In this regard, the paper presents ways of designing, dimensioning, structuring and modeling different disassembly systems. The goal is to achieve efficient and economic disassembly systems that allow recycling processes complying with legal requirements. Moreover, advantages of applying simulation software tools that are widespread and successfully utilized in conventional industry sectors are addressed. They support systematic disassembly planning by means of simulation experiments with consecutive efficiency evaluation. Consequently, anticipatory recycling planning considering various scenarios is enabled, and decisions about which types of disassembly systems are appropriate for specific circumstances, such as product spectrum, throughput, disassembly depth etc., are supported. Furthermore, integration of simulation-based disassembly planning into a holistic concept, with configuration of interfaces and data utilization including cost aspects, is described.

  1. Simulation-based optimization of thermal systems

    International Nuclear Information System (INIS)

    Jaluria, Yogesh

    2009-01-01

    This paper considers the design and optimization of thermal systems on the basis of the mathematical and numerical modeling of the system. Many complexities are often encountered in practical thermal processes and systems, making the modeling challenging and involved. These include property variations, complicated regions, combined transport mechanisms, chemical reactions, and intricate boundary conditions. The paper briefly presents approaches that may be used to accurately simulate these systems. Validation of the numerical model is a particularly critical aspect and is discussed. It is important to couple the modeling with the system performance, design, control and optimization. This aspect, which has often been ignored in the literature, is considered in this paper. Design of thermal systems based on concurrent simulation and experimentation is also discussed in terms of dynamic data-driven optimization methods. Optimization of the system and of the operating conditions is needed to minimize costs and improve product quality and system performance. Different optimization strategies that are currently used for thermal systems are outlined, focusing on new and emerging strategies. Of particular interest is multi-objective optimization, since most thermal systems involve several important objective functions, such as heat transfer rate and pressure in electronic cooling systems. A few practical thermal systems are considered in greater detail to illustrate these approaches and to present typical simulation, design and optimization results.
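    Multi-objective optimization, singled out above, searches for the set of non-dominated (Pareto-optimal) designs rather than a single optimum. A minimal sketch with invented design points, assuming both objectives are to be minimized:

```python
def pareto_front(points):
    """Return the points not dominated by any other point.
    Each point is (objective1, objective2), both minimized;
    a point is dominated if another point is at least as good
    in both objectives."""
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p for q in points
        )
        if not dominated:
            front.append(p)
    return front

# (negative heat-transfer rate, pressure drop) for candidate cooling
# designs, both minimized -- the values are illustrative only.
designs = [(-10.0, 5.0), (-8.0, 3.0), (-12.0, 9.0),
           (-9.0, 4.0), (-7.0, 8.0)]
front = pareto_front(designs)
```

    The remaining front is the trade-off curve a designer would inspect: improving heat transfer beyond any front point necessarily costs pressure drop.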

  2. SIMULATION OF SUBGRADE EMBANKMENT ON WEAK BASE

    Directory of Open Access Journals (Sweden)

    V. D. Petrenko

    2015-08-01

    Full Text Available. Purpose. The paper considers the stability of the subgrade on a weak base and proposes strengthening by jet grouting. It investigates how a weak base affects the overall deformation of the subgrade, and identifies and optimizes subgrade parameters through numerical simulation. Methodology. Theoretical studies of the stress-strain state of the base and subgrade embankment were conducted by modelling in the LIRA software package. Findings. After the necessary calculations, settlement fields, the boundaries of the compressed thickness, and the Winkler and Pasternak bed coefficients were obtained. Diagrams of vertical stress can be constructed at any point of load application, and the software also allows assessment of settlement and track tilt on natural and consolidated bases. Originality. For weak soils, the most appropriate model is a nonlinear base model with both elastic and limit-equilibrium zones, i.e. a mixed problem of the theory of elasticity and plasticity. Practical value. As the load on a weak base increases, for example through construction of a second track, additional embankment, or higher axle loads from new rolling stock, settlement and consolidation may resume. One feasible and promising option for the design and reconstruction of embankments on weak bases is therefore strengthening the base by jet grouting. With the expansion of railway infrastructure and the increasing speed and weight of rolling stock, the stability of the subgrade on weak bases must be ensured; the LIRA package allows all the necessary calculations for selecting a proper way of strengthening weak bases.

  3. MCNPX simulation of proton dose distribution in homogeneous and CT phantoms

    International Nuclear Information System (INIS)

    Lee, C.C.; Lee, Y.J.; Tung, C.J.; Cheng, H.W.; Chao, T.C.

    2014-01-01

    A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distribution in homogeneous and CT phantoms. Conversion from Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. In order to validate this simulation system, an inter-comparison of depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom was performed. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study took the approach of indirectly comparing the 50% ranges (R50%) along the central axis from our system to the NIST CSDA ranges for beams with 160 and 115 MeV energies. The comparison within the homogeneous phantom shows good agreement. Differences of simulated R50% among the three codes are less than 1 mm. For results within the CT phantom, the MCNPX simulated water equivalent Req,50% are compatible with the CSDA water equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for 160 and 115 MeV beams, respectively. - Highlights: ► Proton dose simulation based on the MCNPX 2.6.0 in homogeneous and CT phantoms. ► CT number (HU) conversion to electron density based on Schneider's approach. ► Good agreement among MCNPX, GEANT4 and FLUKA codes in a homogeneous water phantom. ► Water equivalent R50% in CT phantoms are compatible with those of the NIST database
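    The HU-to-material step referenced above can be illustrated with a piecewise-linear map from CT number to mass density, in the spirit of the Schneider approach. The breakpoints below are illustrative stand-ins, not the published calibration.

```python
def hu_to_density(hu):
    """Map a CT number (HU) to a mass density in g/cm^3 using an
    illustrative piecewise-linear calibration (not Schneider's
    published values)."""
    table = [  # (hu_low, hu_high, density_low, density_high)
        (-1000, -98, 0.00121, 0.93),   # air to adipose
        (-98,    14, 0.93,    1.03),   # soft tissue
        (14,    100, 1.03,    1.10),   # muscle-like
        (100,  1600, 1.10,    1.96),   # bone ramp
    ]
    hu = max(-1000, min(hu, 1600))     # clamp to the table range
    for lo, hi, d_lo, d_hi in table:
        if hu <= hi:
            frac = (hu - lo) / (hi - lo)
            return d_lo + frac * (d_hi - d_lo)
    return table[-1][3]

densities = [hu_to_density(h) for h in (-1000, 0, 1600)]
```

    A full implementation would additionally bin the HU range into material compositions (elemental weight fractions), since Monte Carlo transport needs both density and composition.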

  4. Validation of FLUKA calculated cross-sections for radioisotope production in proton-on-target collisions at proton energies around 1 GeV

    CERN Document Server

    Felcini, M

    2006-01-01

    The production cross-sections of several radioisotopes induced by 1 GeV protons impinging on different target materials have been calculated using the FLUKA Monte Carlo and compared to measured cross-sections. The emphasis of this study is on the production of alpha and beta/gamma emitters of interest for activation evaluations at a research complex, such as the EURISOL complex, using several MW power proton driver at an energy of 1 GeV. The comparisons show that in most of the cases of interest for such evaluations, the FLUKA Monte Carlo reproduces radioisotope production cross-sections within less than a factor of two with respect to the measured values. This result implies that the FLUKA calculations are adequately accurate for proton induced activation estimates at a 1 GeV high power proton driver complex.

  5. Agent-Based Simulations for Project Management

    Science.gov (United States)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
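    The "duration is an input" flaw cited above is easiest to see in the basic CPM forward pass, where each task's fixed duration propagates directly into the project duration. A sketch on a small invented task network:

```python
def cpm_earliest(tasks):
    """Basic CPM forward pass. tasks: {name: (duration, [predecessors])}.
    Returns the earliest finish time of each task, assuming the task
    graph is acyclic."""
    finish = {}
    def ef(name):
        if name not in finish:
            dur, preds = tasks[name]
            finish[name] = dur + max((ef(p) for p in preds), default=0)
        return finish[name]
    for name in tasks:
        ef(name)
    return finish

tasks = {             # duration, predecessors (illustrative project)
    "design":  (3, []),
    "build":   (5, ["design"]),
    "test":    (2, ["build"]),
    "docs":    (4, ["design"]),
    "release": (1, ["test", "docs"]),
}
finish = cpm_earliest(tasks)
project_duration = max(finish.values())
```

    A resource-based model of the kind the abstract describes would instead derive each duration from assigned resources and productivity, so that management actions change the schedule rather than being invisible to it.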

  6. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation.

    Science.gov (United States)

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki; Ottesen, Bent; Konge, Lars; Dieckmann, Peter; Van der Vleuten, Cees

    2017-01-21

    Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities are called in-house training. In-house training facilities can be part of hospital departments and resemble to some extent simulation centres but often have less technical equipment. In situ simulation, introduced over the past decade, mainly comprises team-based activities and occurs in patient care units with healthcare professionals in their own working environment. Thus, this intentional blend of simulation and real working environments means that in situ simulation brings simulation to the real working environment and provides training where people work. In situ simulation can be either announced or unannounced, the latter also known as a drill. This article presents and discusses the design of SBME and the advantages and disadvantages of the different simulation settings, such as training in simulation centres, in-house simulations in hospital departments, and announced or unannounced in situ simulations. Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting does not seem to influence individual or team learning. However, hospital department-based simulations, such as in-house simulation and in situ simulation, lead to a gain in organisational learning. To our knowledge no studies have compared announced and unannounced in situ simulation. The literature suggests some improved organisational learning from unannounced in situ simulation; however, unannounced in situ simulation was also found to be challenging to plan and conduct, and more stressful among participants.
The importance of

  7. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  8. Designing solar thermal experiments based on simulation

    International Nuclear Information System (INIS)

    Huleihil, Mahmoud; Mazor, Gedalya

    2013-01-01

    In this study, three different models describing the temperature distribution inside a cylindrical solid body subjected to high solar irradiation were examined, beginning with the simplest approach, the single-dimension lump system (time), progressing through the two-dimensional distributed system approach (time and vertical direction), and ending with the three-dimensional distributed system approach with azimuthal symmetry (time, vertical direction, and radial direction). The three models were introduced and solved analytically and numerically. The importance of the models and their solutions was addressed. Simulations based on them may be considered a powerful tool in designing experiments, as they make it possible to estimate the different effects of the parameters involved in these models.
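
    The simplest of the three models above, the lump system, reduces to a single energy-balance ODE with a closed-form solution. The sketch below is a minimal illustration of that model; the function name and all parameter values (absorbed flux, heat transfer coefficient, area, mass, specific heat) are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def lumped_temperature(t, q_in=5.0e4, h=15.0, A=0.03, T_inf=300.0,
                           m=0.5, c=900.0, T0=300.0):
        """Analytic solution of the lump-system energy balance
        m*c*dT/dt = q_in*A - h*A*(T - T_inf): an exponential approach to
        steady state with time constant tau = m*c/(h*A)."""
        tau = m * c / (h * A)          # thermal time constant (s)
        T_ss = T_inf + q_in / h        # steady-state temperature (K)
        return T_ss + (T0 - T_ss) * np.exp(-t / tau)
    ```

    Because the model is linear, sweeping a design parameter such as h or q_in only requires re-evaluating tau and T_ss, which is what makes the lump system attractive for quick experiment design before moving to the distributed models.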

  9. Simulation based engineering in solid mechanics

    CERN Document Server

    Rao, J S

    2017-01-01

    This book begins with a brief historical perspective on the advent of rotating machinery in 20th century Solid Mechanics and the development of the discipline of the Strength of Materials. High Performance Computing (HPC) and Simulation Based Engineering Science (SBES) have gradually replaced the conventional approach in design, bringing science directly into engineering without approximations. A recap of the required mathematical principles is given. The science of deformation, strain and stress at a point under the application of external traction loads is presented next. Only one-dimensional structures, classified as Bars (axial loads), Rods (twisting loads) and Beams (bending loads), are considered in this book. The principal stresses and strains and the von Mises stress and strain that are used in the design of structures are presented next. A Lagrangian solution was used to derive the governing differential equations consistent with the assumed deformation field, and solutions for deformations, strains and stresses were obtai...

  10. Simulation-based education for transfusion medicine.

    Science.gov (United States)

    Morgan, Shanna; Rioux-Masse, Benjamin; Oancea, Cristina; Cohn, Claudia; Harmon, James; Konia, Mojca

    2015-04-01

    The administration of blood products is frequently determined by physicians without subspecialty training in transfusion medicine (TM). Education in TM is necessary for appropriate utilization of resources and maintaining patient safety. Our institution developed an efficient simulation-based TM course with the goal of identifying key topics that could be individualized to learners of all levels in various environments while also allowing for practice in an environment where the patient is not placed at risk. A 2.5-hour simulation-based educational activity was designed and taught to undergraduate medical students rotating through anesthesiology and TM elective rotations and to all Clinical Anesthesia Year 1 (CA-1) residents. Content and process evaluation of the activity consisted of multiple-choice tests and course evaluations. Seventy medical students and seven CA-1 residents were enrolled in the course. There was no significant difference on pretest results between medical students and CA-1 residents. The posttest results for both medical students and CA-1 residents were significantly higher than pretest results. The results of the posttest between medical students and CA-1 residents were not significantly different. The TM knowledge gap is not a trivial problem as transfusion of blood products is associated with significant risks. Innovative educational techniques are needed to address the ongoing challenges with knowledge acquisition and retention in already full curricula. Our institution developed a feasible and effective way to integrate TM into the curriculum. Educational activities, such as this, might be a way to improve the safety of transfusions. © 2014 AABB.

  11. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard
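
    The core of any such execution engine is a scheduler that advances every agent once per tick. The sketch below shows the sequential baseline (parallel engines would partition the agent set across workers); the wealth-exchange rule and all names are illustrative assumptions, not part of the GRAMS model.

    ```python
    import random

    class Agent:
        """Toy agent: holds integer wealth and gives one unit away per tick."""
        def __init__(self, wealth=1):
            self.wealth = wealth

        def step(self, agents, rng):
            if self.wealth > 0:
                rng.choice(agents).wealth += 1   # give to a random agent
                self.wealth -= 1

    def run(n_agents=50, n_steps=100, seed=0):
        rng = random.Random(seed)
        agents = [Agent() for _ in range(n_agents)]
        for _ in range(n_steps):        # one tick: every agent acts once
            for a in agents:
                a.step(agents, rng)
        return agents
    ```

    Separating the Agent class from the run loop mirrors the book's distinction between the model description and the simulation engine that executes it.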

  12. Multibus-based parallel processor for simulation

    Science.gov (United States)

    Ogrady, E. P.; Wang, C.-H.

    1983-01-01

    A Multibus-based parallel processor simulation system is described. The system is intended to serve as a vehicle for gaining hands-on experience, testing system and application software, and evaluating parallel processor performance during development of a larger system based on the horizontal/vertical-bus interprocessor communication mechanism. The prototype system consists of up to seven Intel iSBC 86/12A single-board computers which serve as processing elements, a multiple transmission controller (MTC) designed to support system operation, and an Intel Model 225 Microcomputer Development System which serves as the user interface and input/output processor. All components are interconnected by a Multibus/IEEE 796 bus. An important characteristic of the system is that it provides a mechanism for a processing element to broadcast data to other selected processing elements. This parallel transfer capability is provided through the design of the MTC and a minor modification to the iSBC 86/12A board. The operation of the MTC, the basic hardware-level operation of the system, and pertinent details about the iSBC 86/12A and the Multibus are described.

  13. Agent Programming Languages and Logics in Agent-Based Simulation

    DEFF Research Database (Denmark)

    Larsen, John

    2018-01-01

    and social behavior, and work on verification. Agent-based simulation is an approach for simulation that also uses the notion of agents. Although agent programming languages and logics are much less used in agent-based simulation, there are successful examples with agents designed according to the BDI...

  14. Computer-Based Simulation Games in Public Administration Education

    OpenAIRE

    Kutergina Evgeniia

    2017-01-01

    Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently...

  15. Simulation-Based System Design Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The research objective is to develop, test, and implement effective and efficient simulation techniques for modeling, evaluating, and optimizing systems in order to...

  16. Simulation-Based Testing of Distributed Systems

    National Research Council Canada - National Science Library

    Rutherford, Matthew J; Carzaniga, Antonio; Wolf, Alexander L

    2006-01-01

    .... Typically written using an imperative programming language, these simulations capture basic algorithmic functionality at the same time as they focus attention on properties critical to distribution...

  17. Simulation-based training for colonoscopy

    DEFF Research Database (Denmark)

    Preisler, Louise; Svendsen, Morten Bo Søndergaard; Nerup, Nikolaj

    2015-01-01

    in colonoscopy before practicing on patients. Twenty-five physicians (10 consultants with endoscopic experience and 15 fellows with very little endoscopic experience) were tested on 2 different simulator models: a virtual-reality simulator and a physical model. Tests were repeated twice on each simulator model...... on both the models (P virtual-reality and the physical model, respectively. The established pass/fail standards failed one of the consultants (virtual-reality simulator) and allowed one fellow to pass (physical model). The 2 tested...

  18. The simulation of CAMAC system based on Windows API

    International Nuclear Information System (INIS)

    Li Lei; Song Yushou; Xi Yinyin; Yan Qiang; Liu Huilan; Li Taosheng

    2012-01-01

    Based on the Windows API, a design method for simulating the CAMAC system, which is commonly used in nuclear physics experiments, is developed. Using C++ object-oriented programming, the simulation is carried out in the Visual Studio 2010 environment, and the interfaces, the dataway, the control commands and the modules are simulated with functions either user-defined or from the Windows API. Applying this method, the amplifier module AMP575A produced by ORTEC is simulated and performance experiments are carried out on the simulation module. The results indicate that the simulation module can fulfill the function of pole-zero adjustment, which shows that this method is adequate for simulating the CAMAC system. Compared with simulation based on LabVIEW, this approach is more flexible and closer to the bottom layer of the system. This work paves the way towards a virtual instrument platform based on the CAMAC system. (authors)

  19. Response of a BGO detector to photon and neutron sources simulations and measurements

    CERN Document Server

    Vincke, H H; Fabjan, Christian Wolfgang; Otto, T

    2002-01-01

    In this paper Monte Carlo simulations (FLUKA) and measurements of the response of a BGO detector are reported. For the measurements, three low-energy photon emitters (60Co, 54Mn, 137Cs) were used to irradiate the BGO from various distances and angles. The neutron response was measured with an Am-Be neutron source. Simulations of the experimental irradiations were carried out. Our study can also be considered as a benchmark for FLUKA in terms of its reliability to predict the detector response of a BGO scintillator.

  20. Competency-Based Training and Simulation: Making a "Valid" Argument.

    Science.gov (United States)

    Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M

    2018-02-01

    The use of simulation as an assessment tool is much more controversial than its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to an absence of awareness rather than an absence of comprehension. The following review article provides the urologic community with an updated taxonomy on validity theory as it relates to simulation-based training and assessments, and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, into which we translate the literature evidence, considers validity as a unitary construct with a focus on the interpretation of simulator data/scores.

  1. Simulation-based assessment for construction helmets.

    Science.gov (United States)

    Long, James; Yang, James; Lei, Zhipeng; Liang, Daan

    2015-01-01

    In recent years, there has been a concerted effort for greater job safety in all industries. Personal protective equipment (PPE) has been developed to help mitigate the risk of injury to humans exposed to hazardous situations. The human head is the most vulnerable to impact, as an impact of moderate magnitude can cause serious injury or death. That is why industries have required the use of an industrial hard hat or helmet. Only a few articles published to date have focused on the risk of head injury when wearing an industrial helmet, and a full understanding of the effectiveness of construction helmets in reducing injury is lacking. This paper presents a simulation-based method to determine the threshold at which a human will sustain injury when wearing a construction helmet and assesses the risk of injury for wearers of construction helmets or hard hats. Advanced finite element (FE) models were developed to study impacts on construction helmets. The FE model consists of two parts: the helmet and the human model. The human model consists of a brain enclosed by a skull and an outer layer of skin. The level and probability of injury to the head were determined using both the head injury criterion (HIC) and the tolerance limits set by Deck and Willinger. The HIC has been widely used to assess the likelihood of head injury in vehicles. The tolerance levels proposed by Deck and Willinger are better suited for finite element models but lack wide-scale validation. Different impact cases were studied using LSTC's LS-DYNA.
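
    The HIC mentioned above has a standard definition: the maximum, over all time windows [t1, t2] no longer than a set limit (15 ms for HIC15), of (t2 - t1) times the window-mean acceleration in g raised to the power 2.5. A brute-force sketch of that computation follows; the sampled trace and the helper name are illustrative, not from the paper.

    ```python
    import numpy as np

    def hic(t, a, max_window=0.015):
        """Head Injury Criterion for an acceleration trace a (in g) sampled
        at times t (in s): max over windows [t1, t2] no longer than
        max_window of (t2 - t1) * (mean acceleration)**2.5."""
        # cumulative trapezoidal integral of a(t), so window means are O(1)
        ca = np.concatenate(([0.0],
                             np.cumsum(0.5 * (a[1:] + a[:-1]) * np.diff(t))))
        best = 0.0
        for i in range(len(t) - 1):
            for j in range(i + 1, len(t)):
                dt = t[j] - t[i]
                if dt > max_window:
                    break
                mean_a = (ca[j] - ca[i]) / dt
                if mean_a > 0.0:
                    best = max(best, dt * mean_a ** 2.5)
        return best
    ```

    For a constant 100 g pulse lasting the full 15 ms window this evaluates to 0.015 * 100**2.5 = 1500, a value well above the commonly cited injury thresholds.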

  2. An Agent-Based Simulation Model for Organizational Analysis

    National Research Council Canada - National Science Library

    Ruan, Sui; Gokhale, Swapna S; Pattipati, Krishna R

    2006-01-01

    In many fields, including engineering, management, and organizational science, simulation-based computational organization theory has been used to gain insight into the degree of match ("congruence...

  3. An Example-Based Brain MRI Simulation Framework.

    Science.gov (United States)

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L

    2015-02-21

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms, such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from a hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on a statistical model of the training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
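
    The patch-based idea can be illustrated with a deliberately tiny nearest-neighbour variant: pair each atlas pixel's label patch with its intensity, then synthesize a new image by looking up the closest label patch. The paper's actual regression is more elaborate; this 1-NN version, the function names, and the toy images are illustrative assumptions.

    ```python
    import numpy as np

    def patches(img, r=1):
        """Flattened (2r+1)x(2r+1) neighbourhood of every pixel, edge-padded."""
        p = np.pad(img, r, mode="edge")
        H, W = img.shape
        shifts = [p[i:i + H, j:j + W]
                  for i in range(2 * r + 1) for j in range(2 * r + 1)]
        return np.stack(shifts, axis=-1).reshape(H * W, -1).astype(float)

    def simulate_mri(atlas_labels, atlas_mri, new_labels):
        """Predict each new pixel's intensity from the atlas pixel whose
        label patch is closest (brute-force 1-NN, fine for toy images)."""
        train, query = patches(atlas_labels), patches(new_labels)
        d = ((query[:, None, :] - train[None, :, :]) ** 2).sum(axis=-1)
        nearest = d.argmin(axis=1)
        return atlas_mri.ravel()[nearest].reshape(new_labels.shape)
    ```

    Because the mapping is learned from label-patch context rather than a physical acquisition model, changing the atlas changes the simulated contrast, which is the behaviour the paper evaluates.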

  4. Simulation-based training for thoracoscopic lobectomy

    DEFF Research Database (Denmark)

    Jensen, Katrine; Ringsted, Charlotte; Hansen, Henrik Jessen

    2014-01-01

    overcome the first part of the learning curve, but no virtual-reality simulators for thoracoscopy are commercially available. This study aimed to investigate whether training on a laparoscopic simulator enables trainees to perform a thoracoscopic lobectomy. METHODS: Twenty-eight surgical residents were...... randomized to either virtual-reality training on a nephrectomy module or traditional black-box simulator training. After a retention period they performed a thoracoscopic lobectomy on a porcine model and their performance was scored using a previously validated assessment tool. RESULTS: The groups did...... not differ in age or gender. All participants were able to complete the lobectomy. The performance of the black-box group was significantly faster during the test scenario than the virtual-reality group: 26.6 min (SD 6.7 min) versus 32.7 min (SD 7.5 min). No difference existed between the two groups when...

  5. Agent-based simulation of animal behaviour

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J

    In the biological literature on animal behaviour, in addition to real experiments and field studies, simulation experiments are also a useful source of progress. Often specific mathematical modelling techniques are adopted and directly implemented in a programming language. Modelling more complex

  6. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  7. Agent-based Simulation of the Maritime Domain

    Directory of Open Access Journals (Sweden)

    O. Vaněk

    2010-01-01

    Full Text Available In this paper, a multi-agent based simulation platform is introduced that focuses on legitimate and illegitimate aspects of maritime traffic, mainly on intercontinental transport through piracy afflicted areas. The extensible architecture presented here comprises several modules controlling the simulation and the life-cycle of the agents, analyzing the simulation output and visualizing the entire simulated domain. The simulation control module is initialized by various configuration scenarios to simulate various real-world situations, such as a pirate ambush, coordinated transit through a transport corridor, or coastal fishing and local traffic. The environmental model provides a rich set of inputs for agents that use the geo-spatial data and the vessel operational characteristics for their reasoning. The agent behavior model based on finite state machines together with planning algorithms allows complex expression of agent behavior, so the resulting simulation output can serve as a substitution for real world data from the maritime domain.
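
    A finite-state-machine behavior model of the kind described above boils down to a transition table evaluated every simulation step. The two-state vessel below is a sketch of that pattern; the state names and the evasion rule are illustrative assumptions, not the platform's actual code.

    ```python
    class VesselAgent:
        """Minimal FSM agent: a transport vessel that evades nearby pirates."""
        def __init__(self):
            self.state = "TRANSIT"

        def step(self, pirate_nearby):
            # transition rules: state + observation -> next state
            if self.state == "TRANSIT" and pirate_nearby:
                self.state = "EVADE"
            elif self.state == "EVADE" and not pirate_nearby:
                self.state = "TRANSIT"
            return self.state
    ```

    Richer behaviors (corridor transit, fishing, ambush) add states and guard conditions to the same table, which is what keeps FSM-based agent models easy to extend and inspect.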

  8. Budget Time: A Gender-Based Negotiation Simulation

    Science.gov (United States)

    Barkacs, Linda L.; Barkacs, Craig B.

    2017-01-01

    This article presents a gender-based negotiation simulation designed to make participants aware of gender-based stereotypes and their effect on negotiation outcomes. In this simulation, the current research on gender issues is animated via three role sheets: (a) Vice president (VP), (b) advantaged department head, and (c) disadvantaged department…

  9. PCISIM - A Simulation Tool for PCI Bus Based Systems

    DEFF Research Database (Denmark)

    Sharp, Robin

    1999-01-01

    This document describes a PCI bus simulator for use in evaluating the feasibility of system designs based on this bus.

  10. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan

    2011-01-01

    In this work, we describe an automated method for directing the control of a high resolution gaseous fluid simulation based on the results of a lower resolution preview simulation. Small variations in accuracy between low and high resolution grids can lead to divergent simulations, which is problematic for those wanting to achieve a desired behavior. Our goal is to provide a simple method for ensuring that the high resolution simulation matches key properties from the lower resolution simulation. We first let a user specify a fast, coarse simulation that will be used for guidance. Our automated method samples the data to be matched at various positions and scales in the simulation, or allows the user to identify key portions of the simulation to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems, and can ensure consistency of not only the velocity field, but also advected scalar values. Because the final simulation is naturally similar to the preview simulation, only minor controlling adjustments are needed, allowing a simpler control method than that used in prior keyframing approaches. Copyright © 2011 by the Association for Computing Machinery, Inc.

  11. Determining procedures for simulation-based training in radiology

    DEFF Research Database (Denmark)

    Nayahangan, Leizl Joy; Nielsen, Kristina Rue; Albrecht-Beste, Elisabeth

    2018-01-01

    , and basic abdominal ultrasound. CONCLUSION: A needs assessment identified and prioritized 13 technical procedures to include in a simulation-based curriculum. The list may be used as guide for development of training programs. KEY POINTS: • Simulation-based training can supplement training on patients......OBJECTIVES: New training modalities such as simulation are widely accepted in radiology; however, development of effective simulation-based training programs is challenging. They are often unstructured and based on convenience or coincidence. The study objective was to perform a nationwide needs...... assessment to identify and prioritize technical procedures that should be included in a simulation-based curriculum. METHODS: A needs assessment using the Delphi method was completed among 91 key leaders in radiology. Round 1 identified technical procedures that radiologists should learn. Round 2 explored...

  12. Simulation Based Optimization for World Line Card Production System

    Directory of Open Access Journals (Sweden)

    Sinan APAK

    2012-07-01

    Full Text Available A simulation-based decision support system is one of the commonly used tools for examining complex production systems. The simulation approach provides process modules which can be adjusted with certain parameters, using data relatively easily obtainable from the production process. A World Line Card production system simulation is developed to evaluate the optimality of the existing production line using a discrete event simulation model with a variety of alternative proposals. The current production system is analysed by a simulation model, highlighting the bottlenecks and the poorly utilized production line. Our analysis identified some improvements and efficient solutions for the existing system.
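
    The kind of discrete event model used for such a line can be illustrated, at its smallest, with a single station: jobs arrive at random, queue for one server, and the server's utilization indicates whether the station is a bottleneck. All rates and names below are illustrative assumptions, not the World Line Card data.

    ```python
    import random

    def simulate_station(n_jobs=2000, interarrival=1.0, service=0.8, seed=42):
        """Single-station discrete event model: exponential interarrival and
        service times, one server; returns the server's utilization."""
        rng = random.Random(seed)
        t_arrive = t_free = busy = 0.0
        for _ in range(n_jobs):
            t_arrive += rng.expovariate(1.0 / interarrival)
            start = max(t_arrive, t_free)   # job waits while the server is busy
            s = rng.expovariate(1.0 / service)
            busy += s
            t_free = start + s
        return busy / t_free
    ```

    A full line model chains such stations; the station whose utilization approaches 1.0 is the bottleneck the abstract refers to.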

  13. Developing a Theory-Based Simulation Educator Resource.

    Science.gov (United States)

    Thomas, Christine M; Sievers, Lisa D; Kellgren, Molly; Manning, Sara J; Rojas, Deborah E; Gamblian, Vivian C

    2015-01-01

    The NLN Leadership Development Program for Simulation Educators 2014 faculty development group identified a lack of a common language/terminology to outline the progression of expertise of simulation educators. The group analyzed Benner's novice-to-expert model and applied its levels of experience to simulation educator growth. It established common operational categories of faculty development and used them to organize resources that support progression toward expertise. The resulting theory-based Simulator Educator Toolkit outlines levels of ability and provides quality resources to meet the diverse needs of simulation educators and team members.

  14. Simulation-based training in echocardiography.

    Science.gov (United States)

    Biswas, Monodeep; Patel, Rajendrakumar; German, Charles; Kharod, Anant; Mohamed, Ahmed; Dod, Harvinder S; Kapoor, Poonam Malhotra; Nanda, Navin C

    2016-10-01

    The knowledge gained from echocardiography is paramount for the clinician in diagnosing, interpreting, and treating various forms of disease. While cardiologists traditionally have undergone training in this imaging modality during their fellowship, many other specialties are beginning to show interest as well, including intensive care, anesthesia, and primary care trainees, in both transesophageal and transthoracic echocardiography. Advances in technology have led to the development of simulation programs accessible to trainees to help gain proficiency in the nuances of obtaining quality images, in a low stress, pressure free environment, often with a functioning ultrasound probe and mannequin that can mimic many of the pathologies seen in living patients. Although there are various training simulation programs each with their own benefits and drawbacks, it is clear that these programs are a powerful tool in educating the trainee and likely will lead to improved patient outcomes. © 2016, Wiley Periodicals, Inc.

  15. Ocean Wave Simulation Based on Wind Field.

    Directory of Open Access Journals (Sweden)

    Zhongyi Li

    Full Text Available Ocean wave simulation has a wide range of applications in movies, video games and training systems. Wind force is the main energy source for generating ocean waves, which are the result of the interaction between wind and the ocean surface. While numerous methods for simulating oceans and other fluid phenomena have undergone rapid development during the past years in the field of computer graphics, few of them consider constructing the ocean surface height field from the perspective of wind forces driving ocean waves. We introduce wind force into the construction of the ocean surface height field by applying wind field data and wind-driven wave particles. Continual and realistic ocean waves result from the overlap of wind-driven wave particles, and a strategy is proposed to control these discrete wave particles and simulate an endless ocean surface. The results show that the new method is capable of producing a realistic ocean scene under the influence of wind fields at real-time rates.
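
    The wave-particle construction can be sketched as a superposition of small bumps advected at wind-given speeds, shown here in 1D for brevity (the paper works on a 2D surface). The Gaussian kernel and all parameters are illustrative assumptions, not the paper's actual scheme.

    ```python
    import numpy as np

    def height_field(x, particles, t):
        """Superpose wind-driven wave particles: each (x0, speed, amp, width)
        contributes a Gaussian bump advected at its wind-given speed."""
        h = np.zeros_like(x, dtype=float)
        for x0, speed, amp, width in particles:
            c = x0 + speed * t              # particle position at time t
            h += amp * np.exp(-((x - c) / width) ** 2)
        return h
    ```

    Because each particle is independent, the height field can be evaluated per frame at real-time rates, and wind field data enters simply through the particles' speeds and spawn positions.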

  16. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to obtain a reasonable estimate of the probability of such accidents and of the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the times when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System are analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
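
    The Monte Carlo idea can be reduced to a toy version: repeatedly sample the times at which vessels on two crossing routes pass the intersection and count near-coincident passages as collision candidates. The traffic volumes, the time window, and the function name are illustrative assumptions; the paper's model is a full time-domain micro-simulation with AIS-derived inputs.

    ```python
    import random

    def collision_candidates_per_day(n_days=2000, ships_a=20, ships_b=15,
                                     window_h=0.05, seed=1):
        """Count, per simulated day, route-A transits of a crossing that fall
        within +/- window_h hours of some route-B transit."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_days):
            times_b = [rng.uniform(0.0, 24.0) for _ in range(ships_b)]
            for _ in range(ships_a):
                ta = rng.uniform(0.0, 24.0)
                if any(abs(ta - tb) < window_h for tb in times_b):
                    hits += 1
        return hits / n_days
    ```

    Replacing the uniform passage times with empirical route, speed and departure-time distributions is what turns this toy into the kind of traffic simulation the paper builds.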

  17. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
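
    As an illustration of what such an acid-base titration simulation computes, the sketch below derives the pH curve for a strong acid titrated with a strong base from the charge balance. The concentrations, volumes and the helper name `titration_ph` are illustrative assumptions, not taken from the article.

    ```python
    import math

    def titration_ph(v_base_ml, c_acid=0.1, v_acid_ml=50.0, c_base=0.1,
                     kw=1.0e-14):
        """pH while titrating a strong acid with a strong base, from the
        charge balance [H+] - kw/[H+] = (moles acid - moles base)/V_total."""
        v_tot_l = (v_acid_ml + v_base_ml) / 1000.0
        excess = (c_acid * v_acid_ml - c_base * v_base_ml) / 1000.0 / v_tot_l
        # positive root of the quadratic h^2 - excess*h - kw = 0
        h = (excess + math.sqrt(excess ** 2 + 4.0 * kw)) / 2.0
        return -math.log10(h)
    ```

    Evaluating this over a range of base volumes reproduces the familiar S-shaped titration curve, including the sharp jump through pH 7 at the equivalence point, which is what a student explores in the simulated lab.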

  18. UAV Flight Control Based on RTX System Simulation Platform

    Directory of Open Access Journals (Sweden)

    Xiaojun Duan

    2014-03-01

    Full Text Available This paper proposes a UAV flight control system simulation platform based on RTX and Matlab, motivated by the respective advantages and disadvantages of Windows and the real-time system RTX. In the simulation platform, we configure the RTW toolbox and modify grt_main.c so that the platform supports online parameter adjustment and fault injection. We also develop the platform's interface with CVI, which makes it effective and gives it good application prospects. To improve real-time performance, current real-time simulation computers mostly use a real-time operating system to solve the simulation model, together with a dual framework consisting of a host and a target machine. Such a system is complex, costly, and generally used for control and semi-physical simulation of practical systems. Control system designers, however, expect to design control laws on a computer with a Windows-based environment and conduct real-time simulation there. This paper proposes a simulation platform for UAV flight control systems based on RTX and Matlab to meet this demand.

  19. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  20. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    Science.gov (United States)

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    Biological pattern formation exhibits a variety of striking phenomena. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanisms of pattern formation. However, selecting model parameters is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically, based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized and image features are extracted as the system feedback. The unknown model parameters are then obtained by comparing the image features of the simulated image with those of the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to pattern formation simulations for vascular mesenchymal cells and for lung development. Within the framework, the spot, stripe, and labyrinthine patterns of vascular mesenchymal cells, as well as the normal branching pattern and a branching pattern lacking side branching for lung development, are obtained in a finite number of iterations. The simulation results indicate that the simulation targets are easy to reach, especially when the simulated patterns are sensitive to the model parameters. Moreover, this simulation framework can be extended to other types of biological pattern formation.
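    The feedback principle in this abstract (simulate, extract an image feature, compare with the target, correct the parameter) can be illustrated with a deliberately tiny stand-in. The sinusoidal "pattern", the zero-crossing "feature" and the proportional correction gain below are all invented for illustration and are not the authors' reaction-diffusion models:

```python
import math

def simulate_pattern(p, n=50):
    """Toy stand-in for a pattern-formation run: parameter p sets stripe frequency."""
    return [math.sin(p * i / n) for i in range(n)]

def extract_feature(pattern):
    """Image-feature stand-in: count the zero crossings ('stripe' boundaries)."""
    return sum(1 for a, b in zip(pattern, pattern[1:]) if a * b < 0)

def tune_parameter(target_feature, p=1.0, gain=0.5, iters=100):
    """Feedback loop: simulate, measure the feature, compare with the target,
    and correct p proportionally to the error."""
    for _ in range(iters):
        err = target_feature - extract_feature(simulate_pattern(p))
        if err == 0:
            break
        p += gain * err
    return p
```

    The loop terminates once the simulated feature matches the target feature, mirroring the paper's "finite number of iterations".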

  1. CUDA-based real time surgery simulation.

    Science.gov (United States)

    Liu, Youquan; De, Suvranu

    2008-01-01

    In this paper we present a general software platform that enables real-time surgery simulation on the newly available compute unified device architecture (CUDA) from NVIDIA. CUDA-enabled GPUs harness the power of 128 processors to perform data-parallel computations. Compared to previous GPGPU approaches, CUDA is significantly more flexible, with a C language interface. We report implementations of both collision detection and the consequent deformation computation algorithms. Our test results indicate that CUDA enables a twentyfold speedup for collision detection and roughly a fifteenfold speedup for deformation computation on an Intel Core 2 Quad 2.66 GHz machine with a GeForce 8800 GTX.

  2. Haptic Feedback for the GPU-based Surgical Simulator

    DEFF Research Database (Denmark)

    Sørensen, Thomas Sangild; Mosegaard, Jesper

    2006-01-01

    The GPU has proven to be a powerful processor to compute spring-mass based surgical simulations. It has not previously been shown however, how to effectively implement haptic interaction with a simulation running entirely on the GPU. This paper describes a method to calculate haptic feedback...... with limited performance cost. It allows easy balancing of the GPU workload between calculations of simulation, visualisation, and the haptic feedback....

  3. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  4. Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes

    CERN Document Server

    Krainer, Alexander Michael

    2015-01-01

    This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate radionuclide production in high-energy fields. The decay of these nuclides, and therefore the resulting radiation field, can however only be simulated in the same geometry. This work provides a tool to simulate the decay of the produced nuclides in other geometries. With it, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $Cs^{137}$ and $Co^{60}$. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...
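    A minimal version of the two ingredients this abstract tests — uniform spatial sampling and drawing from an isotope mixture — might look as follows. This is a pure-Python sketch under assumed names; the spherical source volume is my choice, and a real implementation would feed a FLUKA user source routine instead of returning Python tuples:

```python
import random

def sample_point_in_sphere(radius, rng):
    """Rejection sampling: draw from the bounding cube until the point falls
    inside the sphere; accepted points are uniform over the volume."""
    while True:
        x, y, z = (rng.uniform(-radius, radius) for _ in range(3))
        if x * x + y * y + z * z <= radius * radius:
            return (x, y, z)

def sample_decay_sources(n, radius, mixture, seed=0):
    """Draw n (isotope, position) pairs for a uniformly mixed source volume,
    e.g. mixture={'Cs137': 0.5, 'Co60': 0.5} for the 50/50 benchmark."""
    rng = random.Random(seed)
    isotopes = list(mixture)
    weights = [mixture[iso] for iso in isotopes]
    return [(rng.choices(isotopes, weights=weights)[0],
             sample_point_in_sphere(radius, rng))
            for _ in range(n)]
```

    A histogram of the sampled coordinates, binned per axis, is the kind of check the abstract's "histogram test" refers to.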

  5. Cavitation-based hydro-fracturing simulator

    Science.gov (United States)

    Wang, Jy-An John; Wang, Hong; Ren, Fei; Cox, Thomas S.

    2016-11-22

    An apparatus 300 for simulating a pulsed pressure induced cavitation technique (PPCT) from a pressurized working fluid (F) provides laboratory research and development for enhanced geothermal systems (EGS), oil, and gas wells. A pump 304 is configured to deliver a pressurized working fluid (F) to a control valve 306, which produces a pulsed pressure wave in a test chamber 308. The pulsed pressure wave parameters are defined by the pump 304 pressure and control valve 306 cycle rate. When a working fluid (F) and a rock specimen 312 are included in the apparatus, the pulsed pressure wave causes cavitation to occur at the surface of the specimen 312, thus initiating an extensive network of fracturing surfaces and micro fissures, which are examined by researchers.

  6. Immersive Simulation in Constructivist-Based Classroom E-Learning

    Science.gov (United States)

    McHaney, Roger; Reiter, Lauren; Reychav, Iris

    2018-01-01

    This article describes the development of a simulation-based online course combining sound pedagogy, educational technology, and real world expertise to provide university students with an immersive experience in storage management systems. The course developed in this example does more than use a simulation, the entire course is delivered using a…

  7. Simulating individual-based models of epidemics in hierarchical networks

    NARCIS (Netherlands)

    Quax, R.; Bader, D.A.; Sloot, P.M.A.

    2009-01-01

    Current mathematical modeling methods for the spreading of infectious diseases are too simplified and do not scale well. We present the Simulator of Epidemic Evolution in Complex Networks (SEECN), an efficient simulator of detailed individual-based models by parameterizing separate dynamics
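    As a flavour of what "detailed individual-based" means here, a minimal SIR model on an explicit contact network can be written directly. This toy is my own sketch, not SEECN, which additionally parameterizes the separate dynamics and the network structure:

```python
import random

def simulate_sir(adjacency, seeds, p_infect=0.3, p_recover=0.1,
                 steps=100, seed=1):
    """Individual-based SIR: every node is S, I, or R. Each step, every
    infected node infects each susceptible neighbour with probability
    p_infect and recovers with probability p_recover."""
    rng = random.Random(seed)
    state = {node: "S" for node in adjacency}
    for s in seeds:
        state[s] = "I"
    for _ in range(steps):
        infections, recoveries = [], []
        for node, st in state.items():
            if st == "I":
                for nb in adjacency[node]:
                    if state[nb] == "S" and rng.random() < p_infect:
                        infections.append(nb)
                if rng.random() < p_recover:
                    recoveries.append(node)
        for n in infections:   # apply updates synchronously
            state[n] = "I"
        for n in recoveries:
            state[n] = "R"
    return state
```

    On a 20-node ring with certain transmission and no recovery, the infection reaches every node within one traversal of the ring.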

  8. Simulation-based modeling of building complexes construction management

    Science.gov (United States)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  9. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  10. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  11. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and
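    The authors' point is that these gates can be reproduced by a deterministic, event-based network of learning machines. For reference, the conventional state-vector simulation of the same two gates — the baseline such an approach is compared against — takes only a few lines; this is an illustrative sketch of the standard formalism, not their event-based method:

```python
import math

def apply_hadamard(state, target):
    """Apply the Hadamard gate to qubit `target` of a state vector."""
    h = 1.0 / math.sqrt(2.0)
    out = [0.0] * len(state)
    for i, amp in enumerate(state):
        partner = i ^ (1 << target)
        if (i >> target) & 1 == 0:
            out[i] += h * amp          # |0> -> (|0> + |1>)/sqrt(2)
            out[partner] += h * amp
        else:
            out[partner] += h * amp    # |1> -> (|0> - |1>)/sqrt(2)
            out[i] -= h * amp
    return out

def apply_cnot(state, control, target):
    """Flip qubit `target` in basis states where qubit `control` is 1."""
    out = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            out[i] = state[i ^ (1 << target)]
    return out

# Bell state: H on qubit 0 of |00>, then CNOT(control=0, target=1)
bell = apply_cnot(apply_hadamard([1.0, 0.0, 0.0, 0.0], 0), 0, 1)
```

    The resulting amplitudes, 1/sqrt(2) on |00> and |11>, are exactly what any correct simulation — event-based or not — must reproduce.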

  12. Web-based Interactive Simulator for Rotating Machinery.

    Science.gov (United States)

    Sirohi, Vijayalaxmi

    1999-01-01

    Baroma (Balance of Rotating Machinery), the Web-based educational engineering interactive software for teaching/learning, combines didactic and software-ergonomic approaches. The software, in tutorial form, simulates a problem using Visual Interactive Simulation in a graphic display, and animation is brought about through a graphical user interface…

  13. Conceptual modeling for simulation-based serious gaming

    NARCIS (Netherlands)

    van der Zee, D.J.; Holkenborg, Bart; Robinson, Stewart

    2012-01-01

    In recent years many simulation-based serious games have been developed for supporting (future) managers in operations management decision making. They illustrate the high potential of using discrete event simulation for pedagogical purposes. Unfortunately, this potential does not seem to go

  14. Airway management in a bronchoscopic simulator based setting

    DEFF Research Database (Denmark)

    Graeser, Karin; Konge, Lars; Kristensen, Michael S

    2014-01-01

    BACKGROUND: Several simulation-based possibilities for training flexible optical intubation have been developed, ranging from non-anatomical phantoms to high-fidelity virtual reality simulators. These teaching devices might also be used to assess the competence of trainees before allowing them...

  15. Simulation-Based Medical Education: An Ethical Imperative.

    Science.gov (United States)

    Ziv, Amitai; Wolpe, Paul Root; Small, Stephen D.; Glick, Shimon

    2003-01-01

    Describes simulation-based learning in medical education and presents four themes that make up a framework for simulations: (1) best standards of care and training; (2) error management and patient safety; (3) patient autonomy; and (4) social justice and resource allocation. (SLD)

  16. SU-F-T-184: 3D Range-Modulator for Scanned Particle Therapy: Development, Monte Carlo Simulations and Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Simeonov, Y; Penchev, P; Ringbaek, T Printz [University of Applied Sciences, Institute of Medical Physics and Radiation Protection, Giessen (Germany); Brons, S [Heidelberg Ion-Beam Therapy Center (HIT), Heidelberg (Germany); Weber, U [GSI Helmholtzzentrum fuer Schwerionenforschung GmbH, Darmstadt (Germany); Zink, K [University of Applied Sciences, Institute of Medical Physics and Radiation Protection, Giessen (Germany); University Hospital Giessen-Marburg, Marburg (Germany)

    2016-06-15

    Purpose: Active raster scanning in particle therapy results in highly conformal dose distributions. Treatment time, however, is relatively long due to the large number of different iso-energy layers used. By using only one energy and the so-called 3D range-modulator, irradiation times of only a few seconds can be achieved, thus making the delivery of a homogeneous dose to moving targets (e.g. lung cancer) more reliable. Methods: A 3D range-modulator consisting of many pins with a base area of 2.25 mm2 and different lengths was developed and manufactured with a rapid prototyping technique. The form of the 3D range-modulator was optimised for a spherical target volume of 5 cm diameter placed at 25 cm in a water phantom. Monte Carlo simulations using the FLUKA package were carried out to evaluate the modulating effect of the 3D range-modulator and to simulate the resulting dose distribution. The fine and complicated contour form of the 3D range-modulator was taken into account by a specially programmed user routine. Additionally, FLUKA was extended with the capability of intensity-modulated scanning. To verify the simulation results, dose measurements were carried out at the Heidelberg Ion Therapy Center (HIT) with a 400.41 MeV 12C beam. Results: The high-resolution measurements show that the 3D range-modulator is capable of producing homogeneous 3D conformal dose distributions while significantly reducing irradiation time. The measured dose is in very good agreement with the previously conducted FLUKA simulations; slight differences were traced back to minor manufacturing deviations from the perfectly optimised form. Conclusion: Combined with the advantage of very short treatment times, the 3D range-modulator could be an alternative for treating small to medium-sized tumours (e.g. lung metastases) with the same conformity as full raster-scanning treatment.
Further simulations and measurements of more complex cases will be conducted to investigate the full potential of the 3D
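    The principle of the 3D range-modulator — pins of different lengths shift the residual range of a single-energy beam, and the mixture of shifts spreads the dose over the target depth — can be caricatured numerically. The Gaussian "Bragg peak", the water-equivalent shift per millimetre of pin, and all numbers below are invented for illustration:

```python
import math

def bragg_peak(depth, peak_depth, width=0.6):
    """Toy depth-dose curve: a Gaussian near the end of range (the real Bragg
    curve has an entrance plateau and a sharper distal edge)."""
    return math.exp(-((depth - peak_depth) ** 2) / (2.0 * width ** 2))

def modulated_dose(depth, pins, full_range=25.0, shift_per_mm=0.1):
    """Sum the weighted, range-shifted peaks produced by a pin-length mixture.
    `pins` is a list of (pin_length_mm, weight) pairs; longer pins shorten
    the residual range."""
    return sum(weight * bragg_peak(depth, full_range - length * shift_per_mm)
               for length, weight in pins)

# Eleven pin lengths from 0 to 50 mm with equal weights: the shifted peaks
# overlap into an approximately flat plateau between 20 and 25 cm depth.
pins = [(5.0 * i, 1.0 / 11.0) for i in range(11)]
```

    Optimising the pin-length (weight) distribution against the target contour is, in caricature, what shaping the real modulator for the 5 cm sphere amounts to.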

  17. A Monte Carlo transport code study of the space radiation environment using FLUKA and ROOT

    CERN Document Server

    Wilson, T; Carminati, F; Brun, R; Ferrari, A; Sala, P; Empl, A; MacGibbon, J

    2001-01-01

    We report on the progress of a current study aimed at developing a state-of-the-art Monte-Carlo computer simulation of the space radiation environment using advanced computer software techniques recently available at CERN, the European Laboratory for Particle Physics in Geneva, Switzerland. By taking the next-generation computer software appearing at CERN and adapting it to known problems in the implementation of space exploration strategies, this research is identifying changes necessary to bring these two advanced technologies together. The radiation transport tool being developed is tailored to the problem of taking measured space radiation fluxes impinging on the geometry of any particular spacecraft or planetary habitat and simulating the evolution of that flux through an accurate model of the spacecraft material. The simulation uses the latest known results in low-energy and high-energy physics. The output is a prediction of the detailed nature of the radiation environment experienced in space as well a...

  18. Design-Based Comparison of Spine Surgery Simulators: Optimizing Educational Features of Surgical Simulators.

    Science.gov (United States)

    Ryu, Won Hyung A; Mostafa, Ahmed E; Dharampal, Navjit; Sharlin, Ehud; Kopp, Gail; Jacobs, W Bradley; Hurlbert, R John; Chan, Sonny; Sutherland, Garnette R

    2017-10-01

    Simulation-based education has made its entry into surgical residency training, particularly as an adjunct to hands-on clinical experience. However, one of the ongoing challenges to wide adoption is the capacity of simulators to incorporate the educational features required for effective learning. The aim of this study was to identify strengths and limitations of spine simulators in order to characterize design elements that are essential to enhancing resident education. We performed a mixed qualitative and quantitative cohort study with a focused survey and interviews of stakeholders in spine surgery pertaining to their experiences on 3 spine simulators. Ten participants were recruited, spanning all levels of training and expertise, until qualitative analysis reached saturation of themes. Participants were asked to perform lumbar pedicle screw insertion on 3 simulators. Afterward, a 10-item survey was administered and a focused interview was conducted to explore topics pertaining to the design features of the simulators. Overall impressions of the simulators were positive with regard to their educational benefit, but our qualitative analysis revealed differing strengths and limitations. The main design strengths of the computer-based simulators were the incorporation of procedural guidance and the provision of performance feedback. The synthetic model excelled in achieving more realistic haptic feedback and in incorporating the use of actual surgical tools. Stakeholders from trainees to experts acknowledge the growing role of simulation-based education in spine surgery. However, different simulation modalities have varying design elements that augment learning in distinct ways. Characterization of these design elements will allow for the standardization of simulation curricula in spinal surgery, optimizing educational benefit. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Computer-Based Simulation Games in Public Administration Education

    Directory of Open Access Journals (Sweden)

    Kutergina Evgeniia

    2017-12-01

    Full Text Available Computer simulation, an active learning technique, is now one of the advanced pedagogical technologies. The use of simulation games in the educational process allows students to gain a firsthand understanding of the processes of real life. Public-administration, public-policy and political-science courses increasingly adopt simulation games in universities worldwide. Besides person-to-person simulation games, there are computer-based simulations in public-administration education. Currently in Russia the use of computer-based simulation games in Master of Public Administration (MPA) curricula is quite limited. This paper focuses on computer-based simulation games for students of MPA programmes. Our aim was to analyze the outcomes of implementing such games in MPA curricula. We have done so by (1) developing three computer-based simulation games about allocating public finances, (2) testing the games in the learning process, and (3) conducting a post-test examination to evaluate the effect of simulation games on students’ knowledge of municipal finances. This study was conducted in the National Research University Higher School of Economics (HSE) and in the Russian Presidential Academy of National Economy and Public Administration (RANEPA) during the period of September to December 2015, in Saint Petersburg, Russia. Two groups of students were randomly selected in each university and then randomly allocated either to the experimental or the control group. In the control groups (n=12 in HSE, n=13 in RANEPA) students had traditional lectures. In the experimental groups (n=12 in HSE, n=13 in RANEPA) students played three simulation games in addition to traditional lectures. This exploratory research shows that the use of computer-based simulation games in MPA curricula can improve students’ outcomes by 38 %. In general, the experimental groups had better performances on the post-test examination (Figure 2). Students in the HSE experimental group had 27.5 % better

  20. A new three-tier architecture design for multi-sphere neutron spectrometer with the FLUKA code

    Science.gov (United States)

    Huang, Hong; Yang, Jian-Bo; Tuo, Xian-Guo; Liu, Zhi; Wang, Qi-Biao; Wang, Xu

    2016-07-01

    The commercially available Bonner sphere neutron spectrometer (BSS) has high sensitivity to neutrons below 20 MeV, which leaves it poorly placed to measure neutrons ranging from a few MeV to 100 MeV. In this paper, moderator layers and an auxiliary material layer were added around the 3He proportional counters with the FLUKA code, with a view to improving this response. The results showed that the response peaks for neutrons below 20 MeV gradually shift to a higher energy region and decrease slightly with increasing moderator thickness. By contrast, the response to neutrons above 20 MeV always remained very low until auxiliary materials such as copper (Cu), lead (Pb) or tungsten (W) were embedded into the moderator layers. Pb was chosen as the most suitable auxiliary material for designing a three-tier architecture multi-sphere neutron spectrometer (NBSS). Calculations and comparisons showed that the NBSS is advantageous in terms of response for 5-100 MeV, with the highest response being 35.2 times that of a polyethylene (PE) ball of the same PE thickness.

  1. The FLUKA Monte Carlo code coupled with the local effect model for biological calculations in carbon ion therapy

    CERN Document Server

    Mairani, A; Kraemer, M; Sommerer, F; Parodi, K; Scholz, M; Cerutti, F; Ferrari, A; Fasso, A

    2010-01-01

    Clinical Monte Carlo (MC) calculations for carbon ion therapy have to provide absorbed and RBE-weighted dose. The latter is defined as the product of the dose and the relative biological effectiveness (RBE). At the GSI Helmholtzzentrum fur Schwerionenforschung as well as at the Heidelberg Ion Therapy Center (HIT), the RBE values are calculated according to the local effect model (LEM). In this paper, we describe the approach followed for coupling the FLUKA MC code with the LEM and its application to dose and RBE-weighted dose calculations for a superimposition of two opposed C-12 ion fields as applied in therapeutic irradiations. The obtained results are compared with the available experimental data of CHO (Chinese hamster ovary) cell survival and the outcomes of the GSI analytical treatment planning code TRiP98. Some discrepancies have been observed between the analytical and MC calculations of absorbed physical dose profiles, which can be explained by the differences between the laterally integrated depth-d...
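    The definition restated at the start of this abstract — RBE-weighted dose as the product of absorbed dose and RBE — and the superimposition of two opposed fields reduce to elementary voxel-wise operations. The sketch below is schematic only: real RBE values come from the LEM and depend on the mixed radiation field at each depth, so they are not obtained this naively:

```python
def superimpose_opposed_fields(depth_dose):
    """Two opposed beams: the second field traverses the same axis in reverse,
    so the combined profile is the sum of the profile and its mirror."""
    return [a + b for a, b in zip(depth_dose, reversed(depth_dose))]

def rbe_weighted_dose(dose, rbe):
    """RBE-weighted dose is defined as absorbed dose times RBE, voxel by voxel."""
    return [d * r for d, r in zip(dose, rbe)]
```

    With a monotonically rising depth-dose curve, the opposed superimposition flattens the physical dose, which is why the two-field geometry is a convenient benchmark case.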

  2. A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning

    Science.gov (United States)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

    Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, so that they form a cohesive and mutually supportive system that meets users' operational needs. Based on a requirement analysis, we propose a framework that combines GIS and agent-based simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, the agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand availability, shared understanding, and boosted performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  3. Knowledge-based simulation using object-oriented programming

    Science.gov (United States)

    Sidoran, Karen M.

    1993-01-01

    Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
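    The object-oriented view described above — factual knowledge as attributes of an entity, behavioral knowledge as methods that say how the entity behaves in certain circumstances — can be sketched around a minimal discrete-event loop. The class, event names, and numbers are invented for illustration; this is not RASE itself:

```python
import heapq
import itertools

class Entity:
    """A discrete-event simulation entity: factual knowledge lives in
    attributes, behavioral knowledge in methods."""
    def __init__(self, name, speed):
        self.name = name      # factual knowledge (attributes)
        self.speed = speed

    def handle(self, event, schedule, now):
        # behavioral knowledge: react to an event, schedule follow-ups
        if event == "move":
            schedule(now + 10.0 / self.speed, self, "arrive")
            return f"{self.name} moving"
        return f"{self.name} arrived"

def run(initial_events):
    """Minimal event loop: pop the earliest event, let its entity handle it,
    and let handlers schedule further events."""
    tie_breaker = itertools.count()   # keeps heap entries comparable
    queue = []

    def schedule(time, entity, event):
        heapq.heappush(queue, (time, next(tie_breaker), entity, event))

    for time, entity, event in initial_events:
        schedule(time, entity, event)
    log = []
    while queue:
        time, _, entity, event = heapq.heappop(queue)
        log.append((time, entity.handle(event, schedule, time)))
    return log
```

    Subclassing `Entity` and overriding `handle` is the object-oriented mechanism by which deeper behavioral knowledge is factored into such a simulation.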

  4. Implementing effective simulation-based education to improve ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Implementing effective simulation-based education to improve maternal ... by IDRC, including the contributions IDRC is making towards Canada's maternal child ... OECD's Development Co-Operation Report highlights critical role of data to ...

  5. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...

  6. Use of agent based simulation for traffic safety assessment

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2008-07-01

    Full Text Available This paper describes the development of an agent based Computational Building Simulation (CBS) tool, termed KRONOS that is being used to work on advanced research questions such as traffic safety assessment and user behaviour in buildings...

  7. Rapid Development of Scenario-Based Simulations and Tutoring Systems

    National Research Council Canada - National Science Library

    Mohammed, John L; Sorensen, Barbara; Ong, James C; Li, Jian

    2005-01-01

    .... Scenario-based training, in which trainees practice handling specific situations using faithful simulations of the equipment they will use on the job has proven to be an extremely effective method...

  8. CFOA-Based Lossless and Lossy Inductance Simulators

    Directory of Open Access Journals (Sweden)

    F. Kaçar

    2011-09-01

    Full Text Available An inductance simulator is a useful component in circuit synthesis theory, especially for analog signal processing applications such as filters, chaotic oscillator design, analog phase shifters and the cancellation of parasitic elements. In this study, four new inductance simulator topologies employing a single current feedback operational amplifier are presented. The presented topologies require few passive components. The first topology is intended for negative inductance simulation, the second for lossy series inductance, the third for lossy parallel inductance and the fourth for negative parallel (-R)(-L)(-C) simulation. The performance of the proposed CFOA-based inductance simulators is demonstrated on both a second-order low-pass filter and an inductance cancellation circuit. PSPICE simulations are given to verify the theoretical analysis.

  9. Development of training simulator based on critical assemblies test bench

    International Nuclear Information System (INIS)

    Narozhnyi, A.T.; Vorontsov, S.V.; Golubeva, O.A.; Dyudyaev, A.M.; Il'in, V.I.; Kuvshinov, M.I.; Panin, A.V.; Peshekhonov, D.P.

    2007-01-01

    When preparing a critical mass experiment, multiplying system (MS) parts are assembled manually. This work involves maximum professional risk to personnel. Training personnel and maintaining the skills of working experts is an important factor in nuclear safety. For this purpose the authors developed a training simulator based on a functioning critical assemblies test bench (CATB), allowing simulation of MS assembly using training mockups made of inert materials. The control program traces the current status of the MS under simulation, and changes in the assembly's neutron-physical parameters are mapped onto the readings of the regular instruments. The simulator's information support is provided by a computer database of the physical characteristics of typical MS components. Working in the training mode ensures complete simulation of real MS assembly on the critical test bench. It makes it possible to practice the procedures related to CATB operation in standard mode safely and effectively, and to simulate possible abnormal situations. (author)

  10. Virtual rounds: simulation-based education in procedural medicine

    Science.gov (United States)

    Shaffer, David W.; Meglan, Dwight A.; Ferrell, Margaret; Dawson, Steven L.

    1999-07-01

    Computer-based simulation is a goal for training physicians in specialties where traditional training puts patients at risk. Intuitively, interactive simulation of anatomy, pathology, and therapeutic actions should lead to shortening of the learning curve for novice or inexperienced physicians. Effective transfer of knowledge acquired in simulators must be shown for such devices to be widely accepted in the medical community. We have developed an Interventional Cardiology Training Simulator which incorporates real-time graphic interactivity coupled with haptic response, and an embedded curriculum permitting rehearsal, hypertext links, personal archiving and instructor review and testing capabilities. This linking of purely technical simulation with educational content creates a more robust educational purpose for procedural simulators.

  11. Simulator for beam-based LHC collimator alignment

    Science.gov (United States)

    Valentino, Gianluca; Aßmann, Ralph; Redaelli, Stefano; Sammut, Nicholas

    2014-02-01

    In the CERN Large Hadron Collider, collimators need to be set up to form a multistage hierarchy to ensure efficient multiturn cleaning of halo particles. Automatic algorithms were introduced during the first run to reduce the beam time required for beam-based setup, improve the alignment accuracy, and reduce the risk of human errors. Simulating the alignment procedure would allow for off-line tests of alignment policies and algorithms. A simulator was developed based on a diffusion beam model to generate the characteristic beam loss signal spike and decay produced when a collimator jaw touches the beam, which is observed in a beam loss monitor (BLM). Empirical models derived from the available measurement data are used to simulate the steady-state beam loss and crosstalk between multiple BLMs. The simulator design is presented, together with simulation results and comparison to measurement data.
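    The empirical loss model such a simulator needs — a steady-state BLM signal plus a characteristic spike-and-decay when a jaw touches the halo — can be parameterized very simply. The exponential form and every constant below are assumptions for illustration, not the fitted models used at CERN:

```python
import math

def blm_signal(times, t_touch, spike, tau, steady):
    """Toy empirical BLM model: steady-state loss everywhere, plus an
    exponentially decaying spike that starts when the collimator jaw
    touches the beam halo at t_touch."""
    return [steady + (spike * math.exp(-(t - t_touch) / tau)
                      if t >= t_touch else 0.0)
            for t in times]
```

    An alignment algorithm run against this model would step the jaw inward, declare "touch" when the signal crosses a threshold above `steady`, and wait for the decay before the next step — which is what makes an off-line simulator useful for testing alignment policies.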

  12. A MARTe based simulator for the JET Vertical Stabilization system

    Energy Technology Data Exchange (ETDEWEB)

    Bellizio, Teresa, E-mail: teresa.bellizio@unina.it [Associazione EURATOM-ENEA-CREATE, University di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy); De Tommasi, Gianmaria; Risoli, Nicola; Albanese, Raffaele [Associazione EURATOM-ENEA-CREATE, University di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy); Neto, Andre [Associacao EURATOM/IST, Inst. de Plasmas e Fusao Nuclear - Laboratorio Associado, Instituto Superior, Tecnico, P-1049-001 Lisboa (Portugal)

    2011-10-15

Validation by means of simulation is a crucial step when developing real-time control systems. Modeling and simulation are essential tools from the early design phase, when the control algorithms are designed and tested. This phase is commonly carried out in off-line environments such as Matlab and Simulink. A MARTe-based simulator has recently been developed to validate the new JET Vertical Stabilization (VS) system. MARTe is the multi-thread framework used at JET to deploy hard real-time control systems. This paper presents the software architecture of the MARTe-based simulator and shows how this tool has been effectively used to evaluate the effects of Edge Localized Modes (ELMs) on the VS system. By using the simulator it is possible to analyze different plasma configurations, extrapolating the limit of the new vertical amplifier in terms of the energy of the largest rejectable ELM.

  13. Improving the performance of a filling line based on simulation

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.

    2016-08-01

The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
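
The NPV figures mentioned above come from a standard discounted cash flow computation. A generic sketch, with purely illustrative numbers that are not the study's data:

```python
def npv(rate, cash_flows):
    """Net present value of a series of yearly cash flows, where
    cash_flows[0] is the initial (usually negative) investment."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Illustrative scenario: 100 k investment, 30 k yearly savings for 5 years,
# discounted at 8%. A positive NPV argues for the improvement scenario.
flows = [-100_000] + [30_000] * 5
value = npv(0.08, flows)
```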

  14. Agent-based simulation in entrepreneurship research

    NARCIS (Netherlands)

    Yang, S.-J.S.; Chandra, Y.

    2009-01-01

    Agent-based modeling (ABM) has wide applications in natural and social sciences yet it has not been widely applied in entrepreneurship research. We discuss the nature of ABM, its position among conventional methodologies and then offer a roadmap for developing, testing and extending theories of

  15. MO-FG-CAMPUS-TeP3-02: Benchmarks of a Proton Relative Biological Effectiveness (RBE) Model for DNA Double Strand Break (DSB) Induction in the FLUKA, MCNP, TOPAS, and RayStation™ Treatment Planning System

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, R [University of Washington, Seattle, WA (United States); Streitmatter, S [University of Utah Hospitals, Salt Lake City, UT (United States); Traneus, E [RAYSEARCH LABORATORIES AB, Stockholm (Sweden); Moskvin, V [St. Jude Children’s Hospital, Memphis, TN (United States); Schuemann, J [Massachusetts General Hospital, Boston, MA (United States)

    2016-06-15

Purpose: Validate the implementation of a published RBE model for DSB induction (RBEDSB) in several general purpose Monte Carlo (MC) code systems and the RayStation™ treatment planning system (TPS). For protons and other light ions, DSB induction is a critical initiating molecular event that correlates well with the RBE for cell survival. Methods: An efficient algorithm to incorporate information on proton and light ion RBEDSB from the independently tested Monte Carlo Damage Simulation (MCDS) has now been integrated into MCNP (Stewart et al. PMB 60, 8249–8274, 2015), FLUKA, TOPAS and a research build of the RayStation™ TPS. To cross-validate the RBEDSB model implementation, LET distributions, depth-dose and lateral (dose and RBEDSB) profiles for monodirectional monoenergetic (100 to 200 MeV) protons incident on a water phantom are compared. The effects of recoil and secondary ion production (²H⁺, ³H⁺, ³He²⁺, ⁴He²⁺), spot size (3 and 10 mm), and transport physics on beam profiles and RBEDSB are examined. Results: Depth-dose and RBEDSB profiles among all of the MC models are in excellent agreement using a 1 mm distance criterion (width of a voxel). For a 100 MeV proton beam (10 mm spot), RBEDSB = 1.2 ± 0.03 (~2–3%) at the tip of the Bragg peak and increases to 1.59 ± 0.3 two mm distal to the Bragg peak. RBEDSB tends to decrease as the kinetic energy of the incident proton increases. Conclusion: The model for proton RBEDSB has been accurately implemented into FLUKA, MCNP, TOPAS and the RayStation™ TPS. The transport of secondary light ions (Z > 1) has a significant impact on RBEDSB, especially distal to the Bragg peak, although light ions have a small effect on (dose × RBEDSB) profiles. The ability to incorporate spatial variations in proton RBE within a TPS creates new opportunities to individualize treatment plans and increase the therapeutic ratio. Dr. Erik Traneus is employed full-time as a Research Scientist

  16. Simulation-based learning: Just like the real thing

    Directory of Open Access Journals (Sweden)

    Lateef Fatimah

    2010-01-01

Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals' knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice until one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors.

  17. Simulation-based learning: Just like the real thing.

    Science.gov (United States)

    Lateef, Fatimah

    2010-10-01

Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often "immersive" in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals' knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice until one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to the traditional didactic instruction, enhance performance, and possibly also help reduce errors.

  18. Simulations of muon-induced neutron flux at large depths underground

    International Nuclear Information System (INIS)

    Kudryavtsev, V.A.; Spooner, N.J.C.; McMillan, J.E.

    2003-01-01

    The production of neutrons by cosmic-ray muons at large depths underground is discussed. The most recent versions of the muon propagation code MUSIC, and particle transport code FLUKA are used to evaluate muon and neutron fluxes. The results of simulations are compared with experimental data

  19. Cyber-Based Turbulent Combustion Simulation

    Science.gov (United States)

    2012-02-28

in flow-field structures between the laminar and turbulent counter-flowing fuel injection is clearly illustrated in figure 1. As a consequence, it...flame thickness by comparing with the benchmark of AFRL/RZ (UNICORN), suppressing the oscillatory numerical behavior. These improvements in numerical...fraction with the benchmark results of AFRL/RZ. This validating base is generated by the UNICORN program on the finest mesh available and the local

  20. Colour based sorting station with Matlab simulation

    Directory of Open Access Journals (Sweden)

    Constantin Victor

    2017-01-01

The paper presents the design process and manufacturing elements of a colour-based sorting station. The system comprises a gravitational storage unit, which also contains the colour sensor. Parts are extracted using a linear pneumatic motor and are fed onto an electrically driven conveyor belt. Extraction of the parts is done at four points, using two pneumatic motors and a geared DC motor, while the fourth position is at the end of the belt. The mechanical parts of the system are manufactured using 3D printing technology, allowing for easy modification and adaptation to the geometry of different parts. The paper shows all of the stages needed to design, optimize, test and implement the proposed solution. System optimization was performed using a graphical Matlab interface which also allows for sorting algorithm optimization.

  1. Module-based Simulation System for efficient development of nuclear simulation programs

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Wakabayashi, Jiro

    1990-01-01

Module-based Simulation System (MSS) has been developed to realize a new software environment enabling flexible, versatile dynamic simulation of complex nuclear power plant systems. Described in the paper are (i) the fundamental methods utilized in MSS and its software systemization, (ii) the development of a human interface system to help users generate integrated simulation programs automatically, and (iii) the development of an intelligent user support system assisting users in two phases: automatic semantic diagnosis of, and consultation on automatic input data setup for, the MSS-generated programs. (author)

  2. Tsunami Early Warning via a Physics-Based Simulation Pipeline

    Science.gov (United States)

    Wilson, J. M.; Rundle, J. B.; Donnellan, A.; Ward, S. N.; Komjathy, A.

    2017-12-01

Through independent efforts, physics-based simulations of earthquakes, tsunamis, and the atmospheric signatures of these phenomena have been developed. With the goal of producing tsunami forecasts and early warning tools for at-risk regions, we join these three spheres to create a simulation pipeline. The Virtual Quake simulator can produce thousands of years of synthetic seismicity on large, complex fault geometries, as well as the expected surface displacement in tsunamigenic regions. These displacements are used as initial conditions for tsunami simulators, such as Tsunami Squares, to produce catalogs of potential tsunami scenarios with probabilities. Finally, these tsunami scenarios can act as input for simulations of the associated ionospheric total electron content, signals which can be detected by GNSS satellites for purposes of early warning in the event of a real tsunami. We present the most recent developments in this project.

  3. Laguna Verde simulator: A new TRAC-RT based application

    International Nuclear Information System (INIS)

    Munoz Cases, J.J.; Tanarro Onrubia, A.

    2006-01-01

In partnership with GSE Systems, TECNATOM is developing a full scope training simulator for Laguna Verde Unit 2 (LV2). The simulator design is based upon current state-of-the-art technology regarding the simulation platform, instructor station, visualization tools, advanced thermalhydraulic and neutronic models, I/O systems and automated model building technology. When completed, the LV2 simulator will achieve a remarkable level of modeling fidelity by using TECNATOM's TRAC-RT advanced thermalhydraulic code for the reactor coolant and main steam systems, and the NEMO neutronic model for the reactor core calculations. These models have been utilized to date for the development or upgrading of nine NPP simulators in Spain and abroad, with more than 8000 hours of training sessions, and have developed an excellent reputation for robustness and high fidelity. (author)

  4. SIDH: A Game-Based Architecture for a Training Simulator

    Directory of Open Access Journals (Sweden)

    P. Backlund

    2009-01-01

Game-based simulators, sometimes referred to as "lightweight" simulators, have benefits such as flexible technology and economic feasibility. In this article, we extend the notion of a game-based simulator by introducing multiple screen views and physical interaction. These features are expected to enhance immersion and fidelity. By utilizing these concepts we have constructed a training simulator for breathing apparatus entry. Game hardware and software have been used to produce the application. More importantly, the application itself is deliberately designed to be a game. Indeed, one important design goal is to create an entertaining and motivating experience combined with learning goals in order to create a serious game. The system has been evaluated in cooperation with the Swedish Rescue Services Agency to see which architectural features contribute to perceived fidelity. The modes of visualization and interaction as well as the level design contribute to the usefulness of the system.

  5. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

The Monte Carlo (MC) method is a popular simulation of photon propagation in turbid media, but its main drawback is cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple scattering steps into a single-step process through random table querying, thus greatly reducing the computing complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast alternative to the conventional MC simulation of photon propagation. It retains the flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
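
The single-step table idea can be illustrated on the simplest MC sampling task, drawing photon free path lengths from the exponential attenuation law. This sketch (the attenuation coefficient and table size are arbitrary assumptions, not the paper's values) contrasts direct CDF inversion with a precomputed lookup:

```python
import math
import random

MU_T = 10.0  # total interaction coefficient (1/cm), illustrative value

def step_direct(rng):
    """Conventional MC: sample a free path length by inverting the
    exponential attenuation CDF at every step."""
    return -math.log(1.0 - rng.random()) / MU_T

# Table-based variant: precompute step lengths on a uniform grid of the
# random variable, then sample by indexing, avoiding the log each step.
N = 4096
TABLE = [-math.log(1.0 - (i + 0.5) / N) / MU_T for i in range(N)]

def step_table(rng):
    return TABLE[int(rng.random() * N)]

rng = random.Random(0)
mean_direct = sum(step_direct(rng) for _ in range(100_000)) / 100_000
mean_table = sum(step_table(rng) for _ in range(100_000)) / 100_000
# Both means should approximate the mean free path 1/MU_T = 0.1 cm.
```

The table trades a small discretization error for removing the per-step transcendental call, which is the flavour of speed-up that TBRS generalizes from single steps to whole multi-step scattering sequences.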

  6. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator.

    Science.gov (United States)

    Wang, Runchun M; Thakur, Chetan S; van Schaik, André

    2018-01-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogously to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks needs prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by programming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof-of-concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky-integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks.
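
The paper implements LIF neurons in FPGA hardware; as a point of reference, a minimal software version of the leaky-integrate-and-fire update can be sketched as follows (all constants here are illustrative assumptions, not the paper's parameters):

```python
def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward rest while integrating input; a spike is emitted when it
    crosses threshold, after which the potential is reset."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        v += dt / tau * (v_rest - v) + i_in  # leak plus input increment
        if v >= v_thresh:
            spikes.append(step)
            v = v_reset
    return spikes

# A constant suprathreshold drive makes the neuron fire periodically.
spikes = simulate_lif([0.06] * 100)
```

Scaling this update to billions of neurons is precisely where the structural minicolumn/hypercolumn abstraction described above pays off, since per-neuron state and connectivity must fit in on-chip memory.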

  7. A review of computer-based simulators for ultrasound training.

    Science.gov (United States)

    Blum, Tobias; Rieger, Andreas; Navab, Nassir; Friess, Helmut; Martignoni, Marc

    2013-04-01

    Computer-based simulators for ultrasound training are a topic of recent interest. During the last 15 years, many different systems and methods have been proposed. This article provides an overview and classification of systems in this domain and a discussion of their advantages. Systems are classified and discussed according to the image simulation method, user interactions and medical applications. Computer simulation of ultrasound has one key advantage over traditional training. It enables novel training concepts, for example, through advanced visualization, case databases, and automatically generated feedback. Qualitative evaluations have mainly shown positive learning effects. However, few quantitative evaluations have been performed and long-term effects have to be examined.

  8. Modeling and simulation for micro DC motor based on simulink

    Science.gov (United States)

    Shen, Hanxin; Lei, Qiao; Chen, Wenxiang

    2017-09-01

The micro DC motor has a large market demand but there is a lack of theoretical research on it. Through detailed analysis of the commutation process of the micro DC motor commutator, and based on the micro DC motor electromagnetic torque equation and mechanical torque equation, a triangle-connection micro DC motor simulation model is established with the help of the Simulink toolkit. Using the model, a sample micro DC motor is simulated, and experimental measurements have been carried out on the sample motor. The simulation results are found to be consistent with theoretical analysis and experimental results.
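
The electromagnetic and mechanical torque equations mentioned above form a standard coupled ODE pair. A hedged sketch using forward Euler integration (all parameter values are invented for illustration; this is not the authors' Simulink model):

```python
def simulate_dc_motor(v_supply, steps=10_000, dt=1e-5,
                      R=1.0, L=0.5e-3, Ke=0.01, Kt=0.01, J=1e-6, B=1e-7):
    """Integrate the coupled electrical and mechanical equations of a
    brushed DC motor with forward Euler:
        L di/dt = V - R i - Ke w      (armature circuit)
        J dw/dt = Kt i - B w          (mechanical torque balance)
    Returns the final armature current (A) and angular speed (rad/s)."""
    i = w = 0.0
    for _ in range(steps):
        di = (v_supply - R * i - Ke * w) / L
        dw = (Kt * i - B * w) / J
        i += di * dt
        w += dw * dt
    return i, w

i_ss, w_ss = simulate_dc_motor(6.0)
```

With these constants the motor settles close to its analytic steady state, w = V / (Ke + R·B/Kt), which is a quick sanity check on any such model.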

  9. Discrete simulation system based on artificial intelligence methods

    Energy Technology Data Exchange (ETDEWEB)

    Futo, I; Szeredi, J

    1982-01-01

A discrete event simulation system based on the AI language Prolog is presented. The system, called t-Prolog, extends the traditional possibilities of simulation languages toward automatic problem solving by using backtracking in time and automatic model modification depending on logical deductions. As t-Prolog is an interactive tool, the user can interrupt the simulation run to modify the model or to force it to return to a previous state to try possible alternatives. It admits the construction of goal-oriented or goal-seeking models with variable structure. Models are defined in a restricted version of the first order predicate calculus using Horn clauses. 21 references.

  10. Wind flow simulation over flat terrain using CFD based software

    International Nuclear Information System (INIS)

    Petrov, Peter; Terziev, Angel; Genovski, Ivan

    2009-01-01

Recognition of the velocity distribution over a given terrain is very important because it allows the zones with high energy potential (the fields with high velocities) to be identified, which is a precondition for optimal micro-siting of wind turbine generators. In the current work, a simulation of open flow over flat terrain using CFD-based software is reviewed. The simulations are made of a real fluid flow in order to define the velocity fields over the terrain

  11. Quadcopter Attitude and Thrust Simulation Based on Simulink Platform

    Directory of Open Access Journals (Sweden)

    Endrowednes Kuantama

    2015-12-01

The orientation of the quadcopter axes relative to the reference direction of motion determines its attitude, and every movement is regulated by each rotor's thrust. A mathematical model based on the Euler formula and a 3D simulation using the Matlab/Simulink software platform are used to model quadcopter movement. Changes in attitude, position and the thrust of each rotor can be observed through this simulated movement.
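
How the four rotor thrusts regulate attitude is often expressed through a mixing relation between the collective thrust and the roll/pitch/yaw commands. A minimal illustrative sketch for a plus-configuration quadcopter (the sign convention and normalization are assumptions, not the paper's model):

```python
def mix(thrust, roll, pitch, yaw):
    """Map total thrust and roll/pitch/yaw torque commands to the four
    rotor thrusts of a plus-configuration quadcopter (normalized mixing;
    signs assume rotors 1/3 spin opposite to rotors 2/4)."""
    return [
        thrust / 4 + pitch / 2 + yaw / 4,   # front rotor
        thrust / 4 - roll / 2 - yaw / 4,    # right rotor
        thrust / 4 - pitch / 2 + yaw / 4,   # rear rotor
        thrust / 4 + roll / 2 - yaw / 4,    # left rotor
    ]

# Hover: with zero torque commands, all four rotors share the load equally.
motors = mix(thrust=8.0, roll=0.0, pitch=0.0, yaw=0.0)
```

Note that the torque commands cancel in the sum, so total thrust is preserved regardless of the commanded attitude change.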

  12. Microprocessor-based simulator of surface ECG signals

    International Nuclear Information System (INIS)

Martínez, A E; Rossi, E; Siri, L Nicola

    2007-01-01

In this work, a simulator of surface-recorded electrocardiogram (ECG) signals is presented. The device, based on a microcontroller and commanded by a personal computer, produces an analog signal resembling actual ECGs, not only in time course and voltage levels, but also in source impedance. The simulator is a useful tool for electrocardiograph calibration and monitoring, which can also be incorporated in educational tasks and in clinical environments for early detection of faulty behaviour

  13. Enriching Triangle Mesh Animations with Physically Based Simulation.

    Science.gov (United States)

    Li, Yijing; Xu, Hongyi; Barbic, Jernej

    2017-10-01

    We present a system to combine arbitrary triangle mesh animations with physically based Finite Element Method (FEM) simulation, enabling control over the combination both in space and time. The input is a triangle mesh animation obtained using any method, such as keyframed animation, character rigging, 3D scanning, or geometric shape modeling. The input may be non-physical, crude or even incomplete. The user provides weights, specified using a minimal user interface, for how much physically based simulation should be allowed to modify the animation in any region of the model, and in time. Our system then computes a physically-based animation that is constrained to the input animation to the amount prescribed by these weights. This permits smoothly turning physics on and off over space and time, making it possible for the output to strictly follow the input, to evolve purely based on physically based simulation, and anything in between. Achieving such results requires a careful combination of several system components. We propose and analyze these components, including proper automatic creation of simulation meshes (even for non-manifold and self-colliding undeformed triangle meshes), converting triangle mesh animations into animations of the simulation mesh, and resolving collisions and self-collisions while following the input.

  14. Simulation

    DEFF Research Database (Denmark)

    Gould, Derek A; Chalmers, Nicholas; Johnson, Sheena J

    2012-01-01

Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  15. A java based simulator with user interface to simulate ventilated patients

    Directory of Open Access Journals (Sweden)

    Stehle P.

    2015-09-01

Mechanical ventilation is a life-saving intervention which, despite its use on a routine basis, poses the risk of inflicting further damage to the lung tissue if ventilator settings are chosen inappropriately. Medical decision support systems may help to prevent such injuries while providing the optimal settings to reach a defined clinical goal. In order to develop and verify decision support algorithms, a test bench simulating a patient's behaviour is needed. We propose a Java based system that allows simulation of the respiratory mechanics, gas exchange and cardiovascular dynamics of a mechanically ventilated patient. The implemented models are allowed to interact and are interchangeable, enabling the simulation of various clinical scenarios. Model simulations run in real time and show physiologically plausible results.

  16. Virtual reality based surgery simulation for endoscopic gynaecology.

    Science.gov (United States)

    Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G

    1999-01-01

    Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches to the current limits of realism and ultimately seek to extend these based on our description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR-training in gynaecologic laparoscopy.

  17. Performance simulation of a MRPC-based PET imaging system

    Science.gov (United States)

    Roy, A.; Banerjee, A.; Biswas, S.; Chattopadhyay, S.; Das, G.; Saha, S.

    2014-10-01

The less expensive and high-resolution Multi-gap Resistive Plate Chamber (MRPC) opens up the possibility of an efficient alternative detector for Time of Flight (TOF) based Positron Emission Tomography, where the sensitivity of the system depends largely on the time resolution of the detector. In a layered structure, suitable converters can be used to increase the photon detection efficiency. In this work, we perform a detailed GEANT4 simulation to optimize the converter thickness towards improving the efficiency of photon conversion. A Monte Carlo based procedure has been developed to simulate the time resolution of the MRPC-based system, making it possible to simulate its response for PET imaging applications. The results of a test of a six-gap MRPC, operating in avalanche mode, with a ²²Na source are discussed.
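
The dependence of TOF-PET sensitivity on detector time resolution can be illustrated with a toy Monte Carlo of coincidence timing. This is a simplified Gaussian-jitter sketch (the model and the numbers are assumptions, not the authors' GEANT4 setup):

```python
import random

def coincidence_time_difference(sigma_ps, rng):
    """Toy TOF-PET timing model: each of the two back-to-back 511 keV
    photons is detected with independent Gaussian timing jitter; their
    arrival-time difference localizes the source along the line of
    response (position error scales as c * dt / 2)."""
    t1 = rng.gauss(0.0, sigma_ps)
    t2 = rng.gauss(0.0, sigma_ps)
    return t1 - t2

rng = random.Random(1)
diffs = [coincidence_time_difference(100.0, rng) for _ in range(50_000)]
mean = sum(diffs) / len(diffs)
var = sum((d - mean) ** 2 for d in diffs) / len(diffs)
# The spread of the difference is sqrt(2) times the single-detector sigma.
```

This makes the abstract's point concrete: halving the detector jitter halves the localization uncertainty along the line of response.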

  18. Development of intelligent interface for simulation execution by module-based simulation system

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Mizutani, Naoki; Shimoda, Hiroshi; Wakabayashi, Jiro

    1988-01-01

An intelligent user support system for the two phases of simulation execution was newly developed for the Module-based Simulation System (MSS). The MSS has been under development as a flexible simulation environment to improve software productivity in complex, large-scale dynamic simulation of nuclear power plants. AI programming in Smalltalk-80 was applied to implement two user-interface programs for (i) semantic diagnosis of the simulation program generated automatically by MSS, and (ii) a consultation system by which the user can set up the consistent numerical input data files necessary for executing an MSS-generated program. Frame theory was utilized in those interface programs to represent the four knowledge bases, which cover (i) usage information on the module library in MSS and MSS-generated programs, and (ii) expert knowledge on nuclear power plant analysis such as material properties and reactor system configuration. The capabilities of those interface programs were confirmed through example practice on an LMFBR reactor dynamics calculation, and it was demonstrated that the knowledge-based systemization was effective in improving the software work environment. (author)

  19. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • The heterogeneous base-catalyzed process is the preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The annual plant capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of Process II was more than 2.5 times lower than that of Process I. Also, the simulation showed that Process I had more, and larger, process equipment units compared with Process II. Based on lower total capital investment costs and a lower biodiesel selling price, Process II was economically more feasible than Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively.

  20. Research of Simulation in Character Animation Based on Physics Engine

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-01-01

Computer 3D character animation is essentially a product combining computer graphics with robotics, physics, mathematics, and the arts, built on computer hardware, graphics algorithms, and new technologies rapidly developed in the related sciences. At present, mainstream character animation technology is based on manual keyframe production and on frame capture using motion capture devices. 3D character animation is widely used not only in the production of film, animation, and other commercial areas but also in virtual reality, computer-aided education, flight simulation, engineering simulation, military simulation, and other fields. In this paper, we study physics-based character animation to address problems such as the character's poor real-time interactivity, low utilization rate, and complex production. The paper studies kinematics, dynamics technology, and production technology based on motion data. At the same time, it analyzes ODE, PhysX, Bullet, and other mainstream physics engines, and studies OBB hierarchical bounding box trees, AABB hierarchical trees, and other collision detection algorithms. Finally, character animation based on ODE is implemented, simulating the motion and collision of a tricycle.
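
The AABB collision tests mentioned in the abstract reduce to per-axis interval overlap checks. A minimal sketch of the leaf-level test such bounding-volume hierarchies rely on:

```python
def aabb_overlap(min_a, max_a, min_b, max_b):
    """Two axis-aligned bounding boxes overlap iff their intervals
    overlap on every axis (the separating axis theorem specializes to
    three axis checks for axis-aligned boxes)."""
    return all(max_a[i] >= min_b[i] and max_b[i] >= min_a[i]
               for i in range(3))

hit = aabb_overlap((0, 0, 0), (1, 1, 1), (0.5, 0.5, 0.5), (2, 2, 2))
miss = aabb_overlap((0, 0, 0), (1, 1, 1), (2, 0, 0), (3, 1, 1))
```

A hierarchy (AABB tree or OBB tree) simply applies such a test top-down, pruning whole subtrees whose bounding volumes do not overlap.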

  1. Optimization Model for Web Based Multimodal Interactive Simulations.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2015-07-15

    This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications, where visual quality and simulation performance directly influence user experience, overloading of hardware resources may result in an unsatisfactory reduction in simulation quality and user satisfaction. However, hand-tuning simulation performance for each individual hardware platform is not practical. Hence, we present a mixed integer programming model that optimizes graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization, and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is used in conjunction with user specified design requirements in the optimization phase to ensure the best possible allocation of computational resources. The optimum solution then sets the rendering parameters (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
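
The selection step described above chooses discrete rendering and simulation parameters under a hardware budget. A toy sketch of that idea, using exhaustive search instead of a mixed integer programming solver, with entirely hypothetical quality and cost models (not the authors'):

```python
from itertools import product

# Hypothetical discrete choices: texture size, canvas resolution, sim domain cells.
texture_sizes = [256, 512, 1024]
resolutions = [480, 720, 1080]
domain_cells = [1000, 5000, 10000]

def cost(tex, res, cells):
    # Assumed load model: each option consumes part of the device budget.
    return tex / 1024 + res / 1080 + cells / 10000

def quality(tex, res, cells):
    # Assumed user-experience score favouring higher fidelity.
    return 0.3 * tex / 1024 + 0.4 * res / 1080 + 0.3 * cells / 10000

BUDGET = 2.0  # device capability, as measured in the identification phase
best = max(
    (c for c in product(texture_sizes, resolutions, domain_cells)
     if cost(*c) <= BUDGET),
    key=lambda c: quality(*c),
)
```

A real MIP formulation would express the same choice with binary selection variables and solve it with an integer programming solver, which scales far better than enumeration.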

  2. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    Science.gov (United States)

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high-strength variants, exhibit limited formability at room temperature, so high-temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the prediction of tool life under multi-cycle loading conditions.

  3. The transesophageal echocardiography simulator based on computed tomography images.

    Science.gov (United States)

    Piórkowski, Adam; Kempny, Aleksander

    2013-02-01

    Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is the case in transesophageal echocardiography (TEE), where the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. This research makes use of available computed tomography data to simulate the corresponding echocardiographic view. The paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of CT2TEE, a Web-based TEE simulator, is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. Important aspects of interaction with the user are also addressed.

  4. [Simulation-based robot-assisted surgical training].

    Science.gov (United States)

    Kolontarev, K B; Govorov, A V; Rasner, P I; Sheptunov, S A; Prilepskaya, E A; Maltsev, E G; Pushkar, D Yu

    2015-12-01

    Since the first use of a robotic surgical system in 2000, robot-assisted technology has gained wide popularity throughout the world. Robot-assisted surgical training is a complex issue that requires significant effort from students and teachers. During the last two decades, simulation-based training has received active development due to the widespread adoption and popularization of laparoscopic and robot-assisted surgical techniques. We performed a systematic review to identify the currently available simulators for robot-assisted surgery. We searched the Medline and Pubmed English-language sources using the following key words and phrases: "robotics", "robotic surgery", "computer assisted surgery", "simulation", "computer simulation", "virtual reality", "surgical training", and "surgical education". A total of 565 publications matching the key words and phrases were identified; 19 publications were selected for the final analysis. It was established that simulation-based training is the most promising teaching tool for training the next generation of robotic surgeons. Today, the use of simulators to train surgeons is validated. The price of the devices is an obvious barrier to their inclusion in robotic surgery training programs, but the lack of this tool results in a sharp increase in the duration of specialist training.

  5. Modelling and simulation-based acquisition decision support: present & future

    CSIR Research Space (South Africa)

    Naidoo, S

    2009-10-01

    Full Text Available Modelling & Simulation-Based Acquisition Decision Support: Present & Future. Shahen Naidoo. Abstract: The Ground Based Air Defence System (GBADS) Programme of the South African Army has been applying modelling and simulation (M&S) to provide acquisition decision and doctrine...

  6. Medical simulation-based education improves medicos' clinical skills.

    Science.gov (United States)

    Wang, Zhaoming; Liu, Qiaoyu; Wang, Hai

    2013-03-01

    Clinical skill is an essential part of clinical medicine and plays an important role in the transition from medical student to physician. Due to the realities in China, traditional medical education is facing many challenges. There are few opportunities for students to practice their clinical skills, and their dexterity is generally at a low level. Medical simulation-based education is a new teaching modality that helps to improve medicos' clinical skills to a large degree. Medical simulation-based education has many significant advantages and will be further developed and applied.

  7. A particle-based method for granular flow simulation

    KAUST Repository

    Chang, Yuanzhang; Bao, Kai; Zhu, Jian; Wu, Enhua

    2012-01-01

    We present a new particle-based method for granular flow simulation. In the method, a new elastic stress term, derived from a modified form of Hooke's law, is included in the momentum governing equation to handle the friction of granular materials. A viscosity force is also added to simulate dynamic friction, smoothing the velocity field and further maintaining simulation stability. Benefiting from the Lagrangian nature of the SPH method, large flow deformations are handled easily and naturally. In addition, a signed distance field is employed to enforce the solid boundary condition. The experimental results show that the proposed method is effective and efficient for handling the flow of granular materials, and different kinds of granular behaviors can be simulated by adjusting just one parameter. © 2012 Science China Press and Springer-Verlag Berlin Heidelberg.
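
The two force ingredients named in the abstract, an elastic term from a modified Hooke's law plus a viscosity term for dynamic friction, can be sketched per particle pair in 1D. The stiffness and damping constants here are illustrative, and this is not the paper's SPH formulation:

```python
def pair_force(x_i, x_j, v_i, v_j, radius=0.5, k=100.0, mu=2.0):
    """Force on particle i from particle j: a repulsive spring when the
    particles overlap, plus viscous damping of the relative velocity
    (the 'dynamic friction' that smooths the velocity field)."""
    dx = x_j - x_i
    overlap = 2 * radius - abs(dx)
    if overlap <= 0:
        return 0.0  # no contact, no force
    direction = 1.0 if dx > 0 else -1.0
    elastic = -k * overlap * direction   # pushes i away from j
    viscous = -mu * (v_i - v_j)          # damps relative motion
    return elastic + viscous
```

In an SPH setting the same terms appear as kernel-weighted sums over all neighbours rather than a single pair interaction.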

  8. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities relate both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  10. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification

  11. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large scale simulation using super-computers, which generally requires long CPU time and produces large amounts of data, has been extensively studied as a third pillar in various advanced science fields, in parallel to theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of complex phenomena that can hardly be identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces the idea of up-date processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method, which transmits requests for up-date processing from the simulation (client) running on a super-computer to a workstation (server). That is, the simulation running on the super-computer actively controls the timing of up-date processing. The server, having received requests from the ongoing simulation for data transfer, data analyses, visualizations, etc., starts the corresponding operations during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project of laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another.

  12. Simulation based virtual learning environment in medical genetics counseling

    DEFF Research Database (Denmark)

    Makransky, Guido; Bonde, Mads T.; Wulff, Julie S. G.

    2016-01-01

    BACKGROUND: Simulation based learning environments are designed to improve the quality of medical education by allowing students to interact with patients, diagnostic laboratory procedures, and patient data in a virtual environment. However, few studies have evaluated whether simulation based learning environments increase students' knowledge, intrinsic motivation, and self-efficacy, and help them generalize from laboratory analyses to clinical practice and health decision-making. METHODS: An entire class of 300 University of Copenhagen first-year undergraduate students, most with a major ... the perceived relevance of medical educational activities. The results suggest that simulations can help future generations of doctors transfer new understanding of disease mechanisms gained in virtual laboratory settings into everyday clinical practice.

  13. Simulation-based medical education: time for a pedagogical shift.

    Science.gov (United States)

    Kalaniti, Kaarthigeyan; Campbell, Douglas M

    2015-01-01

    The purpose of medical education at all levels is to prepare physicians with the knowledge and comprehensive skills required to deliver safe and effective patient care. The traditional 'apprentice' learning model in medical education is undergoing a pedagogical shift to a 'simulation-based' learning model. Experiential learning, deliberate practice, and the ability to provide immediate feedback are the primary advantages of simulation-based medical education. It is an effective way to develop new skills, identify knowledge gaps, reduce medical errors, and maintain infrequently used clinical skills even among experienced clinical teams, with the overall goal of improving patient care. Although simulation cannot replace clinical exposure as a form of experiential learning, it promotes learning without compromising patient safety. This new paradigm shift is revolutionizing medical education in the Western world. It is time for developing countries to embrace this pedagogical shift.

  14. Comparison of GPU-Based Numerous Particles Simulation and Experiment

    International Nuclear Information System (INIS)

    Park, Sang Wook; Jun, Chul Woong; Sohn, Jeong Hyun; Lee, Jae Wook

    2014-01-01

    The dynamic behavior of numerous grains interacting with each other can be easily observed. In this study, this dynamic behavior was analyzed based on the contacts between numerous grains. The discrete element method was used for analyzing the dynamic behavior of each particle, and the neighboring-cell algorithm was employed for detecting their contacts. The Hertzian and tangential sliding friction contact models were used for calculating the contact forces acting between the particles. A GPU-based parallel program was developed for conducting the computer simulation and calculating the numerous contacts. A dam break experiment was performed to verify the simulation results. The reliability of the program was verified by comparing the results of the simulation with those of the experiment.
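
A minimal 2D sketch of the two ingredients named above, the neighboring-cell contact search and the Hertzian normal force, with illustrative parameters rather than the study's:

```python
import math
from collections import defaultdict

def build_cells(positions, cell_size):
    """Hash each particle into a grid cell so contacts are only checked locally."""
    cells = defaultdict(list)
    for idx, (x, y) in enumerate(positions):
        cells[(int(x // cell_size), int(y // cell_size))].append(idx)
    return cells

def contact_pairs(positions, radius, cell_size):
    """Check each particle only against particles in its own and adjacent
    cells; cell_size must be at least the contact distance (2 * radius)."""
    cells = build_cells(positions, cell_size)
    pairs = set()
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), []):
                    for i in members:
                        if i < j and math.dist(positions[i], positions[j]) < 2 * radius:
                            pairs.add((i, j))
    return pairs

def hertz_force(overlap, k=1e5):
    """Hertzian normal contact: the force grows with overlap^1.5."""
    return k * overlap ** 1.5 if overlap > 0 else 0.0
```

The cell search reduces the contact check from O(n^2) to roughly O(n) for reasonably uniform packings, which is what makes GPU-scale particle counts practical.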

  15. Simulation Based Data Reconciliation for Monitoring Power Plant Efficiency

    International Nuclear Information System (INIS)

    Park, Sang Jun; Heo, Gyun Young

    2010-01-01

    Power plant efficiency is analyzed by using measured values, mass/energy balance principles, and several correlations. Since the measured values carry uncertainty that depends on the accuracy of the instrumentation, the results of plant efficiency analysis necessarily carry uncertainty as well. The uncertainty may arise from either randomness or malfunctions in a process. In order to improve the accuracy of efficiency analysis, data reconciliation (DR) is a good candidate, because the mathematical algorithm of DR is based on first principles such as mass and energy balance while accounting for the uncertainty of the instrumentation. It should be noted that the mass and energy balance model for analyzing power plant efficiency is equivalent to a steady-state simulation of the plant system. Therefore, DR for efficiency analysis necessitates a simulation that can deal with the uncertainty of the instrumentation. This study proposes a simulation-based DR algorithm applicable to power plant efficiency monitoring.
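
For a single linear balance constraint, the DR adjustment has a standard closed form: minimizing the weighted squared corrections subject to a·x_hat = 0 gives x_hat = x - V a^T (a V a^T)^(-1) (a·x), with V = diag(sigma^2). A sketch with hypothetical flow measurements (not data from the study):

```python
def reconcile(measured, sigma, a):
    """Weighted least-squares reconciliation for one linear constraint a.x = 0:
    x_hat = x - V a^T (a V a^T)^-1 (a.x), with V = diag(sigma^2)."""
    V = [s * s for s in sigma]
    residual = sum(ai * xi for ai, xi in zip(a, measured))
    denom = sum(ai * ai * vi for ai, vi in zip(a, V))
    lam = residual / denom
    return [xi - vi * ai * lam for xi, vi, ai in zip(measured, V, a)]

# Assumed example: inlet flow should equal the sum of the two outlet flows,
# but the raw measurements are out of balance by 5 kg/s.
measured = [100.0, 60.0, 45.0]
sigma = [1.0, 1.0, 1.0]
reconciled = reconcile(measured, sigma, a=[1.0, -1.0, -1.0])
```

The reconciled values satisfy the balance exactly, and less trusted instruments (larger sigma) absorb more of the correction.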

  16. Engineering-Based Thermal CFD Simulations on Massive Parallel Systems

    KAUST Repository

    Frisch, Jérôme

    2015-05-22

    The development of parallel Computational Fluid Dynamics (CFD) codes is a challenging task that entails efficient parallelization concepts and strategies in order to achieve good scalability values when running those codes on modern supercomputers with several thousands to millions of cores. In this paper, we present a hierarchical data structure for massive parallel computations that supports the coupling of a Navier–Stokes-based fluid flow code with the Boussinesq approximation in order to address complex thermal scenarios for energy-related assessments. The data structure is specifically designed for interactive data exploration and visualization during runtime of the simulation code, a capability that traditional high-performance computing (HPC) simulation codes typically lack. We further show and discuss speed-up values obtained on one of Germany's top-ranked supercomputers with up to 140,000 processes and present simulation results for different engineering-based thermal problems.

  17. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  18. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  19. Porting a Java-based Brain Simulation Software to C++

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    A currently available software solution to simulate neural development is Cx3D. However, this software is Java-based and not ideal for high performance computing. This talk presents our step-by-step porting approach, which uses SWIG as a tool to interface C++ code from Java.

  20. gemcWeb: A Cloud Based Nuclear Physics Simulation Software

    Science.gov (United States)

    Markelon, Sam

    2017-09-01

    gemcWeb allows users to run nuclear physics simulations from the web. Because it is completely device agnostic, scientists can run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects, and to share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4. gemcWeb requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and securely stored. Python-based and open-source, the main version of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open-source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.

  1. Simulating an elastic bipedal robot based on musculoskeletal modeling

    NARCIS (Netherlands)

    Bortoletto, Roberto; Sartori, Massimo; He, Fuben; Pagello, Enrico

    2012-01-01

    Many of the processes involved in the synthesis of human motion have much in common with problems found in robotics research. This paper describes the modeling and simulation of a novel bipedal robot based on series elastic actuators [1]. The robot model takes inspiration from the human

  2. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Directory of Open Access Journals (Sweden)

    Erin Saito

    2014-01-01

    Full Text Available Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community based dementia screening program over a one-year period. The models simulated the yearly costs of caring for patients based on clinical transitions beginning in pre-dementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia, and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in the cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in the more costly moderate and severe stages. Discussion. Community based dementia screening can reduce healthcare costs associated with caring for demented individuals through earlier detection and treatment, resulting in proportionately reduced time in the more costly advanced stages.
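
The cost mechanism in such a Markov model can be sketched with a small transition matrix. The transition probabilities and annual costs below are hypothetical illustrations, not the study's data; the point is that slowing progression keeps patients longer in cheaper states:

```python
# States: mild, moderate, severe (an absorbing death state is omitted for brevity).
STATES = ["mild", "moderate", "severe"]

def expected_cost(transition, annual_cost, start="mild", years=10):
    """Propagate the state distribution year by year and accumulate expected cost."""
    dist = {s: 1.0 if s == start else 0.0 for s in STATES}
    total = 0.0
    for _ in range(years):
        total += sum(dist[s] * annual_cost[s] for s in STATES)
        dist = {t: sum(dist[s] * transition[s][t] for s in STATES) for t in STATES}
    return total

# Hypothetical annual care costs and yearly transition probabilities.
cost = {"mild": 10_000, "moderate": 30_000, "severe": 60_000}
untreated = {"mild": {"mild": 0.7, "moderate": 0.3, "severe": 0.0},
             "moderate": {"mild": 0.0, "moderate": 0.6, "severe": 0.4},
             "severe": {"mild": 0.0, "moderate": 0.0, "severe": 1.0}}
treated = {"mild": {"mild": 0.8, "moderate": 0.2, "severe": 0.0},
           "moderate": {"mild": 0.0, "moderate": 0.7, "severe": 0.3},
           "severe": {"mild": 0.0, "moderate": 0.0, "severe": 1.0}}
```

Because the "treated" matrix delays transitions into the expensive states, its ten-year expected cost is lower, which mirrors the direction of the study's finding.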

  3. Design Heuristics for Authentic Simulation-Based Learning Games

    Science.gov (United States)

    Ney, Muriel; Gonçalves, Celso; Balacheff, Nicolas

    2014-01-01

    Simulation games are games for learning based on a reference in the real world. We propose a model for authenticity in this context as a result of a compromise among learning, playing and realism. In the health game used to apply this model, students interact with characters in the game through phone messages, mail messages, SMS and video.…

  4. Identifying content for simulation-based curricula in urology

    DEFF Research Database (Denmark)

    Nayahangan, Leizl Joy; Hansen, Rikke Bolling; Lindorff-Larsen, Karen Gilboe

    2017-01-01

    to identify technical procedures in urology that should be included in a simulation-based curriculum for residency training. MATERIALS AND METHODS: A national needs assessment was performed using the Delphi method involving 56 experts with significant roles in the education of urologists. Round 1 identified...

  5. The afforestation problem: a heuristic method based on simulated annealing

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    1992-01-01

    This paper presents the afforestation problem, that is, the location and design of new forest compartments to be planted in a given area. This optimization problem is solved by a two-step heuristic method based on simulated annealing. Tests and experiences with this method are also presented.
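
The accept-worse-moves-with-decreasing-probability mechanism at the heart of simulated annealing can be sketched on a toy one-dimensional objective (not the afforestation model itself; parameters are illustrative):

```python
import math
import random

def anneal(objective, x0, t0=1.0, cooling=0.95, steps=500, seed=1):
    """Minimise `objective`: perturb x randomly, always accept improvements,
    and accept worse moves with probability exp(-delta/T) while T cools."""
    rng = random.Random(seed)
    x, fx, t = x0, objective(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + rng.uniform(-1.0, 1.0)
        fc = objective(cand)
        if fc < fx or rng.random() < math.exp(min(0.0, (fx - fc) / t)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

def f(x):
    # Toy multimodal objective: a quadratic bowl with sinusoidal ripples.
    return (x - 2.0) ** 2 + math.sin(5.0 * x)

best_x, best_f = anneal(f, x0=-5.0)
```

Early on the high temperature lets the search hop between local minima; as T shrinks, the walk settles into (near-)greedy descent.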

  6. Solution of partial differential equations by agent-based simulation

    International Nuclear Information System (INIS)

    Szilagyi, Miklos N

    2014-01-01

    The purpose of this short note is to demonstrate that partial differential equations can be quickly solved by agent-based simulation with high accuracy. There is no need for the solution of large systems of algebraic equations. This method is especially useful for quick determination of potential distributions and demonstration purposes in teaching electromagnetism. (letters and comments)
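
One classic agent-based scheme of the kind the note describes is the random-walk Monte Carlo solution of Laplace's equation: agents released from an interior point wander until they hit the boundary, and the average boundary value they absorb estimates the potential there. A sketch with an assumed boundary condition (left edge held at 1 V, the rest grounded), not taken from the note itself:

```python
import random

def laplace_potential(start, boundary_value, size=10, walkers=2000, seed=0):
    """Estimate u(start) for Laplace's equation on a size x size grid:
    each agent random-walks until it reaches the boundary, and the mean
    boundary value it absorbs converges to the harmonic potential."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walkers):
        x, y = start
        while 0 < x < size and 0 < y < size:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = x + dx, y + dy
        total += boundary_value(x, y)
    return total / walkers

# Assumed boundary condition: left edge at 1 V, the other edges grounded.
def bc(x, y):
    return 1.0 if x == 0 else 0.0

u_center = laplace_potential((5, 5), bc)
```

By symmetry the centre potential is 0.25 V here; no linear system is assembled or solved, matching the note's claim that large systems of algebraic equations are avoided.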

  7. Toward Developing Authentic Leadership: Team-Based Simulations

    Science.gov (United States)

    Shapira-Lishchinsky, Orly

    2014-01-01

    Although there is a consensus that authentic leadership should be an essential component of educational leadership, no study to date has examined whether team-based simulations may promote authentic leadership. The purpose of this study was to identify whether principal trainees can develop authentic leadership through ethical decision…

  8. Students' Emotions in Simulation-Based Medical Education

    Science.gov (United States)

    Keskitalo, Tuulikki; Ruokamo, Heli

    2017-01-01

    Medical education is emotionally charged for many reasons, especially the fact that simulation-based learning is designed to generate emotional experiences. However, there are very few studies that concentrate on learning and emotions, despite widespread interest in the topic, especially within healthcare education. The aim of this research is to…

  9. Fast spot-based multiscale simulations of granular drainage

    Energy Technology Data Exchange (ETDEWEB)

    Rycroft, Chris H.; Wong, Yee Lok; Bazant, Martin Z.

    2009-05-22

    We develop a multiscale simulation method for dense granular drainage, based on the recently proposed spot model, where the particle packing flows by local collective displacements in response to diffusing "spots" of interstitial free volume. By comparing with discrete-element method (DEM) simulations of 55,000 spheres in a rectangular silo, we show that the spot simulation is able to approximately capture many features of drainage, such as packing statistics, particle mixing, and flow profiles. The spot simulation runs two to three orders of magnitude faster than DEM, making it an appropriate method for real-time control or optimization. We demonstrate extensions for modeling particle heaping and avalanching at the free surface, and for simulating the boundary layers of slower flow near walls. We show that the spot simulations are robust and flexible, by demonstrating that they can be used in both event-driven and fixed timestep approaches, and showing that the elastic relaxation step used in the model can be applied much less frequently and still create good results.
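
A loose one-dimensional caricature of the spot mechanism, free volume random-walking up through the packing while the material it passes moves down, can be sketched as follows. The geometry and parameters are invented for illustration and are far simpler than the paper's 3D model:

```python
import random

def drain_spots(column_heights, n_spots, spot_volume=0.1, seed=0):
    """Each spot of interstitial free volume enters at the central orifice and
    random-walks upward; the material in every column it visits moves down by
    the spot volume (free volume moving up means grains moving down)."""
    rng = random.Random(seed)
    heights = list(column_heights)
    for _ in range(n_spots):
        col = len(heights) // 2            # spots enter at the orifice
        while col is not None and heights[col] > 0:
            heights[col] = max(0.0, heights[col] - spot_volume)
            col += rng.choice([-1, 0, 1])  # diffuse sideways while rising
            if not 0 <= col < len(heights):
                col = None                 # spot exits at the silo wall/surface
    return heights

drained = drain_spots([1.0] * 11, n_spots=40)
```

Because the spots' random walks revisit the central columns most often, drainage is fastest above the orifice, qualitatively like the flow profiles the paper reports.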

  10. An electromechanical based deformable model for soft tissue simulation.

    Science.gov (United States)

    Zhong, Yongmin; Shirinzadeh, Bijan; Smith, Julian; Gu, Chengfan

    2009-11-01

    Soft tissue deformation is of great importance to surgery simulation. Although a significant amount of research effort has been dedicated to simulating the behaviours of soft tissues, modelling of soft tissue deformation is still a challenging problem. This paper presents a new deformable model for the simulation of soft tissue deformation from the electromechanical viewpoint of soft tissues. Soft tissue deformation is formulated as a reaction-diffusion process coupled with a mechanical load. The mechanical load applied to a soft tissue to cause a deformation is incorporated into the reaction-diffusion system and consequently distributed among the mass points of the soft tissue. Reaction-diffusion of the mechanical load and non-rigid mechanics of motion are combined to govern the simulation dynamics of soft tissue deformation. An improved reaction-diffusion model is developed to describe the distribution of the mechanical load in soft tissues. A three-layer artificial cellular neural network is constructed to solve the reaction-diffusion model for real-time simulation of soft tissue deformation. A gradient-based method is established to derive internal forces from the distribution of the mechanical load. Integration with a haptic device has also been achieved to simulate soft tissue deformation with haptic feedback. The proposed methodology not only predicts the typical behaviours of living tissues but also accepts both local and large-range deformations, and it accommodates isotropic, anisotropic and inhomogeneous deformations by simple modification of the diffusion coefficients.
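
The load-distribution idea, a mechanical load injected at one mass point spreading by diffusion while being absorbed by a reaction term, can be sketched with an explicit 1D finite-difference step. The coefficients are illustrative, and this is not the authors' improved model or their cellular-neural-network solver:

```python
def diffuse_load(load, D=0.2, decay=0.05, steps=50):
    """Explicit update: the load applied at one mass point spreads to its
    neighbours (diffusion) while being absorbed (reaction); zero-flux
    boundaries are modelled by mirroring the end values."""
    u = list(load)
    for _ in range(steps):
        u = [
            u[i]
            + D * ((u[i - 1] if i > 0 else u[i])
                   - 2 * u[i]
                   + (u[i + 1] if i < len(u) - 1 else u[i]))
            - decay * u[i]
            for i in range(len(u))
        ]
    return u

# A unit point load applied at the centre mass point of a 1D chain.
initial = [0.0] * 21
initial[10] = 1.0
final = diffuse_load(initial)
```

With zero-flux boundaries the diffusion step conserves the total load, so the reaction term alone shrinks it by a factor (1 - decay) per step; internal forces would then be derived from the gradient of this distribution.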

  11. Simulation-based medical education in clinical skills laboratory.

    Science.gov (United States)

    Akaike, Masashi; Fukutomi, Miki; Nagamune, Masami; Fujimoto, Akiko; Tsuji, Akiko; Ishida, Kazuko; Iwata, Takashi

    2012-01-01

    Clinical skills laboratories have been established in medical institutions as facilities for simulation-based medical education (SBME). SBME is believed to be superior to the traditional style of medical education from the viewpoint of active and adult learning theories. SBME can provide a learning cycle of debriefing and feedback for learners, as well as evaluation of procedures and competency. SBME offers both learners and patients a safe environment for practice and error. In a full-environment simulation, learners can acquire not only technical skills but also non-technical skills, such as leadership, teamwork, communication, situation awareness, decision-making, and awareness of personal limitations. SBME is also effective for the integration of clinical medicine and basic medicine. In addition, technology-enhanced simulation training is associated with beneficial effects on knowledge, skills, behaviors, and patient-related outcomes. To perform SBME effectively, not only simulators, including high-fidelity mannequin-type simulators and virtual-reality simulators, but also full-time faculty and instructors who are professionals of SBME are essential in a clinical skills laboratory. The clinical skills laboratory is expected to become an integrated medical education center supporting continuing professional development, integrated learning of basic and clinical medicine, and citizens' participation and cooperation in medical education.

  12. Extension of PENELOPE to protons: Simulation of nuclear reactions and benchmark with Geant4

    International Nuclear Information System (INIS)

    Sterpin, E.; Sorriaux, J.; Vynckier, S.

    2013-01-01

    Purpose: Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. Methods: PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure a consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. Results: For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). The agreement is much

  13. Extension of PENELOPE to protons: simulation of nuclear reactions and benchmark with Geant4.

    Science.gov (United States)

    Sterpin, E; Sorriaux, J; Vynckier, S

    2013-11-01

    Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for (1)H and ICRU 63 data for (12)C, (14)N, (16)O, (31)P, and (40)Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure a consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for a 250 MeV energy were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). 
The agreement is much better with FLUKA, with deviations within
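
    The acceptance criterion quoted in these two records (deviations within 1%/1 mm relative to the Bragg-peak dose, counted only above 10% of the peak) can be sketched as a simple dose-difference check. The curves below are synthetic placeholders, not data from the paper, and only the 1% dose criterion is evaluated, not the 1 mm distance-to-agreement part.

```python
import numpy as np

# Illustrative sketch (not the authors' code): compare two depth-dose curves,
# expressing deviations as a percentage of the Bragg-peak dose and restricting
# the test to depths where the dose exceeds 10% of the peak.
depth = np.linspace(0.0, 40.0, 401)                  # cm, hypothetical grid
dose_a = np.exp(-((depth - 38.0) / 1.0) ** 2) + 0.02 * depth / 40.0
dose_b = dose_a * (1 + 0.005 * np.sin(depth))        # second code, small deviation

peak = dose_a.max()
mask = dose_a > 0.10 * peak                          # >10% of Bragg-peak dose
dev = 100.0 * np.abs(dose_a - dose_b)[mask] / peak   # deviation in % of peak
within_1pct = bool(np.all(dev < 1.0))
```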

  14. Dynamic Garment Simulation based on Hybrid Bounding Volume Hierarchy

    Directory of Open Access Journals (Sweden)

    Zhu Dongyong

    2016-12-01

    In order to solve the computing-speed and efficiency problems of existing dynamic clothing simulation, this paper presents a dynamic garment simulation method based on a hybrid bounding volume hierarchy. It first uses the MCASG graph-theory method to perform a primary segmentation of a given three-dimensional human body model. It then applies K-means clustering for a secondary segmentation, collecting the body's upper arms, lower arms, upper legs, lower legs, trunk, hip and, for a woman, chest as the elementary units of dynamic clothing simulation. According to the shapes of these elementary units, it chooses the closest and most efficient hybrid bounding box to enclose each unit, such as a cylinder bounding box or an elliptic-cylinder bounding box. During the construction of these bounding boxes, it uses the least-squares method and slices of the human body to obtain the related parameters. This approach makes it possible to use the smallest number of bounding boxes to create tight collision-detection regions around the human body. A spring-mass model based on a triangular mesh of the clothing model is finally constructed for dynamic simulation. The simulation results show the feasibility and superiority of the described method.
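
    The least-squares step described above can be illustrated on one slice: fitting a circle (one cross-section of a cylinder bounding box) to points sampled around a limb. This is a hypothetical sketch using the algebraic Kasa formulation, not the paper's code.

```python
import numpy as np

# Least-squares circle fit (Kasa method): a circle (x-cx)^2 + (y-cy)^2 = r^2
# is linear in (cx, cy, c) with c = r^2 - cx^2 - cy^2, so one lstsq solve
# recovers the slice parameters of a cylinder bounding box.
def fit_circle(pts):
    """Fit a circle to 2-D points; returns (cx, cy, r)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

# points sampled on a circle of radius 5 centred at (1, 2)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
pts = np.column_stack([1 + 5 * np.cos(t), 2 + 5 * np.sin(t)])
cx, cy, r = fit_circle(pts)
```

    Fitting each body slice this way, and taking the largest radius over the slices of a segment, gives the enclosing cylinder for that segment.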

  15. Team play in surgical education: a simulation-based study.

    Science.gov (United States)

    Marr, Mollie; Hemmert, Keith; Nguyen, Andrew H; Combs, Ronnie; Annamalai, Alagappan; Miller, George; Pachter, H Leon; Turner, James; Rifkind, Kenneth; Cohen, Steven M

    2012-01-01

    Simulation-based training provides a low-stress learning environment where real-life emergencies can be practiced. Simulation can improve surgical education and patient care in crisis situations through a team approach emphasizing interpersonal and communication skills. This study assessed the effects of simulation-based training in the context of trauma resuscitation in teams of trainees. In a New York State-certified level I trauma center, trauma alerts were assessed by a standardized video review process. Simulation training was provided in various trauma situations, followed by a debriefing period. The outcomes measured included the number of healthcare workers involved in the resuscitation, the percentage of healthcare workers in role position, time to intubation, time to intubation from paralysis, time to obtain first imaging study, time to leave the trauma bay for computed tomography scan or the operating room, presence of a team leader, and presence of spinal stabilization. Thirty cases were video analyzed presimulation and postsimulation training, and the two data sets were compared via a 1-sided t test for significance. The percentage of healthcare workers in role position increased from 57.8% to 83.6% (p = 0.46), the time to intubation from paralysis decreased from 3.9 to 2.8 minutes, and the presence of a team leader increased from 64% to 90%. Simulation training improved team interaction and educational competencies. Providing simulation training as a tool for surgical education may enhance patient care. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  16. Current concepts in simulation-based trauma education.

    Science.gov (United States)

    Cherry, Robert A; Ali, Jameel

    2008-11-01

    The use of simulation-based technology in trauma education has focused on providing a safe and effective alternative to the more traditional methods that are used to teach technical skills and critical concepts in trauma resuscitation. Trauma team training using simulation-based technology is also being used to develop skills in leadership, team-information sharing, communication, and decision-making. The integration of simulators into medical student curriculum, residency training, and continuing medical education has been strongly recommended by the American College of Surgeons as an innovative means of enhancing patient safety, reducing medical errors, and performing a systematic evaluation of various competencies. Advanced human patient simulators are increasingly being used in trauma as an evaluation tool to assess clinical performance and to teach and reinforce essential knowledge, skills, and abilities. A number of specialty simulators in trauma and critical care have also been designed to meet these educational objectives. Ongoing educational research is still needed to validate long-term retention of knowledge and skills, provide reliable methods to evaluate teaching effectiveness and performance, and to demonstrate improvement in patient safety and overall quality of care.

  17. An FPGA-Based Massively Parallel Neuromorphic Cortex Simulator

    Directory of Open Access Journals (Sweden)

    Runchun M. Wang

    2018-04-01

    This paper presents a massively parallel and scalable neuromorphic cortex simulator designed for simulating large and structurally connected spiking neural networks, such as complex models of various areas of the cortex. The main novelty of this work is the abstraction of a neuromorphic architecture into clusters represented by minicolumns and hypercolumns, analogous to the fundamental structural units observed in neurobiology. Without this approach, simulating large-scale fully connected networks would need prohibitively large memory to store look-up tables for point-to-point connections. Instead, we use a novel architecture, based on the structural connectivity in the neocortex, such that all the required parameters and connections can be stored in on-chip memory. The cortex simulator can be easily reconfigured for simulating different neural networks without any change in hardware structure by reprogramming the memory. A hierarchical communication scheme allows one neuron to have a fan-out of up to 200 k neurons. As a proof of concept, an implementation on one Altera Stratix V FPGA was able to simulate 20 million to 2.6 billion leaky integrate-and-fire (LIF) neurons in real time. We verified the system by emulating a simplified auditory cortex (with 100 million neurons). This cortex simulator achieved a low power dissipation of 1.62 μW per neuron. With the advent of commercially available FPGA boards, our system offers an accessible and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks.
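
    For reference, the leaky integrate-and-fire (LIF) dynamics that such hardware implements can be sketched in a few lines of software; all parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal software sketch of LIF dynamics: leaky integration of an input
# current, a threshold crossing that counts as a spike, and a reset.
n, steps = 1000, 200
dt, tau = 1e-3, 20e-3                  # time step (s), membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0
v = np.zeros(n)                        # membrane potentials
rng = np.random.default_rng(0)
spike_count = 0

for _ in range(steps):
    i_in = rng.uniform(0.0, 120.0, n)  # random input current (arbitrary units)
    v += dt / tau * (v_rest - v) + dt * i_in   # leaky integration
    fired = v >= v_thresh
    spike_count += int(fired.sum())
    v[fired] = v_reset                 # fire and reset

rate = spike_count / (n * steps * dt)  # mean firing rate in Hz
```

    A software loop like this scales linearly with the neuron count, which is precisely why the paper moves the update and the connectivity look-ups into massively parallel FPGA hardware.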

  18. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-performance computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and of some applications is given.

  19. Shielding evaluation of neutron generator hall by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Pujala, U.; Selvakumaran, T.S.; Baskaran, R.; Venkatraman, B. [Radiological Safety Division, Indira Gandhi Center for Atomic Research, Kalpakkam (India); Thilagam, L.; Mohapatra, D.K., E-mail: swathythila2@yahoo.com [Safety Research Institute, Atomic Energy Regulatory Board, Kalpakkam (India)

    2017-04-01

    A shielded hall was constructed to accommodate a D-D, D-T or D-Be based pulsed neutron generator (NG) with a 4π yield of 10^9 n/s. The neutron shield design of the facility was optimized using the NCRP-51 methodology such that the total dose rates outside the hall are well below the regulatory limit for the full-occupancy criterion (1 μSv/h). However, the total dose rates at the roof top, the cooling room trench exit and the labyrinth exit were found to be above this limit for the optimized design. Hence, additional neutron shielding arrangements were proposed for the cooling room trench and labyrinth exits, and the roof top was made inaccessible. The present study is an attempt to evaluate the neutron and associated capture-gamma transport through the bulk shields for the complete geometry and materials of the NG hall using the Monte Carlo (MC) codes MCNP and FLUKA. The neutron source terms of the D-D, D-T and D-Be reactions are considered in the simulations. The effect of the proposed additional shielding has been demonstrated through simulations carried out with the additional shielding in place for the D-Be neutron source term. The results of the MC simulations using the two different codes are found to be consistent with each other for the neutron dose rate estimates. However, deviations of up to 28% are noted between the two codes at a few locations for the capture-gamma dose rate estimates. Overall, the dose rates estimated by the MC simulations including the additional shields show that all locations surrounding the hall satisfy the full-occupancy criterion for all three types of sources. Additionally, the dose rates due to direct transmission of primary neutrons estimated by FLUKA are compared with the values calculated using the formula given in NCRP-51, which show deviations of up to 50% from each other. The details of the MC simulations and the NCRP-51 methodology for the estimation of the primary neutron dose rate, along with the results, are presented in this paper. (author)
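
    The code-to-code consistency check described above reduces to a relative-deviation calculation at each scoring location. The sketch below uses placeholder numbers, not values from the paper.

```python
# Sketch of the consistency check between two Monte Carlo codes: a symmetric
# percent deviation between dose-rate estimates at the same location.
def percent_deviation(d1, d2):
    """Symmetric percent deviation between two dose-rate estimates."""
    return 200.0 * abs(d1 - d2) / (d1 + d2)

mcnp_gamma = 0.80    # uSv/h, hypothetical capture-gamma dose rate (MCNP)
fluka_gamma = 0.61   # uSv/h, hypothetical capture-gamma dose rate (FLUKA)
dev = percent_deviation(mcnp_gamma, fluka_gamma)   # percent deviation
```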

  20. Towards an entropy-based detached-eddy simulation

    Science.gov (United States)

    Zhao, Rui; Yan, Chao; Li, XinLiang; Kong, WeiXuan

    2013-10-01

    A concept of the entropy increment ratio (s̄) is introduced for compressible turbulence simulation through a series of direct numerical simulations (DNS). s̄ represents the dissipation rate per unit mechanical energy and has the benefit of being independent of the freestream Mach number. Based on this feature, we construct the shielding function f_s to describe the boundary layer region and propose an entropy-based detached-eddy simulation method (SDES). This approach follows the spirit of the delayed detached-eddy simulation (DDES) proposed by Spalart et al. in 2005, but it exhibits much better behavior when their performances are compared in the following flows: pure attached flow with a thick boundary layer (a supersonic flat-plate flow at high Reynolds number), fully separated flow (the supersonic base flow), and separated-reattached flow (the supersonic cavity-ramp flow). The Reynolds-averaged Navier-Stokes (RANS) resolved region is reliably preserved and the modeled stress depletion (MSD) phenomenon inherent in DES and DDES is partly alleviated. Moreover, this new hybrid strategy is simple and general, making it applicable to other models related to boundary layer predictions.
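
    The abstract does not give the functional form of f_s. For context only, the sketch below evaluates the standard DDES delaying function of Spalart et al., f_d = 1 - tanh((8 r_d)^3), which plays the same role: it stays near 0 inside the boundary layer (protecting the RANS region) and tends to 1 away from walls (releasing LES content).

```python
import numpy as np

# DDES-style shielding (delaying) function; r_d ~ 1 deep in the boundary
# layer and r_d -> 0 toward the freestream. This is the generic DDES form,
# shown for context, not the SDES f_s of the paper.
def f_d(r_d):
    return 1.0 - np.tanh((8.0 * r_d) ** 3)

inside_bl = f_d(1.0)    # deep in the boundary layer -> f_d ~ 0 (RANS kept)
freestream = f_d(0.01)  # far from walls -> f_d ~ 1 (LES active)
```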

  1. [Does simulator-based team training improve patient safety?].

    Science.gov (United States)

    Trentzsch, H; Urban, B; Sandmeyer, B; Hammer, T; Strohm, P C; Lazarovici, M

    2013-10-01

    Patient safety became paramount in medicine as well as in emergency medicine after it was recognized that preventable adverse events significantly contributed to morbidity and mortality during hospital stays. The underlying errors cannot usually be explained by medical-technical inadequacies alone; they are more often due to difficulties in translating theoretical knowledge into tasks under the conditions of clinical reality. Crew Resource Management and Human Factors, which determine the safety and efficiency of humans in complex situations, are suitable for controlling such sources of error. Simulation has significantly improved safety in high-reliability organizations, such as the aerospace industry. Thus, simulator-based team training has also been proposed for medical areas. As such training is costly in terms of money, time and human resources, the question of the cost-benefit ratio obviously arises. This review outlines the effects of simulator-based team training on patient safety. Such course formats are not only capable of creating awareness and improvements in safety culture but also improve technical team performance and emphasize team performance as a clinical competence. A few studies even indicated improvement in patient-centered outcomes, such as a reduced rate of adverse events, but further studies are required in this respect. In summary, simulator-based team training should be accepted as a suitable strategy to improve patient safety.

  2. Simulation-based interpersonal communication skills training for neurosurgical residents.

    Science.gov (United States)

    Harnof, Sagi; Hadani, Moshe; Ziv, Amitai; Berkenstadt, Haim

    2013-09-01

    Communication skills are an important component of the neurosurgery residency training program. We developed a simulation-based training module for neurosurgery residents in which medical, communication and ethical dilemmas are presented by role-playing actors. To assess the first national simulation-based communication skills training for neurosurgical residents. Eight scenarios covering different aspects of neurosurgery were developed by our team: (1) obtaining informed consent for an elective surgery, (2) discharge of a patient following elective surgery, (3) dealing with an unsatisfied patient, (4) delivering news of intraoperative complications, (5) delivering news of a brain tumor to parents of a 5-year-old boy, (6) delivering news of brain death to a family member, (7) obtaining informed consent for urgent surgery from the grandfather of a 7-year-old boy with an epidural hematoma, and (8) dealing with a case of child abuse. Fifteen neurosurgery residents from all major medical centers in Israel participated in the training. The session was recorded on video and was followed by videotaped debriefing by a senior neurosurgeon and communication expert and by feedback questionnaires. All trainees participated in two scenarios and observed another two. Participants largely agreed that the actors simulating patients represented real patients and family members and that the videotaped debriefing contributed to the teaching of professional skills. Simulation-based communication skill training is effective, and together with thorough debriefing, is an excellent learning and practical method for imparting communication skills to neurosurgery residents. Such simulation-based training will ultimately be part of the national residency program.

  3. Simulation-based training in brain death determination.

    Science.gov (United States)

    MacDougall, Benjamin J; Robinson, Jennifer D; Kappus, Liana; Sudikoff, Stephanie N; Greer, David M

    2014-12-01

    Despite straightforward guidelines on brain death determination by the American Academy of Neurology (AAN), substantial practice variability exists internationally, between states, and among institutions. We created a simulation-based training course on proper determination based on the AAN practice parameters to address and assess knowledge and practice gaps at our institution. Our intervention consisted of a didactic course and a simulation exercise, and was bookended by before and after multiple-choice tests. The 40-min didactic course, including a video demonstration, covered all aspects of the brain death examination. Simulation sessions utilized a SimMan 3G manikin and involved a complete examination, including an apnea test. Possible confounders and signs incompatible with brain death were embedded throughout. Facilitators evaluated performance with a 26-point checklist based on the most recent AAN guidelines. A senior neurologist conducted all aspects of the course, including the didactic session, simulation, and debriefing session. Ninety physicians from multiple specialties have participated in the didactic session, 38 of whom have completed the simulation. Pre-test scores were poor (41.4 %), with attendings scoring higher than residents (46.6 vs. 40.4 %, p = 0.07), and neurologists and neurosurgeons significantly outperforming other specialists (53.9 vs. 38.9 %, p = 0.003). Post-test scores (73.3 %) were notably higher than pre-test scores (45.4 %). Participant feedback has been uniformly positive. Baseline knowledge of brain death determination among providers was low but improved greatly after the course. Our intervention represents an effective model that can be replicated at other institutions to train clinicians in the determination of brain death according to evidence-based guidelines.

  4. AUV-Based Plume Tracking: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Awantha Jayasiri

    2016-01-01

    This paper presents a simulation study of an autonomous underwater vehicle (AUV) navigation system operating in a GPS-denied environment. The AUV navigation method makes use of underwater transponder positioning and requires only one transponder. A multirate unscented Kalman filter is used to determine the AUV orientation and position by fusing high-rate sensor data and low-rate information. The paper also proposes a novel gradient-based, efficient, and adaptive algorithm for plume boundary tracking missions. The algorithm follows a centralized approach and includes path optimization features based on gradient information. The proposed algorithm is implemented in simulation on the AUV-based navigation system and successful boundary tracking results are obtained.
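
    A gradient-based boundary tracker of this kind can be sketched as follows; the plume field, the control law and all gains are assumed for illustration and are not the paper's algorithm. The vehicle estimates the local concentration gradient, moves along the iso-concentration line, and adds a gradient-aligned correction that pulls it back onto the target boundary value.

```python
import numpy as np

def concentration(p):
    """Hypothetical Gaussian plume centred at the origin (stand-in field)."""
    return np.exp(-(p[0] ** 2 + p[1] ** 2) / 50.0)

def grad(p, eps=1e-3):
    """Finite-difference estimate of the local concentration gradient."""
    ex, ey = np.array([eps, 0.0]), np.array([0.0, eps])
    return np.array([concentration(p + ex) - concentration(p - ex),
                     concentration(p + ey) - concentration(p - ey)]) / (2 * eps)

def step(p, c_target=0.5, gain=2.0, ds=0.5):
    g = grad(p)
    tangent = np.array([-g[1], g[0]])                 # move along the iso-line
    correct = gain * (c_target - concentration(p)) * g / (g @ g + 1e-12)
    d = tangent / (np.linalg.norm(tangent) + 1e-12) + correct
    return p + ds * d / np.linalg.norm(d)             # constant-speed step

p = np.array([7.0, 0.0])                              # start outside the boundary
for _ in range(200):
    p = step(p)
err = abs(concentration(p) - 0.5)                     # distance from iso-value
```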

  5. Simulation-based Strategies for Smart Demand Response

    Directory of Open Access Journals (Sweden)

    Ines Leobner

    2018-03-01

    Demand Response can be seen as one effective way to harmonize demand and supply in order to achieve high self-coverage of energy consumption by means of renewable energy sources. This paper presents two different simulation-based concepts for integrating demand-response strategies into energy management systems in the customer domain of the Smart Grid. The first approach is a Model Predictive Control of the heating and cooling system of a low-energy office building. The second concept aims at industrial Demand Side Management by integrating energy-use optimization into industrial automation systems. Both approaches are targeted at day-ahead planning. Furthermore, insights gained into the implications of the concepts on the design of the model, simulation and optimization are discussed. While both approaches share a similar architecture, different modelling and simulation approaches were required by the use cases.

  6. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper depicts a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate the procedures of operations performed on the man-machine interfaces of a control room, provides quantified assessment, and at the same time analyzes operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and in the arrangement of instruments in a control room can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  7. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    Science.gov (United States)

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  8. A Simulation-Based Investigation of High Latency Space Systems Operations

    Science.gov (United States)

    Li, Zu Qun; Crues, Edwin Z.; Bielski, Paul; Moore, Michael

    2017-01-01

    NASA's human space program has developed considerable experience with near-Earth space operations. Although NASA has experience with deep space robotic missions, it has little substantive experience with human deep space operations. Even in the Apollo program, the missions lasted only a few weeks and the communication latencies were on the order of seconds. Human missions beyond the relatively close confines of the Earth-Moon system will involve durations measured in months and communication latencies measured in minutes. To minimize crew risk and to maximize mission success, NASA needs to develop a better understanding of the implications of these mission durations and communication latencies for vehicle design, mission design, and flight controller interaction with the crew. To begin to address these needs, NASA performed a study using a physics-based subsystem simulation to investigate the interactions between the spacecraft crew and a ground-based mission control center for vehicle subsystem operations across long communication delays. The simulation, built with a subsystem modeling tool developed at NASA's Johnson Space Center, models the life support system of a Mars transit vehicle. The simulation contains models of the cabin atmosphere and pressure control system, electrical power system, drinking and waste water systems, internal and external thermal control systems, and crew metabolic functions. The simulation has three interfaces: 1) a real-time crew interface that can be used to monitor and control the vehicle subsystems; 2) a mission control center interface with data transport delays of up to 15 minutes each way; 3) a real-time simulation test conductor interface that can be used to insert subsystem malfunctions and observe the interactions between the crew, the ground, and the simulated vehicle. The study was conducted during the 21st NASA Extreme Environment Mission Operations (NEEMO) mission between July 18 and August 3, 2016. 
The NEEMO
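
    The transport-delay interface described above amounts to buffering telemetry for a fixed number of ticks before mission control sees it. The sketch below is an assumed toy model, not NASA's tool; a 3-tick delay stands in for the up-to-15-minute one-way light time.

```python
from collections import deque

# Toy one-way transport-delay buffer between a simulated vehicle and a
# mission control interface: each pushed sample emerges delay_steps ticks later.
def make_delay(delay_steps):
    """Return a push(sample) function that delays a stream by delay_steps ticks."""
    buf = deque([None] * delay_steps)   # None = nothing received yet
    def push(sample):
        buf.append(sample)
        return buf.popleft()            # what mission control sees this tick
    return push

link = make_delay(3)                    # 3 ticks of one-way delay
seen = [link(s) for s in ["a", "b", "c", "d", "e"]]
# mission control sees: None, None, None, "a", "b"
```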

  9. Simulation-based ureteroscopy training: a systematic review.

    Science.gov (United States)

    Brunckhorst, Oliver; Aydin, Abdullatif; Abboudi, Hamid; Sahai, Arun; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran

    2015-01-01

    Simulation is a common adjunct to operative training and various modalities exist for ureteroscopy. This systematic review aims to do the following: (1) to identify available ureteroscopy simulators, (2) to explore evidence for their effectiveness using characteristic criteria, and (3) to provide recommendations for simulation-based ureteroscopy training. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement guidelines were used. A literature search was performed using the PubMed, EMBASE, and Cochrane Library databases. In total, 20 articles concerning ureteroscopy simulators were included. Overall, 3 high-fidelity bench models are available. The Uro-Scopic Trainer has demonstrated face, construct, and concurrent validity, whereas the Scope Trainer has undergone content, construct, and predictive validation. The adult ureteroscopy trainer has demonstrated face, content, and construct validity. The URO Mentor is the only available ureteroscopy virtual-reality system; 10 studies were identified demonstrating its face, content, construct, concurrent, and predictive validity. The Uro-Scopic Trainer, the Scope Trainer, and the URO Mentor have demonstrated high educational impact. A noncommercially available low-fidelity model has demonstrated effectiveness comparable to that of its high-fidelity counterpart at a price 185 times lower than that of the Uro-Scopic Trainer. The use of porcine models has also been described in 3 studies but requires further investigation. Valid models are available for simulation-based ureteroscopy training. However, few high-level studies have been conducted, and further investigation is required in this area. Furthermore, current research focuses on technical skills acquisition, with little research conducted on nontechnical skills acquisition within ureteroscopy. 
The next step for ureteroscopy training is a formalized and validated curriculum, incorporating simulation, training models, development of nontechnical skills

  10. GPU based numerical simulation of core shooting process

    Directory of Open Access Journals (Sweden)

    Yi-zhong Zhang

    2017-11-01

    The core shooting process is the most widely used technique to make sand cores, and it plays an important role in the quality of sand cores. Although numerical simulation can hopefully optimize the core shooting process, research on numerical simulation of the core shooting process is very limited. Based on a two-fluid model (TFM) and a kinetic-friction constitutive correlation, a program for 3D numerical simulation of the core shooting process has been developed and has achieved good agreement with in-situ experiments. To match the needs of engineering applications, a graphics processing unit (GPU) has also been used to improve the calculation efficiency. The parallel algorithm, based on the Compute Unified Device Architecture (CUDA) platform, can significantly decrease computing time through multi-threaded GPU execution. In this work, the CUDA-parallelized program was developed and the accuracy of the calculations was ensured by comparison with in-situ experimental results photographed by a high-speed camera. The design and optimization of the parallel algorithm are discussed. The simulation of a sand core test-piece demonstrated the improvement in calculation efficiency from the GPU. The developed program has also been validated by in-situ experiments with a transparent core box, a high-speed camera, and a pressure measuring system. The computing time of the parallel program was reduced by nearly 95% while the simulation results remained quite consistent with the experimental data. The GPU parallelization method successfully solves the problem of the low computational efficiency of the 3D sand shooting simulation program, and the developed GPU program is thus appropriate for engineering applications.

  11. A virtual reality based simulator for learning nasogastric tube placement.

    Science.gov (United States)

    Choi, Kup-Sze; He, Xuejian; Chiang, Vico Chung-Lim; Deng, Zhaohong

    2015-02-01

    Nasogastric tube (NGT) placement is a common clinical procedure where a plastic tube is inserted into the stomach through the nostril for feeding or drainage. However, the placement is a blind process in which the tube may be mistakenly inserted into other locations, leading to unexpected complications or fatal incidents. The placement techniques are conventionally acquired by practising on unrealistic rubber mannequins or on humans. In this paper, a virtual reality based training simulation system is proposed to facilitate the training of NGT placement. It focuses on the simulation of tube insertion and the rendering of the feedback forces with a haptic device. A hybrid force model is developed to compute the forces analytically or numerically under different conditions, including the situations when the patient is swallowing or when the tube is buckled at the nostril. To ensure real-time interactive simulations, an offline simulation approach is adopted to obtain the relationship between the insertion depth and insertion force using a non-linear finite element method. The offline dataset is then used to generate real-time feedback forces by interpolation. The virtual training process is logged quantitatively with metrics that can be used for assessing objective performance and tracking progress. The system has been evaluated by nursing professionals. They found that the haptic feeling produced by the simulated forces is similar to their experience during real NGT insertion. The proposed system provides a new educational tool to enhance conventional training in NGT placement. Copyright © 2014 Elsevier Ltd. All rights reserved.
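
    The offline/online split described above can be sketched simply: a table of insertion depth versus force, as if precomputed by the nonlinear finite element method, is interpolated at haptic rates instead of solving FEM inside the loop. The force law below is a stand-in, not the paper's dataset.

```python
import numpy as np

# Hypothetical offline dataset: insertion depth (cm) vs. feedback force (N),
# standing in for results of a nonlinear FEM precomputation.
depth_table = np.linspace(0.0, 60.0, 13)                     # offline sample points
force_table = 0.3 * depth_table + 0.002 * depth_table ** 2   # stand-in force data

def feedback_force(depth_cm):
    """Real-time force lookup by linear interpolation of the offline dataset."""
    return float(np.interp(depth_cm, depth_table, force_table))

f = feedback_force(25.0)   # force rendered at 25 cm insertion depth
```

    A table lookup like this runs in microseconds, which is what makes the kilohertz update rates required for stable haptic rendering feasible.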

  12. An introduction to statistical computing a simulation-based approach

    CERN Document Server

    Voss, Jochen

    2014-01-01

    A comprehensive introduction to sampling-based methods in statistical computing. The use of computers in mathematics and statistics has opened up a wide range of techniques for studying otherwise intractable problems. Sampling-based simulation techniques are now an invaluable tool for exploring statistical models. This book gives a comprehensive introduction to the exciting area of sampling-based methods. An Introduction to Statistical Computing introduces the classical topics of random number generation and Monte Carlo methods. It also includes some advanced methods.
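    As a small taste of the sampling-based methods such a book covers, here is a minimal Monte Carlo estimate of pi (a standard textbook example, not taken from this particular book):

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling uniform points in the unit square and
    counting the fraction that falls inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples
```

    The estimate's standard error shrinks as 1/sqrt(n), which is the basic trade-off of all Monte Carlo methods.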

  13. Simulation-Based Abdominal Ultrasound Training – A Systematic Review

    DEFF Research Database (Denmark)

    Østergaard, Mia L.; Ewertsen, Caroline; Konge, Lars

    2016-01-01

    PURPOSE: The aim is to provide a complete overview of the different simulation-based training options for abdominal ultrasound and to explore the evidence of their effect. MATERIALS AND METHODS: This systematic review was performed according to the PRISMA guidelines; Medline, Embase, Web of Science, and the Cochrane Library were searched. Articles were divided into three categories based on study design (randomized controlled trials, before-and-after studies and descriptive studies) and assessed for level of evidence using the Oxford Centre for Evidence Based Medicine (OCEBM) system.

  14. An Individual-based Probabilistic Model for Fish Stock Simulation

    Directory of Open Access Journals (Sweden)

    Federico Buti

    2010-08-01

    We define an individual-based probabilistic model of sole (Solea solea) behaviour. The individual model is given in terms of an Extended Probabilistic Discrete Timed Automaton (EPDTA), a new formalism that is introduced in the paper and that is shown to be interpretable as a Markov decision process. A given EPDTA model can be probabilistically model-checked via a suitable translation into syntax accepted by existing model-checkers. In order to simulate the dynamics of a given population of soles in different environmental scenarios, an agent-based simulation environment is defined in which each agent implements the behaviour of the given EPDTA model. By varying the probabilities and the characteristic functions embedded in the EPDTA model it is possible to represent different scenarios and to tune the model itself by comparing the results of the simulations with real data about the sole stock in the North Adriatic Sea, available from the recent project SoleMon. The simulator is presented and made available for adaptation to other species.
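    The probabilistic-transition idea underlying the agent model can be sketched as a tiny discrete-time Markov chain. This is not the EPDTA formalism itself; the states and probabilities below are invented for illustration:

```python
import random

# Hypothetical behaviour states and transition probabilities for one agent.
STATES = ["feeding", "resting", "migrating"]
TRANS = {
    "feeding":   {"feeding": 0.7, "resting": 0.2, "migrating": 0.1},
    "resting":   {"feeding": 0.5, "resting": 0.4, "migrating": 0.1},
    "migrating": {"feeding": 0.3, "resting": 0.1, "migrating": 0.6},
}

def simulate(steps, start="feeding", seed=1):
    """Run one agent forward, drawing each next state from its transition row."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(steps):
        row = TRANS[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path
```

    An agent-based population run would simply advance many such agents in parallel, each with its own random stream.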

  15. Simulating classroom lessons : an agent-based attempt

    OpenAIRE

    Ingram, Fred; Brooks, Roger John

    2018-01-01

    This is an interim report on a project to construct an agent-based simulation that reproduces some of the interactions between students and their teacher in classroom lessons. In a pilot study, the activities of 67 students and 7 teachers during 40 lessons were recorded using a data collection instrument that currently captures 17 student states and 15 teacher states. These data enabled various conceptual models to be explored, providing empirical values and distributions for the model parameters.

  16. Invariance and universality in social agent-based simulations

    Science.gov (United States)

    Cioffi-Revilla, Claudio

    2002-01-01

    Agent-based simulation models have a promising future in the social sciences, from political science to anthropology, economics, and sociology. To realize their full scientific potential, however, these models must address a set of key problems, such as the number of interacting agents and their geometry, network topology, time calibration, phenomenological calibration, structural stability, power laws, and other substantive and methodological issues. This paper discusses and highlights these problems and outlines some solutions. PMID:12011412

  17. Designing and simulating a nitinol-based micro ejector

    Directory of Open Access Journals (Sweden)

    Yesid Mora Sierra

    2012-01-01

    This paper describes the design and simulation of a pico-droplet ejector. The actuation system was based on the shape memory effect of two interconnected nitinol membranes. The ejected volume was 12 pL and the device operated at 30°C to 64°C. The ejection excitation voltage was 12 V and the ejection energy required for actuator operation was 26 µJ per drop. Such pico-liter ejectors could have applications in the fabrication, lubrication and cooling of integrated circuits.

  18. Enhancing food engineering education with interactive web-based simulations

    OpenAIRE

    Alexandros Koulouris; Georgios Aroutidis; Dimitrios Vardalis; Petros Giannoulis; Paraskevi Karakosta

    2015-01-01

    In the traditional deductive approach to teaching any engineering topic, teachers would first expose students to the derivation of the equations that govern the behavior of a physical system and then demonstrate the use of the equations through a limited number of textbook examples. This methodology, however, is rarely adequate to unmask the cause-effect and quantitative relationships between the system variables that the equations embody. Web-based simulation, the integration of simulation technology with the web, can help address this gap.

  19. Beam-based Feedback Simulations for the NLC Linac

    International Nuclear Information System (INIS)

    Hendrickson, Linda

    2000-01-01

    Extensive beam-based feedback systems are planned as an integral part of the Next Linear Collider (NLC) control system. Wakefield effects are a significant influence on the feedback design, imposing both architectural and algorithmic constraints. Studies are in progress to assure the optimal selection of devices and to refine and confirm the algorithms for the system design. The authors show the results of initial simulations, along with evaluations of system response for various conditions of ground motion and other operational disturbances

  20. A Bacterial-Based Algorithm to Simulate Complex Adaptative Systems

    OpenAIRE

    González Rodríguez, Diego; Hernández Carrión, José Rodolfo

    2014-01-01

    Paper presented at the 13th International Conference on Simulation of Adaptive Behavior, which took place at Castellón, Spain, July 22-25, 2014. Bacteria have demonstrated an amazing capacity to overcome environmental changes by collective adaptation through genetic exchanges. Using a distributed communication system and sharing individual strategies, bacteria propagate mutations as innovations that allow them to survive in different environments. In this paper we present an agent-based model of this behaviour.

  1. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared on geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  2. Concept of operator support system based on cognitive simulation

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Takano, Kenichi

    1999-01-01

    Hazardous technologies such as chemical plants, nuclear power plants, etc. have introduced multi-layered defenses to prevent accidents. One of those defenses is experienced operators in control rooms. Once an abnormal condition occurs, they are the front-line people who must cope with it. Therefore, operators' quick recognition of plant conditions and fast decision making on responses are quite important for trouble shooting. In order to help operators deal with abnormalities in process plants, many efforts were made to develop operator support systems from the early 1980s (IAEA, 1993). However, the boom in developing operator support systems slumped due to the limitations of knowledge engineering, artificial intelligence, etc. (Yamamoto, 1998). These limitations also biased the focus of system development toward abnormality detection, root cause diagnosis, etc. (Hajek, Hashemi, Sharma and Chandrasekaran, 1986). Information or guidance about future plant behavior and about strategies/tactics to deal with abnormal events is important and helpful for operators, but research and development of such systems made a belated start. Before developing these kinds of systems, it is essential to understand how operators deal with abnormalities. CRIEPI has been conducting a project to develop a computer system that simulates the behavior of operators dealing with abnormal operating conditions in a nuclear power plant. The project had two stages. In the first stage, the authors developed a prototype system that simulates the behavior of a team facing abnormal events in a very simplified power plant (Sasou, Takano and Yoshimura, 1995). In the second stage, the authors applied the simulation technique developed in the first stage to construct a system that simulates a team's behavior in a nuclear power plant. This paper briefly summarizes the simulation system developed in the second stage, the main mechanism of the simulation, and the concept of an operator support system based on this simulation.

  3. Simulation-based artifact correction (SBAC) for metrological computed tomography

    Science.gov (United States)

    Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc

    2017-06-01

    Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. Therefore, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts, using dedicated analytic methods as well as Monte Carlo-based models. The other one represents an ideal CT measurement, i.e. a measurement in parallel beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. Thus, the difference between these simulations is an estimate of the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
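    The correction principle — subtract the difference between a full-physics simulation and an ideal simulation from the measured data — can be sketched on toy projection values (all numbers below are hypothetical):

```python
import numpy as np

def sbac_correct(measured, sim_full, sim_ideal):
    """Simulation-based artifact correction: the simulated artifact contribution
    (full-physics simulation minus ideal simulation) is subtracted from the
    measured projection data."""
    artifact = sim_full - sim_ideal
    return measured - artifact

# Toy projection values; the 'full' simulation reproduces beam hardening etc.
measured = np.array([1.00, 0.80, 0.60])
sim_full = np.array([0.98, 0.78, 0.61])
sim_ideal = np.array([1.02, 0.81, 0.58])
corrected = sbac_correct(measured, sim_full, sim_ideal)
```

    In practice the two simulations are rerun as the object model improves, so the correction can be applied iteratively.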

  4. Parallel PDE-Based Simulations Using the Common Component Architecture

    International Nuclear Information System (INIS)

    McInnes, Lois C.; Allan, Benjamin A.; Armstrong, Robert; Benson, Steven J.; Bernholdt, David E.; Dahlgren, Tamara L.; Diachin, Lori; Krishnan, Manoj Kumar; Kohl, James A.; Larson, J. Walter; Lefantzi, Sophia; Nieplocha, Jarek; Norris, Boyana; Parker, Steven G.; Ray, Jaideep; Zhou, Shujia

    2006-01-01

    The complexity of parallel PDE-based simulations continues to increase as multimodel, multiphysics, and multi-institutional projects become widespread. A goal of component-based software engineering in such large-scale simulations is to help manage this complexity by enabling better interoperability among various codes that have been independently developed by different groups. The Common Component Architecture (CCA) Forum is defining a component architecture specification to address the challenges of high-performance scientific computing. In addition, several execution frameworks, supporting infrastructure, and general-purpose components are being developed. Furthermore, this group is collaborating with others in the high-performance computing community to design suites of domain-specific component interface specifications and underlying implementations. This chapter discusses recent work on leveraging these CCA efforts in parallel PDE-based simulations involving accelerator design, climate modeling, combustion, and accidental fires and explosions. We explain how component technology helps to address the different challenges posed by each of these applications, and we highlight how component interfaces built on existing parallel toolkits facilitate the reuse of software for parallel mesh manipulation, discretization, linear algebra, integration, optimization, and parallel data redistribution. We also present performance data to demonstrate the suitability of this approach, and we discuss strategies for applying component technologies to both new and existing applications.

  5. COEL: A Cloud-based Reaction Network Simulator

    Directory of Open Access Journals (Sweden)

    Peter eBanda

    2016-04-01

    Chemical Reaction Networks (CRNs) are a formalism to describe the macroscopic behavior of chemical systems. We introduce COEL, a web- and cloud-based CRN simulation framework that does not require a local installation, runs simulations on a large computational grid, provides reliable database storage, and offers a visually pleasing and intuitive user interface. We present an overview of the underlying software, the technologies, and the main architectural approaches employed. Some of COEL's key features include ODE-based simulations of CRNs and multicompartment reaction networks with rich interaction options, a built-in plotting engine, automatic DNA-strand displacement transformation and visualization, SBML/Octave/Matlab export, and a built-in genetic-algorithm-based optimization toolbox for rate constants. COEL is an open-source project hosted on GitHub (http://dx.doi.org/10.5281/zenodo.46544), which allows interested research groups to deploy it on their own server. Regular users can simply use the web instance at no cost at http://coel-sim.org. The framework is ideally suited for collaborative use in both research and education.
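    The ODE-based simulation of a CRN reduces to integrating mass-action rate equations. A minimal sketch for the single reaction A + B → C, using forward Euler for brevity (COEL's own solvers are more robust):

```python
def simulate_crn(a0, b0, k, dt=1e-3, steps=10000):
    """Integrate the mass-action ODEs for A + B -> C with rate constant k,
    using a simple forward-Euler time stepper."""
    a, b, c = a0, b0, 0.0
    for _ in range(steps):
        rate = k * a * b      # mass-action rate of the single reaction
        a -= rate * dt
        b -= rate * dt
        c += rate * dt
    return a, b, c
```

    Conservation (each molecule of C consumes one A and one B) provides a quick sanity check on any such integrator.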

  6. A Rules-Based Simulation of Bacterial Turbulence

    Science.gov (United States)

    Mikel-Stites, Maxwell; Staples, Anne

    2015-11-01

    In sufficiently dense bacterial populations (>40% bacteria by volume), unusual collective swimming behaviors have been consistently observed, resembling von Karman vortex streets. The source of this collective swimming behavior has yet to be fully determined, and no research to date has established whether it derives predominantly from the properties of the surrounding media or is an emergent result of the ``rules'' governing the behavior of individual bacteria. The goal of this research is to ascertain whether it is possible to design a simulation that replicates the qualitative behavior of densely packed bacterial populations using only behavioral rules to govern the actions of each bacterium, with the physical properties of the media neglected. The results of the simulation will address whether the system's overall behavior can be driven exclusively by these rule-based dynamics. To examine this, the behavioral simulation was written in MATLAB on a fixed grid and updated sequentially with the bacterial behavior, including randomized tumbling, gathering and perceptual sub-functions. If the simulation succeeds, it will serve as confirmation that these qualitatively vortex-like behaviors can be generated without a specific physical medium (i.e., that the phenomenon arises in emergent fashion from behavioral rules); if it fails, it will serve as evidence that the observed behavior requires some specific set of physical parameters.

  7. Simulation-based seismic loss estimation of seaport transportation system

    International Nuclear Information System (INIS)

    Ung Jin Na; Shinozuka, Masanobu

    2009-01-01

    The seaport transportation system is one of the major lifeline systems in modern society, and its reliable operation is crucial for the well-being of the public. However, past experience shows that earthquake damage to port components can severely disrupt terminal operation and thus negatively impact the regional economy. The main purpose of this study is to provide a methodology for estimating the effects of an earthquake on the performance of the operation system of a container terminal in seaports. To evaluate the economic loss of the damaged system, an analytical framework is developed by integrating simulation models for terminal operation and fragility curves of port components in the context of seismic risk analysis. For this purpose, a computerized simulation model is developed and verified against actual terminal operation records. Based on the analytical procedure for assessing the seismic performance of the terminal, system fragility curves are also developed. This simulation-based loss estimation methodology can be used not only for estimating the seismically induced revenue loss but also as a decision-making tool to select specific seismic retrofit techniques on the basis of benefit-cost analysis.
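    Fragility curves of the kind combined with the operation model above are commonly modeled as lognormal in the ground-motion intensity; a sketch with hypothetical parameters (the lognormal form and the numbers are assumptions for illustration, not taken from this study):

```python
import math

def damage_probability(pga, median, beta):
    """Lognormal fragility curve: probability that a component reaches or
    exceeds a damage state at peak ground acceleration `pga` (in g),
    given the median capacity and lognormal standard deviation beta."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

# Hypothetical fragility parameters for a container-crane damage state.
p = damage_probability(pga=0.3, median=0.3, beta=0.5)
```

    Sampling component damage states from such curves, then rerunning the terminal operation simulation, yields the distribution of revenue loss.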

  8. Flat Knitting Loop Deformation Simulation Based on Interlacing Point Model

    Directory of Open Access Journals (Sweden)

    Jiang Gaoming

    2017-12-01

    In order to create realistic loop primitives suitable for faster CAD of flat-knitted fabric, we studied a model of the loop as well as the variation of the loop surface. This paper proposes an interlacing-point-based model for the loop center curve and uses cubic Bezier curves to fit the central curve of the regular loop, elongated loop, transfer loop, and irregular deformed loop. In this way, a general model for the central curve of the deformed loop is obtained. The model is then used to perform texture mapping, texture interpolation, and brightness processing, simulating a clearly structured and lifelike deformed loop. The computer program LOOP was developed using this algorithm. The deformed loop is simulated with different yarns and applied to the design of a cable stitch, demonstrating the feasibility of the proposed algorithm. This paper provides a loop primitive simulation method characterized by lifelikeness, yarn material variability, and deformation flexibility, and facilitates loop-based fast computer-aided design (CAD) of knitted fabric.
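    The cubic Bezier fit of the loop's central curve rests on the standard Bernstein form; a sketch with hypothetical control points (the actual control points would come from the interlacing-point model):

```python
def bezier3(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]
    using the Bernstein polynomial form."""
    s = 1.0 - t
    x = s**3 * p0[0] + 3 * s**2 * t * p1[0] + 3 * s * t**2 * p2[0] + t**3 * p3[0]
    y = s**3 * p0[1] + 3 * s**2 * t * p1[1] + 3 * s * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Hypothetical control points for one segment of a loop's central curve.
curve = [bezier3((0, 0), (1, 2), (3, 2), (4, 0), i / 20) for i in range(21)]
```

    The curve interpolates the end control points, so deforming a loop amounts to moving the intermediate control points.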

  9. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
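    The linear relative risk model sets RR(d) = 1 + βd, with the constraint 1 + βd > 0. A sketch of likelihood-based confidence limits for β on invented grouped Poisson data (the baseline rate is assumed known here for simplicity; real analyses estimate it jointly):

```python
import math

# Hypothetical grouped cohort data: dose (Sv), person-years, observed cases.
doses, pyr, cases = [0.0, 0.1, 0.5], [1000.0, 1000.0, 1000.0], [10, 11, 15]
lam0 = 0.01  # assumed known baseline rate per person-year (illustration only)

def loglik(beta):
    """Poisson log-likelihood under the linear relative risk model
    rate = lam0 * (1 + beta * dose)."""
    ll = 0.0
    for d, t, o in zip(doses, pyr, cases):
        mu = lam0 * (1.0 + beta * d) * t
        ll += o * math.log(mu) - mu
    return ll

# Grid search over the allowed range (1 + beta*d must stay positive).
betas = [i / 1000.0 for i in range(-1900, 5001)]
mle = max(betas, key=loglik)
ll_max = loglik(mle)
# Likelihood-ratio 95% confidence limits: a drop of 1.92 from the maximum.
inside = [b for b in betas if loglik(b) >= ll_max - 1.92]
lower, upper = min(inside), max(inside)
```

    When the data are sparse and doses are skewed, the lower constraint on β is often active, which is exactly the situation where the report finds the asymptotic approximations unreliable.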

  10. A sEMG model with experimentally based simulation parameters.

    Science.gov (United States)

    Wheeler, Katherine A; Shimada, Hiroshima; Kumar, Dinesh K; Arjunan, Sridhar P

    2010-01-01

    A differential, time-invariant, surface electromyogram (sEMG) model has been implemented. While it is based on existing EMG models, the novelty of this implementation is that it assigns more accurate distributions of variables to create realistic motor unit (MU) characteristics. Variables such as muscle fibre conduction velocity, jitter (the change in the interpulse interval between subsequent action potential firings) and motor unit size have been considered to follow normal distributions about an experimentally obtained mean. In addition, motor unit firing frequencies have been considered to have non-linear, type-based distributions in accordance with experimental results. Motor unit recruitment thresholds have been considered to be related to the MU type. The model has been used to simulate single-channel differential sEMG signals from voluntary, isometric contractions of the biceps brachii muscle, and has been verified by experiments on three subjects. Comparison between simulated signals and experimental recordings shows that the root mean square (RMS) increases linearly with force in both cases. The simulated signals also show values and rates of change of RMS similar to the experimental signals.

  11. Research on facial expression simulation based on depth image

    Science.gov (United States)

    Ding, Sha-sha; Duan, Jin; Zhao, Yi-wu; Xiao, Bo; Wang, Hao

    2017-11-01

    Nowadays, facial expression simulation is widely used in film and television special effects, human-computer interaction and many other fields. Facial expressions are captured with a Kinect camera. An AAM algorithm based on statistical information is employed to detect and track faces, and a 2D regression algorithm is applied to align the feature points; facial feature points are detected automatically, while the 3D cartoon model feature points are marked manually. The aligned feature points are mapped using keyframe techniques. To improve the animation effect, non-feature points are interpolated based on empirical models, and the mapping and interpolation are completed under the constraint of Bézier curves. Thus the feature points on the cartoon face model can be driven as the facial expression varies, achieving real-time cartoon facial expression simulation. The experimental results show that the proposed method can accurately simulate facial expressions. Finally, our method is compared with a previous method, and the data show that implementation efficiency is greatly improved.

  12. Airway management education: simulation based training versus non-simulation based training-A systematic review and meta-analyses.

    Science.gov (United States)

    Sun, Yanxia; Pan, Chuxiong; Li, Tianzuo; Gan, Tong J

    2017-02-01

    Simulation-based training (SBT) has become a standard for medical education. However, the efficacy of simulation-based training in airway management education remains unclear. The aim of this study was to evaluate all published evidence comparing the effectiveness of SBT for airway management versus non-simulation-based training (NSBT) on learner and patient outcomes. A systematic review with meta-analyses was used. Data were derived from PubMed, EMBASE, CINAHL, Scopus, the Cochrane Controlled Trials Register and the Cochrane Database of Systematic Reviews from inception to May 2016. Published comparative trials that evaluated the effect of SBT on airway management training compared with NSBT were considered. Effect sizes with 95% confidence intervals (CI) were calculated for the outcome measures. Seventeen eligible studies were included. SBT was associated with improved behavior performance [standardized mean difference (SMD): 0.30, 95% CI: 0.06 to 0.54] in comparison with NSBT. However, the benefits of SBT were not seen in time-skill (SMD: -0.13, 95% CI: -0.82 to 0.52), written examination score (SMD: 0.39, 95% CI: -0.09 to 0.86) or success rate of procedure completion on patients [relative risk (RR): 1.26, 95% CI: 0.96 to 1.66]. SBT may not be superior to NSBT for airway management training.
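    Pooled effect sizes of the kind reported above are typically obtained by inverse-variance weighting of the per-study SMDs. A fixed-effect sketch with invented study values (the review itself may have used a random-effects model):

```python
import math

def pooled_smd(smds, ses):
    """Fixed-effect inverse-variance pooling of standardized mean differences:
    each study is weighted by 1/SE^2, and a 95% CI is built from the pooled SE."""
    weights = [1.0 / se**2 for se in ses]
    est = sum(w, d in zip(weights, smds)) if False else \
        sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, (est - 1.96 * se, est + 1.96 * se)

# Hypothetical per-study SMDs and standard errors.
est, (lo, hi) = pooled_smd([0.5, 0.2, 0.35], [0.2, 0.25, 0.15])
```

    A CI that excludes zero, as in the behavior-performance outcome above, indicates a statistically significant pooled effect.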

  13. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes) and compared against the actual performance of these surveys in order to validate the model's performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into "discoverable" and "discovered" subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic "known" subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel survey strategies.

  14. Monte Carlo simulation of neutron detection efficiency for NE213 scintillation detector

    International Nuclear Information System (INIS)

    Xi Yinyin; Song Yushou; Chen Zhiqiang; Yang Kun; Zhangsu Yalatu; Liu Xingquan

    2013-01-01

    A NE213 liquid scintillation neutron detector was simulated using the FLUKA code. The light output of the detector was obtained by transforming the secondary particles' energy deposition using Birks' formula. Detection efficiencies can then be calculated by integrating the light output above the measurement threshold. The light output, the central efficiency and the average efficiency as a function of the front surface radius of the detector were simulated, and the results agreed well with experimental results. (authors)
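    Birks' formula relates the scintillation light yield to the energy deposition via the stopping power, dL/dE = S / (1 + kB·dE/dx). A numerical sketch of this transformation (the stopping-power model and kB value below are placeholders, not the paper's FLUKA inputs):

```python
def birks_light_output(e_kev, stopping_power, kb, steps=1000):
    """Integrate Birks' law, dL/dE = S / (1 + kB * dE/dx), over the particle's
    energy deposition to obtain the light output, with S normalized to 1."""
    de = e_kev / steps
    light, e = 0.0, e_kev
    for _ in range(steps):
        light += de / (1.0 + kb * stopping_power(e))
        e -= de
    return light

# Hypothetical stopping-power model for illustration only.
sp = lambda e: 50.0 / max(e, 1.0)
```

    With kB = 0 the light output reduces to the deposited energy; a positive kB quenches the yield, more strongly for heavily ionizing particles, which is why proton recoils in NE213 produce less light than electrons of the same energy.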

  15. Simulation based virtual learning environment in medical genetics counseling

    DEFF Research Database (Denmark)

    Makransky, Guido; Bonde, Mads T; Wulff, Julie S G

    2016-01-01

    The study investigated whether simulation-based learning environments increase students' knowledge, intrinsic motivation, and self-efficacy, and help them generalize from laboratory analyses to clinical practice and health decision-making. METHODS: An entire class of 300 University of Copenhagen first-year undergraduate students, most with a major in medicine, received a 2-h training session in a simulation-based learning environment. The main outcomes were pre- to post-test changes in knowledge, intrinsic motivation, and self-efficacy, together with a post-intervention evaluation of the effect of the simulation on student understanding of everyday clinical practice. RESULTS: Knowledge (Cohen's d = 0.73), intrinsic motivation (d = 0.24), and self-efficacy (d = 0.46) significantly increased from the pre- to post-test. Low-knowledge students showed the greatest increases in knowledge (d = 3.35) and self-efficacy (d = 0.61), but a non...

  16. Ergonomics and simulation-based approach in improving facility layout

    Science.gov (United States)

    Abad, Jocelyn D.

    2018-02-01

    The simulation-based technique has been a popular choice for facility layout in industry due to its convenience and efficient generation of results. Nevertheless, the solutions generated are not capable of addressing delays due to workers' health and safety, which significantly impact overall operational efficiency. It is therefore critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into a Promodel simulation to improve the facility layout of a garment manufacturer. To test the effectiveness of the method, the existing and improved facility designs were measured in terms of comprehensive risk level, efficiency, and productivity. Results indicated that the improved facility layout produced a decrease in comprehensive risk level and rapid upper limb assessment score, a 78% increase in efficiency and a 194% increase in productivity compared to the existing design, and thus proved that the approach is effective in attaining overall facility design improvement.

  17. A web-based repository of surgical simulator projects.

    Science.gov (United States)

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors, for established groups as well as for newly started projects, can be quite arduous, and gathering information on existing methods, already traveled research paths, and problems encountered is a time-consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password-protected sections) and also allows a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  18. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of dielectric properties of heterogeneous media with tomographically collected data. The hardware system (800-1200 MHz) was built based on a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularizations. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method, coupled with the Bayliss and Turkel radiation boundary conditions, was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76-diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can presently be fully characterized using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data.
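
The regularized Newton update at the core of such reconstruction algorithms can be sketched compactly. The function name and the toy two-parameter linear forward model below are purely illustrative, not taken from the paper's implementation; the point is the Tikhonov (Marquardt-style) damped normal-equation step:

```python
# One Tikhonov-regularized Gauss-Newton step: solve (J^T J + lam*I) d = J^T r.
# A 2-parameter problem keeps the linear algebra explicit (no external libraries).

def gauss_newton_tikhonov_step(J, r, lam):
    """Return the update d for a 2-parameter least-squares problem."""
    a11 = sum(row[0] * row[0] for row in J) + lam
    a12 = sum(row[0] * row[1] for row in J)
    a22 = sum(row[1] * row[1] for row in J) + lam
    b1 = sum(row[0] * ri for row, ri in zip(J, r))
    b2 = sum(row[1] * ri for row, ri in zip(J, r))
    det = a11 * a22 - a12 * a12          # 2x2 Cramer's rule solve
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Toy forward model y = p0 + p1*t sampled at three points
ts = (0.0, 1.0, 2.0)
J = [[1.0, t] for t in ts]               # Jacobian of the linear model
y = [2.0 + 0.5 * t for t in ts]          # "measured" data for p = (2.0, 0.5)
p = [0.0, 0.0]
for _ in range(20):                      # Newton iterations
    r = [yi - (p[0] + p[1] * t) for yi, t in zip(y, ts)]
    d = gauss_newton_tikhonov_step(J, r, lam=1e-6)
    p[0] += d[0]
    p[1] += d[1]
```

In the real imaging problem the Jacobian relates thousands of field measurements to the dielectric-property unknowns, but the damped update has the same shape.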

  19. Internet Based Simulations of Debris Dispersion of Shuttle Launch

    Science.gov (United States)

    Bardina, Jorge; Thirumalainambi, Rajkumar

    2004-01-01

    Because the debris dispersion model is heterogeneous and interrelated with various factors, 3D graphics combined with physical models are useful in understanding the complexity of launch and range operations. Modeling and simulation in this area mainly focus on orbital dynamics and range safety concepts, including destruct limits, telemetry and tracking, and population risk. Particle explosion modeling is the process of simulating an explosion by breaking the rocket into many pieces. The particles are propagated through their motion according to the laws of physics until they come to rest. The size of the footprint indicates the type of explosion and the distribution of the particles. The shuttle launch and range operations in this paper are discussed based on the operations of the Kennedy Space Center, Florida, USA. Java 3D graphics provides geometric and visual content with suitable modeling behaviors of Shuttle launches.
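
A minimal, hypothetical version of such particle explosion modeling propagates drag-free fragments launched with random speeds and elevation angles and takes the farthest landing point as a crude footprint radius. All parameter values below are invented for illustration and are not from the paper's model:

```python
import math
import random

random.seed(1)
G = 9.81  # gravitational acceleration, m/s^2

def landing_range(v, theta_deg, h0=0.0):
    """Down-range distance of a drag-free fragment launched from height h0."""
    th = math.radians(theta_deg)
    vx, vy = v * math.cos(th), v * math.sin(th)
    # time of flight until the fragment returns to the ground (z = 0)
    t = (vy + math.sqrt(vy * vy + 2.0 * G * h0)) / G
    return vx * t

# scatter 1000 fragments with random speeds (m/s) and elevation angles (deg)
ranges = [landing_range(random.uniform(10.0, 100.0),
                        random.uniform(5.0, 85.0), h0=500.0)
          for _ in range(1000)]
footprint = max(ranges)  # crude footprint radius, m
```

A real range-safety model would add drag, wind, and fragment mass distributions, but the ballistic skeleton is the same.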

  20. A pedagogical model for simulation-based learning in healthcare

    Directory of Open Access Journals (Sweden)

    Tuulikki Keskitalo

    2015-11-01

    Full Text Available The aim of this study was to design a pedagogical model for a simulation-based learning environment (SBLE in healthcare. Currently, simulation and virtual reality are a major focus in healthcare education. However, when and how these learning environments should be applied is not well-known. The present study tries to fill that gap. We pose the following research question: What kind of pedagogical model supports and facilitates students’ meaningful learning in SBLEs? The study used design-based research (DBR and case study approaches. We report the results from our second case study and how the pedagogical model was developed based on the lessons learned. The study involved nine facilitators and 25 students. Data were collected and analysed using mixed methods. The main result of this study is the refined pedagogical model. The model is based on the socio-cultural theory of learning and characteristics of meaningful learning as well as previous pedagogical models. The model will provide a more holistic and meaningful approach to teaching and learning in SBLEs. However, the model requires evidence and further development.

  1. Simulating the operation of photosensor-based lighting controls

    International Nuclear Information System (INIS)

    Ehrlich, Charles; Papamichael, Konstantinos; Lai, Judy; Revzan, Kenneth

    2001-01-01

    Energy savings from the use of daylighting in commercial buildings are realized through implementation of photoelectric lighting controls that dim electric lights when sufficient daylight is available to provide adequate workplane illumination. The dimming level of electric lighting is based on the signal of a photosensor. Current simulation approaches for such systems are based on the questionable assumption that the signal of the photosensor is proportional to the task illuminance. This paper presents a method that simulates the performance of photosensor controls considering the acceptance angle, angular sensitivity, placement of the photosensor within a space, and color correction filter. The method is based on the multiplication of two fisheye images: one generated from the angular sensitivity of the photosensor and the other from a 180- or 360-degree fisheye image of the space as ''seen'' by the photosensor. The paper includes a detailed description of the method and its implementation, example applications, and validation results based on comparison with measurements in an actual office space.
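
The two-image multiplication described above reduces to a pixel-wise product of an angular-sensitivity map and a luminance map, summed over all pixels. A tiny 3x3 grid stands in for the fisheye images here; the values are illustrative, not from the paper:

```python
# Photosensor signal = sum over pixels of (sensitivity * luminance).

sensitivity = [[0.2, 0.5, 0.2],
               [0.5, 1.0, 0.5],
               [0.2, 0.5, 0.2]]   # angular response, peaking on-axis

luminance = [[100, 120, 100],
             [ 80, 200,  90],
             [ 60,  70,  50]]     # luminance map of the space as "seen" by the sensor

signal = sum(s * l
             for srow, lrow in zip(sensitivity, luminance)
             for s, l in zip(srow, lrow))
```

With real fisheye renderings the grids would be full-resolution images, but the signal estimate is still this weighted sum.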

  2. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in analysis of job shop type manufacturing. But certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down to events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
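
The event-list mechanism at the heart of any discrete event simulation kernel of this kind can be illustrated generically: events are kept in time order and executed one at a time, advancing the simulation clock. The class and job names below are invented for illustration and are not from SIMMEK:

```python
import heapq

class Sim:
    """Minimal discrete-event kernel: a clock plus a time-ordered event list."""
    def __init__(self):
        self.clock = 0.0
        self.events = []
        self.seq = 0                     # tie-breaker for simultaneous events

    def schedule(self, delay, action):
        heapq.heappush(self.events, (self.clock + delay, self.seq, action))
        self.seq += 1

    def run(self):
        while self.events:
            self.clock, _, action = heapq.heappop(self.events)
            action()

# Toy example: two jobs arrive and each occupies a processing step.
finished = []
sim = Sim()

def finish(job):
    finished.append((job, sim.clock))

sim.schedule(0.0, lambda: sim.schedule(3.0, lambda: finish("job1")))  # arrive t=0, 3h process
sim.schedule(1.0, lambda: sim.schedule(2.0, lambda: finish("job2")))  # arrive t=1, 2h process
sim.run()
```

A production tool adds resource contention, queues, and statistics on top, but everything is driven by this pop-next-event loop.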

  3. Pump-stopping water hammer simulation based on RELAP5

    International Nuclear Information System (INIS)

    Yi, W S; Jiang, J; Li, D D; Lan, G; Zhao, Z

    2013-01-01

    RELAP5 was originally designed to analyze complex thermal-hydraulic interactions that occur during either postulated large or small loss-of-coolant accidents in PWRs. However, as development continued, the code was expanded to include many of the transient scenarios that might occur in thermal-hydraulic systems. The fast deceleration of a liquid results in high pressure surges: the kinetic energy is transformed into potential energy, which leads to a temporary pressure increase. This phenomenon is called water hammer. Water hammer can occur in any thermal-hydraulic system and is extremely dangerous when the pressure surges become considerably high. If the pressure exceeds the critical pressure that the pipe or the fittings along the pipeline can bear, the integrity of the whole pipeline will fail. The purpose of this article is to apply RELAP5 to the simulation and analysis of water hammer. Based on the RELAP5 code manuals and related documents, the authors use RELAP5 to set up an example of a water-supply system fed by an impeller pump and simulate the pump-stopping water hammer. The simulation of the sample case and the subsequent analysis of its results give a better understanding of water hammer as well as of how the RELAP5 code performs when applied to it. In addition, by comparing the results of the RELAP5-based model with those of other fluid-transient analysis software, such as PIPENET, the authors draw conclusions about the peculiarities of RELAP5 when transplanted into water-hammer research and offer several modelling tips for simulating water-hammer-related cases.
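
Before running a full RELAP5-style transient, the magnitude of the surge from a sudden pump stop is often sanity-checked with the classical Joukowsky relation, dp = rho * a * dv. The numbers below are illustrative, not from the paper's case:

```python
# Joukowsky estimate of the instantaneous water-hammer surge.

rho = 1000.0   # water density, kg/m^3
a = 1200.0     # pressure-wave propagation speed in the pipe, m/s
dv = 2.0       # flow velocity lost when the pump stops, m/s

dp = rho * a * dv    # surge pressure rise, Pa
dp_bar = dp / 1e5    # same surge expressed in bar
```

For these values the estimate is a 24 bar surge; a transient code then resolves how that wave reflects and attenuates along the actual pipeline.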

  5. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Paul-Eric DOSSOU

    2013-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which the agent approach is used to solve planning and management problems. Simulation is currently widely used and is practically the only tool that enables verification of complex systems. Many companies face the question of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.

  6. Application potential of Agent Based Simulation and Discrete Event Simulation in Enterprise integration modelling concepts

    Directory of Open Access Journals (Sweden)

    Pawel PAWLEWSKI

    2012-07-01

    Full Text Available This paper aims to present the dilemma of simulation tool selection. The authors discuss examples of enterprise architecture methodologies (CIMOSA and GRAI) in which the agent approach is used to solve planning and management problems. Simulation is currently widely used and is practically the only tool that enables verification of complex systems. Many companies face the question of which simulation tool is appropriate to use for verification. Selected tools based on ABS and DES are presented, and some tools combining the DES and ABS approaches are described. The authors give some recommendations on the selection process.

  7. Simulation based medical education; teaching normal delivery on intermediate fidelity simulator to medical students.

    Science.gov (United States)

    Shah, Nighat; Baig, Lubna; Shah, Nusrat; Hussain, Riffat; Aly, Syed Moyn

    2017-10-01

    To assess the effectiveness of a medium fidelity simulator in teaching normal vaginal delivery to medical students. The quasi-experimental study was conducted at the professional development centre of the Jinnah Sindh Medical University, Karachi, from June to December 2015, and comprised medical students. Third-year medical students were included. They were divided into two groups: group A was taught normal delivery through a traditional PowerPoint presentation and group B through the simulator. Knowledge was assessed with a pre-test and post-test, skills of labour/delivery with a checklist of performance, and perception forms were filled in by students to evaluate the workshops/learning environment. Of the 76 participants, there were 36 (47.4%) in group A and 40 (52.6%) in group B. The overall mean age of the participants was 20.86±0.76 years in group B and 20.60±0.95 years in group A (p=0.19). The mean grade point average of the participants was 2.89±0.47 in group A and 2.87±0.48 in group B (p=0.81). Group B performed much better in the skill of delivery, with a mean score of 8.91±3.20 compared to a mean of 5.67±1.84 in group A. Simulation-based skill learning showed significantly better results.

  8. Simulating non-holonomic constraints within the LCP-based simulation framework

    DEFF Research Database (Denmark)

    Ellekilde, Lars-Peter; Petersen, Henrik Gordon

    2006-01-01

    In this paper, we will extend the linear complementarity problem-based rigid-body simulation framework with non-holonomic constraints. We consider three different types of such constraints, namely equality, inequality and contact constraints. We show how non-holonomic equality and inequality constraints can be incorporated directly, and derive a formalism for how the non-holonomic contact constraints can be modelled as a combination of non-holonomic equality constraints and ordinary contact constraints. For each of these three we are able to guarantee solvability when using Lemke's algorithm. A number of examples are included to demonstrate the non-holonomic constraints.

  9. Simulation Learning: PC-Screen Based (PCSB) versus High Fidelity Simulation (HFS)

    Science.gov (United States)

    2013-08-01

  10. Monte-Carlo simulations of neutron shielding for the ATLAS forward region

    CERN Document Server

    Stekl, I; Kovalenko, V E; Vorobel, V; Leroy, C; Piquemal, F; Eschbach, R; Marquet, C

    2000-01-01

    The effectiveness of different types of neutron shielding for the ATLAS forward region has been studied by means of Monte-Carlo simulations and compared with the results of an experiment performed at the CERN PS. The simulation code is based on GEANT, FLUKA, MICAP and GAMLIB. GAMLIB is a new library including processes with gamma-rays produced in (n, gamma) and (n, n'gamma) neutron reactions and is interfaced to the MICAP code. The effectiveness of different types of shielding against neutrons and gamma-rays, composed of different materials such as pure polyethylene, borated polyethylene, lithium-filled polyethylene, lead and iron, was compared. The simulation results reproduce the experimental data well. This agreement supports the correctness of the simulation code used to describe the generation, spreading and absorption of neutrons (down to thermal energies) and gamma-rays in the shielding materials.

  11. Acidity constants from DFT-based molecular dynamics simulations

    International Nuclear Information System (INIS)

    Sulpizi, Marialore; Sprik, Michiel

    2010-01-01

    In this contribution we review our recently developed method for the calculation of acidity constants from density functional theory based molecular dynamics simulations. The method is based on a half-reaction scheme in which protons are formally transferred from solution to the gas phase. The corresponding deprotonation free energies are computed from the vertical energy gaps for insertion or removal of protons. Combined with full proton transfer reactions, the deprotonation energies can be used to estimate relative acidity constants, and also the Brønsted pKa when the deprotonation free energy of a hydronium ion is used as a reference. We verified the method by investigating a series of organic and inorganic acids and bases spanning a wide range of pKa values (20 units). The thermochemical corrections for the biasing potentials assisting and directing the insertion are discussed in some detail.
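
The final conversion step in such a scheme is standard: once a deprotonation free energy difference (relative to the hydronium reference) has been computed, the pKa follows from pKa = dA / (kB T ln 10). The free-energy value below is illustrative, not a result from the paper:

```python
import math

KB_T_298K = 0.0257  # k_B * T in eV at ~298 K

def pka_from_free_energy(dA_eV, kbt=KB_T_298K):
    """Convert a relative deprotonation free energy (eV) to a pKa value."""
    return dA_eV / (kbt * math.log(10.0))

# a hypothetical 0.30 eV deprotonation penalty relative to hydronium
pka = pka_from_free_energy(0.30)
```

A 0.30 eV penalty corresponds to a pKa of roughly 5, which illustrates how sensitive the predicted constant is: each 59 meV of free energy shifts the pKa by one unit.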

  12. Benchmarking of the simulation of the ATLAS Hall background

    International Nuclear Information System (INIS)

    Vincke, H.

    2000-01-01

    The LHC, mainly to be used as a proton-proton collider providing collisions at energies of 14 TeV, will be operational in the year 2005. ATLAS, one of the LHC experiments, will provide high accuracy measurements concerning these p-p collisions. In these collisions a high particle background is also produced. This background was already calculated with the Monte Carlo simulation program FLUKA. Unfortunately, the prediction of this background rate is only understood to within a factor of five, mainly reflecting the limited knowledge of FLUKA's ability to simulate these kinds of scenarios. In order to reduce the uncertainty, benchmarking simulations of experiments similar to the ATLAS background situation were performed. The comparison of the simulations with the experiments shows to what extent FLUKA is able to provide reliable results concerning the ATLAS background situation. To perform this benchmark, an iron construction was irradiated by a hadron beam. The primary particles had ATLAS-equivalent energies. Behind the iron structure, the remnants of the shower processes were measured and simulated. The simulation procedure and its encouraging results, including the comparison with the measured numbers, are presented and discussed in this work. (author)

  13. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes according to the batch operation cannot be predicted in an equilibrium material flow. This study began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As a mid- and long-term research effort, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to cope with a review of the technical feasibility, safeguards assessment, conceptual design of the facility, and economic feasibility evaluation. The most fundamental step in developing such a simulator is to establish the dynamic material flow framework. This study focused on the operation modeling of pyroprocessing to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of a dynamic material flow. DES-based modeling was applied to build a pyroprocessing operation model. A dynamic material flow as the basic framework for an integrated pyroprocessing simulator was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic behavior was verified, for example for the oxide reduction process, in terms of dynamic material flow. Compared to the equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default scenario of oxide reduction, the batch mass balance was verified in comparison with a one-year equilibrium mass balance. This study is still in progress towards a mid- and long-term goal, the development of a multi-purpose pyroprocessing simulator that is able to cope with safeguards assessment, economic feasibility, technical evaluation, conceptual design, and support of licensing for a future pyroprocessing facility.

  14. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Shi-hua Zhan

    2016-01-01

    Full Text Available Simulated annealing (SA) is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
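
The list-based cooling schedule described in the abstract can be sketched compactly: the current maximum of a temperature list drives the Metropolis test, and each accepted uphill move replaces that maximum with the (smaller) temperature it implies. This is a condensed illustration on a tiny random instance, not the authors' reference implementation:

```python
import math
import random

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(12)]  # random cities

def tour_len(tour):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour):
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j][::-1] + tour[j:]   # reverse a random segment

def lbsa(iters=2000, list_len=20, p0=0.5):
    tour = list(range(len(pts)))
    # seed the temperature list from the cost of random 2-opt moves
    temps = [abs(tour_len(two_opt(tour)) - tour_len(tour)) / -math.log(p0) + 1e-9
             for _ in range(list_len)]
    best = tour[:]
    for _ in range(iters):
        tmax = max(temps)                          # current temperature
        cand = two_opt(tour)
        d = tour_len(cand) - tour_len(tour)
        r = max(random.random(), 1e-12)
        if d <= 0 or r < math.exp(-d / tmax):      # Metropolis criterion
            if d > 0:                              # adapt the list: swap tmax for
                temps.remove(tmax)                 # the temperature implied by this
                temps.append(-d / math.log(r))     # accepted uphill move (< tmax)
            tour = cand
            if tour_len(tour) < tour_len(best):
                best = tour[:]
    return best

best = lbsa()
```

Because accepted uphill moves always insert a temperature below the removed maximum, the list cools itself without any user-tuned cooling rate, which is the point of the method.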

  15. Simulation-based planning for theater air warfare

    Science.gov (United States)

    Popken, Douglas A.; Cox, Louis A., Jr.

    2004-08-01

    Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.
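
The top-level idea of wrapping a stochastic search around a noisy attrition simulation can be illustrated with a toy hill-climb over role allocations. The role names, payoff function, and all numbers below are invented stand-ins, not the paper's models:

```python
import random

random.seed(3)
ROLES = ["air_defense", "counter_air", "close_air_support", "suppression"]

def simulate_payoff(alloc):
    """Noisy stand-in for one attrition-simulation run of an allocation."""
    value = {"air_defense": 1.0, "counter_air": 1.5,
             "close_air_support": 1.2, "suppression": 0.8}
    base = sum(value[r] * n ** 0.5 for r, n in alloc.items())  # diminishing returns
    return base + random.gauss(0.0, 0.05)

def hill_climb(total=40, steps=300, samples=8):
    alloc = {r: total // len(ROLES) for r in ROLES}
    def score(a):  # average several noisy runs to damp simulation noise
        return sum(simulate_payoff(a) for _ in range(samples)) / samples
    best_score = score(alloc)
    for _ in range(steps):
        src, dst = random.sample(ROLES, 2)   # move one airframe between roles
        if alloc[src] == 0:
            continue
        cand = dict(alloc)
        cand[src] -= 1
        cand[dst] += 1
        s = score(cand)
        if s > best_score:                   # keep only improving moves
            alloc, best_score = cand, s
    return alloc

alloc = hill_climb()
```

Averaging several simulation replications per candidate is the essential trick when the objective itself is stochastic; without it the climb chases noise.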

  16. GOCE gravity field simulation based on actual mission scenario

    Science.gov (United States)

    Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.

    2009-04-01

    In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the processing of a spherical harmonic Earth gravity field model and the corresponding full variance-covariance matrix from the precise GOCE orbit and from calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach, a combination of several processing strategies has been set up for optimum exploitation of the information content of the GOCE data: the Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data, while in the Core Solver processing a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before real GOCE data become available, the expected GOCE gravity field performance is evaluated by means of a realistic numerical case study based on the actual GOCE orbit and mission scenario and on simulation data stemming from the most recent ESA end-to-end simulation. Results from this simulation as well as recently developed features of the software system are presented. Additionally, some aspects of data combination with complementary data sources are addressed.

  17. PPLN-waveguide-based polarization entangled QKD simulator

    Science.gov (United States)

    Gariano, John; Djordjevic, Ivan B.

    2017-08-01

    We have developed a comprehensive simulator to study the polarization entangled quantum key distribution (QKD) system, which takes various imperfections into account. We assume that a type-II SPDC source using a PPLN-based nonlinear optical waveguide is used to generate entangled photon pairs and implements the BB84 protocol, using two mutually unbiased bases with two orthogonal polarizations in each basis. The entangled photon pairs are then simulated to be transmitted to both parties, Alice and Bob, through the optical channel, imperfect optical elements and onto imperfect detectors. It is assumed that Eve has no control over the detectors, and can only gain information from the public channel and the intercept-resend attack. The secure key rate (SKR) is calculated using an upper bound and by using actual code rates of LDPC codes implementable in FPGA hardware. After verifying the simulation results for the ideal scenario, such as the pair generation rate and the number of errors due to multiple pairs, against those available in the literature, we introduce various imperfections. The results are then compared to previously reported experimental results where a BBO nonlinear crystal is used, and the improvements in SKR are determined for the case when a PPLN waveguide is used instead.

  18. Simulations of Micropumps Based on Tilted Flexible Fibers

    Science.gov (United States)

    Hancock, Matthew; Elabbasi, Nagi; Demirel, Melik

    2015-11-01

    Pumping liquids at low Reynolds numbers is challenging because of the principle of reversibility. We report here a class of microfluidic pump designs based on tilted flexible structures that combines the concepts of cilia (flexible elastic elements) and rectifiers (e.g., Tesla valves, check valves). We demonstrate proof-of-concept with 2D and 3D fluid-structure interaction (FSI) simulations in COMSOL Multiphysics® of micropumps consisting of a source for oscillatory fluidic motion, e.g. a piston, and a channel lined with tilted flexible rods or sheets to provide rectification. When flow is against the rod tilt direction, the rods bend backward, narrowing the channel and increasing flow resistance; when flow is in the direction of rod tilt, the rods bend forward, widening the channel and decreasing flow resistance. The 2D and 3D simulations involve moving meshes whose quality is maintained by prescribing the mesh displacement on guide surfaces positioned on either side of each flexible structure. The prescribed displacement depends on structure bending and maintains mesh quality even for large deformations. Simulations demonstrate effective pumping even at Reynolds numbers as low as 0.001. Because rod rigidity may be specified independently of Reynolds number, in principle, rod rigidity may be reduced to enable pumping at arbitrarily low Reynolds numbers.

  19. High viscosity fluid simulation using particle-based method

    KAUST Repository

    Chang, Yuanzhang

    2011-03-01

    We present a new particle-based method for high viscosity fluid simulation. In the method, a new elastic stress term, which is derived from a modified form of Hooke's law, is included in the traditional Navier-Stokes equation to simulate the movements of high viscosity fluids. Benefiting from the Lagrangian nature of the Smoothed Particle Hydrodynamics method, large flow deformation can be handled easily and naturally. In addition, in order to eliminate the particle deficiency problem near the boundary, ghost particles are employed to enforce the solid boundary condition. Compared with finite element methods with complicated and time-consuming remeshing operations, our method is much more straightforward to implement. Moreover, our method doesn't need to store and compare to an initial rest state. The experimental results show that the proposed method is effective and efficient in handling the movements of highly viscous flows, and a large variety of different kinds of fluid behaviors can be well simulated by adjusting just one parameter. © 2011 IEEE.
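
The basic SPH building block underlying methods like this is a kernel-weighted sum over neighbouring particles, for example the density estimate with the standard 2D cubic-spline kernel. This is a generic SPH sketch, not the paper's exact discretization:

```python
import math

def cubic_spline_w(r, h):
    """Standard 2D cubic-spline SPH kernel with smoothing length h."""
    q = r / h
    sigma = 10.0 / (7.0 * math.pi * h * h)   # 2D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0                                # compact support: no effect beyond 2h

def density(i, positions, mass, h):
    """SPH density at particle i: mass-weighted kernel sum over all particles."""
    xi, yi = positions[i]
    return sum(mass * cubic_spline_w(math.hypot(xi - x, yi - y), h)
               for x, y in positions)

# particles on a small regular grid; index 12 is the central particle
h = 0.12
pos = [(0.1 * a, 0.1 * b) for a in range(5) for b in range(5)]
rho_center = density(12, pos, mass=1.0, h=h)
```

The particle deficiency the abstract mentions is visible even here: a corner particle has fewer neighbours inside the kernel support than the central one, so its summed density comes out lower, which is exactly what ghost particles at boundaries compensate for.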

  20. Emergency Evacuation of Hazardous Chemical Accidents Based on Diffusion Simulation

    Directory of Open Access Journals (Sweden)

    Jiang-Hua Zhang

    2017-01-01

    Full Text Available The recent rapid development of information technology, such as sensing technology, communications technology, and database, allows us to use simulation experiments for analyzing serious accidents caused by hazardous chemicals. Due to the toxicity and diffusion of hazardous chemicals, these accidents often lead to not only severe consequences and economic losses, but also traffic jams at the same time. Emergency evacuation after hazardous chemical accidents is an effective means to reduce the loss of life and property and to smoothly resume the transport network as soon as possible. This paper considers the dynamic changes of the hazardous chemicals’ concentration after their leakage and simulates the diffusion process. Based on the characteristics of emergency evacuation of hazardous chemical accidents, we build a mixed-integer programming model and design a heuristic algorithm using network optimization and diffusion simulation (hereafter NODS. We then verify the validity and feasibility of the algorithm using Jinan, China, as a computational example. In the end, we compare the results from different scenarios to explore the key factors affecting the effectiveness of the evacuation process.
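
A simple Gaussian puff formula is one common way to produce the kind of time-varying concentration field that an evacuation model consumes. The release quantity, dispersion coefficients, and wind speed below are illustrative assumptions, not values from the paper:

```python
import math

def puff_concentration(Q, x, y, t, u=3.0, sx=30.0, sy=30.0, sz=15.0, H=0.0):
    """Ground-level (z=0) concentration from an instantaneous puff released
    at the origin, advected downwind at speed u with Gaussian spreading."""
    coef = Q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    ex = math.exp(-((x - u * t) ** 2) / (2.0 * sx * sx))   # downwind spread
    ey = math.exp(-(y * y) / (2.0 * sy * sy))              # crosswind spread
    ez = 2.0 * math.exp(-(H * H) / (2.0 * sz * sz))        # ground reflection
    return coef * ex * ey * ez

# after 100 s the puff centre has advected to x = u*t = 300 m
c_center = puff_concentration(Q=1000.0, x=300.0, y=0.0, t=100.0)
c_off = puff_concentration(Q=1000.0, x=300.0, y=60.0, t=100.0)  # 60 m crosswind
```

Evaluating this field on the road network at successive times yields the danger zones that the evacuation routing then has to avoid.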

  1. A SIMULATION OF CONTRACT FARMING USING AGENT BASED MODELING

    Directory of Open Access Journals (Sweden)

    Yuanita Handayati

    2016-12-01

    Full Text Available This study aims to simulate the effects of contract farming and farmer commitment to contract farming on supply chain performance by using agent based modeling as a methodology. Supply chain performance is represented by profits and service levels. The simulation results indicate that farmers should pay attention to customer requirements and plan their agricultural activities in order to fulfill these requirements. Contract farming helps farmers deal with demand and price uncertainties. We also find that farmer commitment is crucial to fulfilling contract requirements. This study contributes to this field from a conceptual as well as a practical point of view. From the conceptual point of view, our simulation results show that different levels of farmer commitment have an impact on farmer performance when implementing contract farming. From a practical point of view, the uncertainty faced by farmers and the market can be managed by implementing cultivation and harvesting scheduling, information sharing, and collective learning as ways of committing to contract farming.

  2. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  3. Fuzzy-based simulation of real color blindness.

    Science.gov (United States)

    Lee, Jinmi; dos Santos, Wellington P

    2010-01-01

    About 8% of men are affected by color blindness. That population is at a disadvantage since they cannot perceive a substantial amount of visual information. This work presents two computational tools developed to assist color-blind people. The first one tests for color blindness and assesses its severity. The second tool is based on Fuzzy Logic and implements a proposed method for simulating real red and green color blindness, in order to generate synthetic cases of color vision disturbance in a statistically significant amount. Our purpose is to develop correction tools and obtain a deeper understanding of the accessibility problems faced by people with chromatic visual impairment.
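The paper's fuzzy-logic method is not reproduced here; as a deliberately crude illustration of what "simulating red-green color blindness" means, the toy below collapses colors that differ only in their red-green balance (real simulators work in LMS cone space, not directly on RGB):

```python
import numpy as np

def simulate_red_green_loss(rgb):
    """Toy red-green confusion: replace R and G by their mean, so colours
    differing only in R-vs-G collapse to the same output colour.
    Illustrative only; not the fuzzy model from the paper."""
    rgb = np.asarray(rgb, dtype=float)
    mix = 0.5 * rgb[..., 0] + 0.5 * rgb[..., 1]
    out = rgb.copy()
    out[..., 0] = mix
    out[..., 1] = mix
    return out

red = simulate_red_green_loss([1.0, 0.0, 0.0])
green = simulate_red_green_loss([0.0, 1.0, 0.0])
print(red, green)   # both collapse to [0.5, 0.5, 0.0]
```

A fuzzy variant, as in the paper, would blend between the original and the collapsed color according to a membership function encoding the severity of the deficiency.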

  4. Bayou Choctaw Well Integrity Grading Component Based on Geomechanical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Geotechnology & Engineering Dept.

    2016-09-08

    This letter report provides a Bayou Choctaw (BC) Strategic Petroleum Reserve (SPR) well grading system based on geomechanical simulation. The analyses described in this letter were used to evaluate the caverns' geomechanical effect on wellbore integrity, which is an important component in the well integrity grading system recently developed by Roberts et al. [2015]. Based on these analyses, the wellbores for caverns BC-17 and BC-20 are expected to be significantly impacted by cavern geomechanics, BC-18 and BC-19 to be moderately impacted, and the other caverns to be less impacted.

  5. Agent-based simulation of electricity markets. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Sensfuss, F.; Ragwitz, M. [Fraunhofer-Institut fuer Systemtechnik und Innovationsforschung (ISI), Karlsruhe (Germany); Genoese, M.; Moest, D. [Karlsruhe Univ. (T.H.) (Germany). Inst. fuer Industriebetriebslehre und Industrielle Produktion

    2007-07-01

    Liberalisation, climate policy and promotion of renewable energy are challenges to players of the electricity sector in many countries. Policy makers have to consider issues like market power, bounded rationality of players and the appearance of fluctuating energy sources in order to provide adequate legislation. Furthermore the interactions between markets and environmental policy instruments become an issue of increasing importance. A promising approach for the scientific analysis of these developments is the field of agent-based simulation. The goal of this article is to provide an overview of the current work applying this methodology to the analysis of electricity markets. (orig.)

  6. Preview-based sampling for controlling gaseous simulations

    KAUST Repository

    Huang, Ruoguan; Melek, Zeki; Keyser, John

    2011-01-01

    to maintain. During the high resolution simulation, a matching process ensures that the properties sampled from the low resolution simulation are maintained. This matching process keeps the different resolution simulations aligned even for complex systems

  7. Simulation based mask defect repair verification and disposition

    Science.gov (United States)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired and how to verify them after repair. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software extracts edges from the mask SEM images and converts them into polygons saved in GDSII format. The converted polygons from the SEM images were then filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is performed for the purpose of contour simulation, since the SEM images normally cover only a small area (~1 μm) and accurate simulation requires including a larger area to capture optical proximity effects. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple

  8. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    Full Text Available One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI’s ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  9. A particle based simulation model for glacier dynamics

    Directory of Open Access Journals (Sweden)

    J. A. Åström

    2013-10-01

    Full Text Available A particle-based computer simulation model was developed for investigating the dynamics of glaciers. In the model, large ice bodies are made of discrete elastic particles bound together by massless elastic beams. These beams can break, which induces brittle behaviour. At loads below fracture, beams may also break and reform with small probabilities, to incorporate slowly deforming viscous behaviour into the model. This model has the advantage that it can simulate important physical processes such as ice calving and fracturing more realistically than traditional continuum models. For benchmarking purposes, the deformation of an ice block on a slip-free surface was compared to that of a similar block simulated with a Finite Element full-Stokes continuum model. Two simulations were performed: (1) calving of an ice block partially supported in water, similar to a grounded marine glacier terminus, and (2) fracturing of an ice block on an inclined plane of varying basal friction, which could represent a transition to fast flow or surging. Despite several approximations, including restriction to two dimensions and a simplified water-ice interaction, the model was able to reproduce the size distributions of the debris observed in calving, which may be approximated by universal scaling laws. On a moderate slope, a large ice block was stable and quiescent as long as there was enough friction against the substrate. At a critical length of frictional contact, global sliding began, and the model block disintegrated in a manner suggestive of a surging glacier. In this case the fragment size distribution produced was typical of a grinding process.

  10. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  11. Internet-based system for simulation-based medical planning for cardiovascular disease.

    Science.gov (United States)

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  12. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    Science.gov (United States)

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  13. The effects of a concept map-based support tool on simulation-based inquiry learning

    NARCIS (Netherlands)

    Hagemans, M.G.; van der Meij, Hans; de Jong, Anthonius J.M.

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations,

  14. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    Science.gov (United States)

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  15. Mock ECHO: A Simulation-Based Medical Education Method.

    Science.gov (United States)

    Fowler, Rebecca C; Katzman, Joanna G; Comerci, George D; Shelley, Brian M; Duhigg, Daniel; Olivas, Cynthia; Arnold, Thomas; Kalishman, Summers; Monnette, Rebecca; Arora, Sanjeev

    2018-04-16

    This study was designed to develop a deeper understanding of the learning and social processes that take place during the simulation-based medical education for practicing providers as part of the Project ECHO® model, known as Mock ECHO training. The ECHO model is utilized to expand access to care of common and complex diseases by supporting the education of primary care providers with an interprofessional team of specialists via videoconferencing networks. Mock ECHO trainings are conducted through a train the trainer model targeted at leaders replicating the ECHO model at their organizations. Trainers conduct simulated teleECHO clinics while participants gain skills to improve communication and self-efficacy. Three focus groups, conducted between May 2015 and January 2016 with a total of 26 participants, were deductively analyzed to identify common themes related to simulation-based medical education and interdisciplinary education. Principal themes generated from the analysis included (a) the role of empathy in community development, (b) the value of training tools as guides for learning, (c) Mock ECHO design components to optimize learning, (d) the role of interdisciplinary education to build community and improve care delivery, (e) improving care integration through collaboration, and (f) development of soft skills to facilitate learning. Mock ECHO trainings offer clinicians the freedom to learn in a noncritical environment while emphasizing real-time multidirectional feedback and encouraging knowledge and skill transfer. The success of the ECHO model depends on training interprofessional healthcare providers in behaviors needed to lead a teleECHO clinic and to collaborate in the educational process. While building a community of practice, Mock ECHO provides a safe opportunity for a diverse group of clinician experts to practice learned skills and receive feedback from coparticipants and facilitators.

  16. Simulation research of acousto optic modulator drive based on Multisim

    Science.gov (United States)

    Wang, Shiqian; Guo, Yangkuan; Zhu, Lianqing; Na, Yunxiao; Zhang, Yinmin; Liu, Qianzhe

    2013-10-01

    The acousto-optic modulator drive mainly consists of a 2-amplitude-shift-keying (2ASK) circuit, a pre-amplifier circuit and a power operational amplifier circuit, and a simulation of the drive has been realized. The drive works as follows. The modulation function is realized by an analog switch circuit, and the on-off state of the analog switch chip (CD4066) is controlled by a pulse signal generated by an electronic conversion circuit. The voltage amplification of the modulated signal is achieved by a two-stage inverting proportional amplifier circuit built mainly around the AD8001 chip. The amplified signal is then fed into a two-stage class-C power operational amplifier circuit built mainly around the MRF158 chip. Both the circuit structure and the joint debugging of the designed system were simulated in Multisim. A modulated signal with 150 MHz frequency and 5 μs pulse width demonstrates that 2ASK modulation of the 150 MHz carrier signal by the 20 kHz modulating signal is achieved. Moreover, as the frequency of the input signal and the amplitude of the voltage change, the output power of the power operational amplifier circuit also changes; the output power increases as the input signal frequency decreases and the voltage amplitude increases. The component selection for the drive's PCB design, the performance parameters of the actual circuit and the debugging of the actual circuit are all based on the simulation results.
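The 2ASK stage described above is just on-off keying of the carrier by the data stream; a minimal sketch using the abstract's 150 MHz carrier and 20 kHz keying rate (the sampling rate and bit pattern are assumptions for illustration):

```python
import numpy as np

def ask2(bits, fc, fs, bit_rate):
    """2ASK / on-off keying: the carrier amplitude is switched by the bits."""
    spb = int(fs // bit_rate)                       # samples per bit
    data = np.repeat(np.asarray(bits, dtype=float), spb)
    t = np.arange(data.size) / fs
    return data * np.cos(2 * np.pi * fc * t)        # bit 0 -> carrier off

# 150 MHz carrier keyed at a 20 kHz rate, as in the abstract
sig = ask2([1, 0, 1, 1], fc=150e6, fs=1.5e9, bit_rate=20e3)
print(sig.size)   # 300000 samples
```

In the hardware this switching is what the CD4066 analog switch performs on the RF path before amplification.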

  17. Simulation and characterization of silicon nanopillar-based nanoparticle sensors

    Science.gov (United States)

    Wasisto, Hutomo Suryo; Merzsch, Stephan; Huang, Kai; Stranz, Andrej; Waag, Andreas; Peiner, Erwin

    2013-05-01

    Nanopillar-based structures hold promise as highly sensitive resonant mass sensors for a new generation of aerosol nanoparticle (NP) detecting devices because of their very small masses. In this work, the possible use of a silicon nanopillar (SiNPL) array as a nanoparticle sensor is investigated. The sensor structures are created and simulated using a finite element modeling (FEM) tool, COMSOL Multiphysics 4.3, to study the resonant characteristics and the sensitivity of the SiNPL for femtogram NP mass detection. Instead of using 2D plate models or simple single 3D cylindrical pillar models, FEM is performed on SiNPLs as 3D structures based on the real geometry of experimental SiNPL arrays, employing a piezoelectric stack for resonant excitation. In order to achieve an optimal structure and investigate the etching effect on the fabricated resonators, SiNPLs with different mesh designs, sidewall profiles, lengths, and diameters are simulated and analyzed. To validate the FEM results, fabricated SiNPLs with a high aspect ratio of ~60 are employed and characterized in resonant frequency measurements. The SiNPLs are mounted onto a piezoactuator inside a scanning electron microscope (SEM) chamber, which excites them into lateral vibration. The measured resonant frequencies of the SiNPLs, with diameters of about 650 nm and heights of about 40 μm, range from 434.63 kHz to 458.21 kHz, in good agreement with the FEM simulations. Furthermore, the deflection of a SiNPL can be enhanced by increasing the applied piezoactuator voltage. Depositing different NPs (i.e., carbon, TiO2, SiO2, Ag, and Au NPs) on the SiNPLs clearly lowers the resonant frequency, confirming their potential for use as airborne NP mass sensors with femtogram-level resolution.
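As a sanity check on the reported frequencies, the first flexural mode of a clamped-free cylindrical beam can be estimated with the Euler-Bernoulli formula (silicon's Young's modulus is direction-dependent; the 169 GPa used here is an assumed value, so only the order of magnitude is meaningful):

```python
import math

def cantilever_f1(E, rho, d, L):
    """First flexural resonance of a clamped-free cylindrical beam
    (Euler-Bernoulli): f1 = (lambda1^2 / (2*pi)) * sqrt(E*I/(rho*A)) / L^2."""
    lam1 = 1.8751                  # first-mode eigenvalue of a clamped-free beam
    I = math.pi * d**4 / 64.0      # second moment of a circular cross-section
    A = math.pi * d**2 / 4.0
    return (lam1**2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A)) / L**2

# silicon pillar close to the one in the abstract (d ~ 650 nm, L ~ 40 um)
f1 = cantilever_f1(E=169e9, rho=2330.0, d=650e-9, L=40e-6)
print(f1 / 1e3)   # a few hundred kHz, same order as the measured 434-458 kHz
```

The 1/L² dependence also shows why etching variations in pillar height spread the measured frequencies across the array.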

  18. Biologically based modelling and simulation of carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Ouchi, Noriyuki B.

    2003-01-01

    The process of carcinogenesis is studied by computer simulation. In general, a large number of experimental samples is needed to detect mutations at low doses, but in practice it is difficult to obtain so much data. To meet the requirements of the low-dose situation, it is useful to study the process of carcinogenesis with a biologically based mathematical model. We have mainly studied it using the so-called 'multi-stage model'; however, this model becomes complicated as recent findings from molecular biology experiments are incorporated. Moreover, since the basic idea of the multi-stage model rests on the epidemiological log-log variation of cancer incidence with age, it is difficult to compare with experimental data from irradiated cell culture systems, which have been accumulating in recent years. Taking the above into consideration, we concluded that a new model should have the following features: 1) the unit of the target system is a cell, 2) new information from molecular biology can be easily introduced, 3) spatial coordinates are available for checking colony formation or tumorigenesis. In this presentation, we show the details of the model and some simulation results on carcinogenesis. (author)
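The log-log behaviour the abstract refers to is the classical Armitage-Doll multi-stage result, in which incidence grows as a power of age; a short check that the log-log slope recovers the number of stages minus one (the prefactor c below is arbitrary):

```python
import math

def armitage_doll_incidence(t, k, c=1e-12):
    """Classical multi-stage incidence: I(t) = c * t**(k-1).
    On a log-log plot the slope is k-1, i.e. the number of stages minus one."""
    return c * t**(k - 1)

# slope of log I vs log t between ages 40 and 80 for a 6-stage model
t1, t2 = 40.0, 80.0
slope = (math.log(armitage_doll_incidence(t2, 6)) -
         math.log(armitage_doll_incidence(t1, 6))) / (math.log(t2) - math.log(t1))
print(slope)   # 5.0
```

A cell-level model of the kind the abstract proposes replaces this aggregate power law with explicit per-cell mutation events, which is what allows comparison with irradiated cell-culture data.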

  19. Simulation Based Studies in Software Engineering: A Matter of Validity

    Directory of Open Access Journals (Sweden)

    Breno Bernard Nicolau de França

    2015-04-01

    Full Text Available Despite a possible lack of validity when compared with other science areas, Simulation-Based Studies (SBS) in Software Engineering (SE) have supported the achievement of some results in the field. However, as with any other sort of experimental study, it is important to identify and deal with threats to validity, aiming to increase their strength and reinforce confidence in the results. OBJECTIVE: To identify potential threats to SBS validity in SE and suggest ways to mitigate them. METHOD: To apply qualitative analysis to a dataset resulting from the aggregation of data from a quasi-systematic literature review combined with ad hoc surveyed information from other science areas. RESULTS: The analysis of data extracted from 15 technical papers allowed the identification and classification of 28 different threats to validity concerning SBS in SE, according to Cook and Campbell's categories. Besides, 12 verification and validation procedures applicable to SBS were also analyzed and organized according to their ability to detect these threats to validity. These results were used to make available an improved set of guidelines for the planning and reporting of SBS in SE. CONCLUSIONS: Simulation-based studies add different threats to validity when compared with traditional studies. They are not well observed, and therefore it is not easy to identify and mitigate all of them without explicit guidance, such as that depicted in this paper.

  20. Analyst-to-Analyst Variability in Simulation-Based Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  1. Urban flood simulation based on the SWMM model

    Directory of Open Access Journals (Sweden)

    L. Jiang

    2015-05-01

    Full Text Available China is the nation with the fastest urbanization of the past decades, which has caused serious urban flooding. Flood forecasting is regarded as one of the important flood mitigation methods and is widely used for catchment floods, but not yet widely for urban flooding. This paper, employing the SWMM model, one of the widely used urban flood planning and management models, simulates the urban flooding of Dongguan City in rapidly urbanized southern China. SWMM is first set up based on the DEM, digital map and underground pipeline network; parameters are then derived from the properties of the subcatchments and the storm sewer conduits, and a parameter sensitivity analysis shows the parameters are robust. The simulated results show that with the 1-year return period precipitation the studied area will have no flooding, but for the 2-, 5-, 10- and 20-year return period precipitation the studied area will be inundated. The results show the SWMM model is promising for urban flood forecasting, but as it has no surface runoff routing, urban flooding cannot be forecast precisely.
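SWMM's dynamic routing is not reproduced here; as a back-of-the-envelope contrast, the rational method gives the peak runoff a paved subcatchment produces for a given storm (the coefficient, intensity, and area below are made-up values for illustration):

```python
def rational_peak_flow(C, i_mm_per_h, A_ha):
    """Rational method peak runoff Q = C*i*A, converted to m^3/s.
    C: runoff coefficient, i: rainfall intensity (mm/h), A: area (ha)."""
    i_m_per_s = i_mm_per_h / 1000.0 / 3600.0   # mm/h -> m/s
    A_m2 = A_ha * 1e4                          # ha -> m^2
    return C * i_m_per_s * A_m2

# paved urban subcatchment: a high runoff coefficient concentrates the storm
q = rational_peak_flow(C=0.85, i_mm_per_h=50.0, A_ha=12.0)
print(q)   # ~1.42 m^3/s
```

Comparing such a peak flow with the capacity of the storm sewer conduits is the simplest way to anticipate which return-period storms will cause inundation.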

  2. Simulation of SOFCs based power generation system using Aspen

    Directory of Open Access Journals (Sweden)

    Pianko-Oprych Paulina

    2017-12-01

    Full Text Available This study presents a thermodynamic Aspen simulation model for a Solid Oxide Fuel Cell (SOFC) based power generation system. In the first step, a steady-state SOFC system model was developed. The model includes the electrochemistry and the diffusion phenomena, and the electrochemical model gives good agreement with experimental data over a wide operating range. A parametric study was then conducted to estimate the effects of the oxygen-to-carbon ratio, O/C, on reformer temperature, fuel cell temperature, fuel utilization, and overall fuel cell performance, and the results are discussed in this paper. In the second step, a dynamic analysis of the SOFC characteristics was developed. The aim of the dynamic modelling was to find the response of the system to variations in fuel utilization and the O/C ratio. From the simulations, it was concluded that both developed models, steady-state and dynamic, are reasonably accurate and can be used for system-level optimization studies of the SOFC based power generation system.
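The study's Aspen model is not available here, but the open-circuit side of any SOFC electrochemical model starts from the Nernst equation; a sketch showing why higher fuel utilization lowers the cell voltage (the standard potential E0 used below is a rough assumed value; in practice it varies with temperature):

```python
import math

R_GAS, FARADAY = 8.314, 96485.0

def nernst_voltage(T, p_h2, p_o2, p_h2o, E0=0.98):
    """Nernst potential (V) of H2 + 1/2 O2 -> H2O (2 electrons transferred);
    partial pressures in atm, T in K. E0 is an assumed standard potential."""
    return E0 + (R_GAS * T) / (2 * FARADAY) * math.log(p_h2 * math.sqrt(p_o2) / p_h2o)

# higher fuel utilization -> less H2, more H2O at the anode -> lower voltage
v_low_uf = nernst_voltage(T=1073.0, p_h2=0.9, p_o2=0.21, p_h2o=0.1)
v_high_uf = nernst_voltage(T=1073.0, p_h2=0.3, p_o2=0.21, p_h2o=0.7)
print(v_low_uf, v_high_uf)
```

A full model subtracts activation, ohmic, and concentration losses from this potential to get the operating voltage the abstract validates against experiment.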

  3. Enhancing food engineering education with interactive web-based simulations

    Directory of Open Access Journals (Sweden)

    Alexandros Koulouris

    2015-04-01

    Full Text Available In the traditional deductive approach to teaching any engineering topic, teachers first expose students to the derivation of the equations that govern the behavior of a physical system and then demonstrate the use of these equations through a limited number of textbook examples. This methodology, however, is rarely adequate to unmask the cause-effect and quantitative relationships between the system variables that the equations embody. Web-based simulation, the integration of simulation and internet technologies, has the potential to enhance the learning experience by offering an interactive and easily accessible platform for quick and effortless experimentation with physical phenomena. This paper presents the design and development of a web-based platform for teaching basic food engineering phenomena to food technology students. The platform contains a variety of modules (“virtual experiments”) covering the topics of mass and energy balances, fluid mechanics and heat transfer; here, the design and development of three modules for mass balances and heat transfer are presented. Each webpage representing an educational module has the following features: visualization of the studied phenomenon through graphs, charts or videos, computation through a mathematical model, and experimentation. The student is allowed to edit key parameters of the phenomenon and observe the effect of these changes on the outputs. Experimentation can be done in a free or guided fashion with a set of prefabricated examples that students can run, self-testing their knowledge by answering multiple-choice questions.
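The computation behind a mass-balance "virtual experiment" reduces to a total balance plus a component balance over the unit; a minimal sketch for a mixer (the stream flows and compositions are made-up values, not from the paper's modules):

```python
def mix_streams(streams):
    """Steady-state mass balance over a mixer: the total balance gives the
    outlet flow, the component balance gives the outlet composition.
    streams: list of (mass_flow in kg/h, mass fraction of the component)."""
    total = sum(m for m, _ in streams)
    component = sum(m * x for m, x in streams)
    return total, component / total

# blend 40 kg/h of 10% sugar syrup with 60 kg/h of 50% syrup
out_flow, out_frac = mix_streams([(40.0, 0.10), (60.0, 0.50)])
print(out_flow, out_frac)   # 100.0 kg/h at 34% sugar
```

In the web module the student would edit the stream flows or fractions and immediately see the outlet composition update, which is exactly the cause-effect link the deductive lecture approach tends to hide.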

  4. Advancing Simulation-Based Education in Pain Medicine.

    Science.gov (United States)

    Singh, Naileshni; Nielsen, Alison A; Copenhaver, David J; Sheth, Samir J; Li, Chin-Shang; Fishman, Scott M

    2018-02-27

    The Accreditation Council for Graduate Medical Education (ACGME) has recently implemented milestones and competencies as a framework for training fellows in Pain Medicine, but individual programs are left to create educational platforms and assessment tools that meet ACGME standards. In this article, we discuss the concept of milestone-based competencies and the inherent challenges for implementation in pain medicine. We consider simulation-based education (SBE) as a potential tool for the field to meet ACGME goals through advancing novel learning opportunities, engaging in clinically relevant scenarios, and mastering technical and nontechnical skills. The sparse literature on SBE in pain medicine is highlighted, and we describe our pilot experience, which exemplifies a nascent effort that encountered early difficulties in implementing and refining an SBE program. The many complexities in offering a sophisticated simulated pain curriculum that is valid, reliable, feasible, and acceptable to learners and teachers may only be overcome with coordinated and collaborative efforts among pain medicine training programs and governing institutions.

  5. Memoryless cooperative graph search based on the simulated annealing algorithm

    International Nuclear Information System (INIS)

    Hou Jian; Yan Gang-Feng; Fan Zhen

    2011-01-01

    We study the problem of reaching a globally optimal segment in a graph-like environment with a single autonomous mobile agent or a group of such agents. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. It is shown that under both proposed control strategies the agent eventually converges to a globally optimal segment with probability 1. Secondly, we use multi-agent search to simultaneously reduce the computational complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partition, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment. (interdisciplinary physics and related areas of science and technology)
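
    A minimal, self-contained sketch of a simulated-annealing-like graph search of the kind described above (the toy ring graph, cost values and cooling schedule are illustrative assumptions, not the authors' algorithm):

    ```python
    import math
    import random

    def sa_search(values, neighbors, start, t0=1.0, cooling=0.995, steps=5000, seed=0):
        """Simulated-annealing walk on a graph: values[v] is the cost of
        vertex/segment v, neighbors[v] its adjacent vertices."""
        rng = random.Random(seed)
        v, t = start, t0
        best = v
        for _ in range(steps):
            w = rng.choice(neighbors[v])
            delta = values[w] - values[v]
            # accept improving moves; accept worse ones with Boltzmann probability
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                v = w
            if values[v] < values[best]:
                best = v
            t *= cooling  # geometric cooling schedule
        return best

    # toy ring graph with a unique global minimum at vertex 7
    values = [5, 4, 6, 3, 8, 2, 9, 1, 7, 6]
    neighbors = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
    print(sa_search(values, neighbors, start=0))  # 7
    ```

    Early on, the high temperature lets the agent escape local minima such as vertex 3; as the temperature decays, the walk settles on the global optimum.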

  6. Simulation and event reconstruction inside the PandaRoot framework

    International Nuclear Information System (INIS)

    Spataro, S

    2008-01-01

    The PANDA detector will be located at the future GSI accelerator FAIR. Its primary objective is the investigation of the strong interaction with antiproton beams, with incoming antiproton momenta of up to 15 GeV/c. The PANDA offline simulation framework, called 'PandaRoot', is based upon the ROOT 5.14 package. It is highly versatile: it allows simulation and analysis to be performed, and different event generators (EvtGen, Pluto, UrQMD) and different transport models (Geant3, Geant4, FLUKA) to be run with the same code, so results can be compared simply by changing a few macro lines without recompiling. Moreover, auto-configuration scripts allow the full framework to be installed easily on different Linux distributions and with different compilers (the framework has been installed and tested on more than 10 Linux platforms) without further manipulation. The final data are in a tree format, easily accessible and readable through simple clicks in the ROOT browser. The presentation reports on the current status of the computing development inside the PandaRoot framework, in terms of detector implementation and event reconstruction.

  7. Monte Carlo-based simulation of dynamic jaws tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S. [Department of Molecular Imaging, Radiotherapy and Oncology, Universite Catholique de Louvain, 54 Avenue Hippocrate, 1200 Brussels, Belgium and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); 21 Century Oncology., 1240 D' onofrio, Madison, Wisconsin 53719 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Department of Radiotherapy and Oncology, Universite Catholique de Louvain, St-Luc University Hospital, 10 Avenue Hippocrate, 1200 Brussels (Belgium)

    2011-09-15

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is
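
    The weight-modifier idea can be illustrated with a toy reweighting sketch: each particle from the reference 5 cm phase space is multiplied by the ratio of the new jaw transmission to the reference transmission at the particle's position. The step-function transmission profile and leakage value below are hypothetical placeholders, not the actual TomoPen model:

    ```python
    def transmission(y, half_width, leakage=0.005):
        """Toy jaw transmission: full inside the field, small leakage outside."""
        return 1.0 if abs(y) <= half_width else leakage

    def weight_modifier(y, ref_half_width=2.5, new_half_width=0.5):
        """Reweight a reference phase-space particle at longitudinal
        position y (cm) to represent a narrower jaw setting."""
        return transmission(y, new_half_width) / transmission(y, ref_half_width)

    # a particle inside the new field keeps weight 1, one outside is suppressed
    print(weight_modifier(0.2), weight_modifier(1.5))  # 1.0 0.005
    ```

    The appeal of the approach is that a single reference phase-space file serves arbitrary jaw settings, at the cost of computing the modifiers (FastStatic versus pure MC in the paper).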

  8. Monte Carlo-based simulation of dynamic jaws tomotherapy

    International Nuclear Information System (INIS)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S.

    2011-01-01

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis

  9. Live defibrillation in simulation-based medical education--a survey of simulation center practices and attitudes.

    Science.gov (United States)

    Turban, Joseph W; Peters, Deborah P; Berg, Benjamin W

    2010-02-01

    Resuscitation from cardiac arrhythmia, requiring cardioversion/defibrillation is a common simulation training scenario. Use of live defibrillation enhances simulation fidelity but is not without risk. This survey was conducted to describe the prevalence of live defibrillation use during training scenarios in healthcare simulation centers, and when used, if safety training was required before using live defibrillation. A convenience sample of attendees at the 7th annual International Meeting on Simulation in Healthcare (January 2007) was surveyed using a closed-ended 23-item survey instrument. Survey domains included responder and simulation center demographics, simulation center defibrillation safety policies, and attitudes toward defibrillation practices in simulation training environments. Fifty-seven individuals representing 39 simulation centers returned surveys, 29 of which were in the United States. Live defibrillation was used in 35 of the 39 centers (90%). A defibrillation safety training policy was in effect at 14 of 39 centers (36%). Formal training before using live defibrillation was considered necessary by 48 of 55 responders (87%). Forty-eight of 54 responders (89%) strongly agreed or agreed with the statement, "I feel using live defibrillation plays an important role in simulation-based education." Although most responders consider use of live defibrillation important and believe formal defibrillator safety training should be conducted before use, only about one third of the centers had a training policy in effect. It remains to be determined whether safety training before the use of live defibrillation during simulation-based education increases user safety.

  10. Simulation-based decision support for evaluating operational plans

    Directory of Open Access Journals (Sweden)

    Johan Schubert

    2015-12-01

    Full Text Available In this article, we describe simulation-based decision support techniques for the evaluation of operational plans within effects-based planning. Using a decision support tool, developers of operational plans are able to evaluate thousands of alternative plans against possible courses of events and decide which of these plans are capable of achieving a desired end state. The objective of this study is to examine the potential of a decision support system that helps operational analysts understand the consequences of numerous alternative plans through simulation and evaluation. Operational plans are described in the effects-based approach to operations concept as a set of actions and effects. A plan consists of several actions, each of which may be performed in one of several alternative ways. Together these action alternatives make up all possible plan instances, which are represented as a tree of action alternatives that may be searched for the most effective sequence of alternatives. As a test case, we use an expeditionary operation with a plan of 43 actions, several alternatives for these actions, and a scenario of 40 group actors. Decision support for planners is provided by several methods that analyze the impact of a plan on the 40 actors, e.g., by visualizing time series of plan performance. Detailed decision support for finding the most influential actions of a plan is provided by sensitivity analysis and regression-tree analysis. Finally, a decision maker may use the tool to determine the boundaries of an operation beyond which it must not move without risk of drastic failure. The significant contribution of this study is the presentation of an integrated approach for the evaluation of operational plans.
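
    The search over the tree of action alternatives can be sketched as a product over per-action alternatives, scored and compared exhaustively (the action names, effect scores and scoring rule below are hypothetical illustrations, not the study's model):

    ```python
    from itertools import product

    def best_plan(action_alternatives, score):
        """Exhaustively search the tree of action alternatives for the
        plan instance with the highest score."""
        best_inst, best_val = None, float("-inf")
        for instance in product(*action_alternatives):
            val = score(instance)
            if val > best_val:
                best_inst, best_val = instance, val
        return best_inst, best_val

    # hypothetical 3-action plan with 2-3 alternatives each,
    # scored by the summed effect of the chosen alternatives
    alternatives = [[("airlift", 3), ("convoy", 2)],
                    [("negotiate", 4), ("patrol", 1), ("blockade", 2)],
                    [("rebuild", 5), ("aid", 4)]]
    plan, value = best_plan(alternatives, lambda inst: sum(e for _, e in inst))
    print([a for a, _ in plan], value)  # ['airlift', 'negotiate', 'rebuild'] 12
    ```

    With 43 actions the instance space is far too large for brute force; in practice the scoring comes from simulation against actor scenarios and the tree is searched selectively, but the enumeration above shows the underlying structure.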

  11. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
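
    The coupling loop itself can be sketched with stand-in functions in place of the real NetLogo (via pyNetLogo) and MODFLOW/SEAWAT (via FloPy) calls; the pumping heuristic and all numbers below are hypothetical:

    ```python
    def agent_step(head):
        # stand-in for the NetLogo side: agents pump heavily
        # when the head is high (hypothetical heuristic)
        return 2 if head > 5 else 0

    def hydro_step(head, pumping, recharge=1):
        # stand-in for a MODFLOW time step: simple aquifer mass balance
        return head + recharge - pumping

    def run(head, years):
        """Alternate agent decisions and groundwater updates: the new
        head feeds back into the agents' next decision."""
        for _ in range(years):
            head = hydro_step(head, agent_step(head))
        return head

    print(run(10, 12))  # 6
    ```

    In the real architecture each stand-in is replaced by a call across the coupling interface (e.g. a NetLogo command/report for the agents, a FloPy-driven model run for the hydrology), but the alternating update structure is the same.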

  12. A simulator-based nuclear reactor emergency response training exercise.

    Science.gov (United States)

    Waller, Edward; Bereznai, George; Shaw, John; Chaput, Joseph; Lafortune, Jean-Francois

    Training offsite emergency response personnel in basic awareness of onsite control room operations during nuclear power plant emergency conditions was the primary objective of a week-long workshop conducted on a CANDU® virtual nuclear reactor simulator available at the University of Ontario Institute of Technology, Oshawa, Canada. The workshop was designed to examine both normal and abnormal reactor operating conditions, and to observe the conditions in the control room that may have an impact on the subsequent offsite emergency response. The workshop was attended by participants from a number of countries encompassing diverse job functions related to nuclear emergency response. Objectives of the workshop were to provide opportunities for participants to act in the roles of control room personnel under different reactor operating scenarios, providing a unique experience for participants to interact with the simulator in real time, and providing increased awareness of control room operations during accident conditions. The ability to "pause" the simulator during exercises allowed the instructors to evaluate and critique the performance of participants, and to provide context with respect to potential offsite emergency actions. Feedback from the participants highlighted (i) the advantages of observing and participating "hands-on" in operational exercises, (ii) their general unfamiliarity with control room operational procedures and arrangements prior to the workshop, (iii) awareness of the vast quantity of detailed control room procedures for both normal and transient conditions, and (iv) appreciation of the increased workload for the operators in the control room during a transient from normal operations. Based upon participant feedback, it was determined that the objectives of the training had been met, and that future workshops should be conducted.

  13. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. It is therefore of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion
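
    The per-feature statistics idea can be sketched on a 1-D scalar field: segment the field into connected components above a threshold and compute statistics per feature. (The actual framework pre-computes augmented merge trees so that any threshold can be queried after the fact; the field values below are hypothetical.)

    ```python
    def features_above(field, threshold):
        """Segment a 1-D scalar field into connected components above a
        threshold and return per-feature statistics (size, mean value)."""
        feats, current = [], []
        for v in field:
            if v > threshold:
                current.append(v)
            elif current:
                feats.append(current)
                current = []
        if current:
            feats.append(current)
        return [(len(f), sum(f) / len(f)) for f in feats]

    # toy "temperature" field: three features emerge at threshold 2
    temperature = [1, 4, 5, 1, 0, 6, 7, 6, 1, 3]
    print(features_above(temperature, 2))
    ```

    From such per-feature tuples one can build the global diagnostics mentioned above (histograms, CDFs, time series) without touching the raw simulation data again.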

  14. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework.

    Science.gov (United States)

    Tay, Charison; Khajuria, Ankur; Gupte, Chinmay

    2014-01-01

    Traditional orthopaedic training has followed an apprenticeship model whereby trainees enhance their skills by operating under guidance. However, the introduction of limitations on training hours and shorter training programmes means that alternative training strategies are required. To perform a literature review on simulation training in arthroscopy and to devise a framework that structures the different simulation techniques that could be used in arthroscopic training. A systematic search of Medline, Embase, Google Scholar and the Cochrane databases was performed. Search terms included "virtual reality OR simulator OR simulation" and "arthroscopy OR arthroscopic". Fourteen studies evaluating simulators in knee, shoulder and hip arthroscopy were included. The majority of the studies demonstrated construct and transference validity, but only one showed concurrent validity. More studies are required to assess the potential of simulation as a training and assessment tool, the transference of skills between simulators, and the extent of skills decay from prolonged delays in training. We also devised a "ladder of arthroscopic simulation" that provides a competency-based framework for implementing different simulation strategies. The incorporation of simulation into an orthopaedic curriculum will depend on a coordinated approach between many bodies, but the successful integration of simulators in other areas of surgery supports a possible role for simulation in advancing orthopaedic education. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  15. Agent-based simulation of electricity markets : a literature review

    International Nuclear Information System (INIS)

    Sensfuss, F.; Genoese, M.; Genoese, M.; Most, D.

    2007-01-01

    The electricity sector in Europe and North America is undergoing considerable changes as a result of deregulation, issues related to climate change, and the integration of renewable resources within the electricity grid. This article reviewed agent-based simulation methods of analyzing electricity markets. The paper provided an analysis of research currently being conducted on electricity market designs and examined methods of modelling agent decisions. Methods of coupling long term and short term decisions were also reviewed. Issues related to single and multiple market analysis methods were discussed, as well as different approaches to integrating agent-based models with models of other commodities. The integration of transmission constraints within agent-based models was also discussed, and methods of measuring market efficiency were evaluated. Other topics examined in the paper included approaches to integrating investment decisions, carbon dioxide (CO2) trading, and renewable support schemes. It was concluded that agent-based models serve as a test bed for the electricity sector, and will help to provide insights for future policy decisions. 74 refs., 6 figs
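
    A common building block of such agent-based market models is uniform-price market clearing of the agents' bids. A minimal sketch, with hypothetical bid prices and capacities:

    ```python
    def clear_market(bids, demand):
        """Uniform-price clearing: sort generator bids by price and dispatch
        until demand is met; the last dispatched bid sets the market price.

        bids: list of (price_per_MWh, capacity_MW) tuples.
        Returns (clearing_price, dispatched_MW).
        """
        dispatched, price = 0.0, 0.0
        for p, cap in sorted(bids):
            take = min(cap, demand - dispatched)
            if take <= 0:
                break
            dispatched += take
            price = p  # marginal (price-setting) bid
        return price, dispatched

    bids = [(20.0, 100.0), (35.0, 50.0), (50.0, 80.0)]  # hypothetical agents
    print(clear_market(bids, demand=130.0))  # (35.0, 130.0)
    ```

    In an agent-based simulation this clearing step runs each trading period, and the agents adapt their bids in response to the resulting prices, which is what makes the approach a useful test bed for market design.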

  16. Performance and perspectives of the diamond based Beam Condition Monitor for beam loss monitoring at CMS

    CERN Document Server

    AUTHOR|(CDS)2080862

    2015-01-01

    At CMS, a beam loss monitoring system is operated to protect the silicon detectors from high particle rates arising from intense beam loss events. As detectors, poly-crystalline CVD diamond sensors are placed around the beam pipe at several locations inside CMS. In case of extremely high detector currents, the LHC beams are automatically extracted from the LHC rings. Diamond is the detector material of choice due to its radiation hardness. Predictions of the detector lifetime were made based on FLUKA Monte Carlo simulations and irradiation test results from the RD42 collaboration, which attested no significant radiation damage over several years. During the LHC operational Run 1 (2010-2013), the detector efficiencies were monitored. A signal decrease about 50 times stronger than expected was observed in the in-situ radiation environment. Electric field deformations due to charge carriers trapped in radiation-induced lattice defects are responsible for this signal decrease. This so-called polarizat...

  17. Design, modeling and simulation of MEMS-based silicon Microneedles

    International Nuclear Information System (INIS)

    Amin, F; Ahmed, S

    2013-01-01

    The advancement in semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimum discomfort. Fluid transdermal delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could lead to totally painless fluid delivery into the skin with controllability and desirable yield. In this study, we aimed to revisit the problem with modeling, design and simulations carried out for MEMS-based silicon hollow out-of-plane microneedle arrays for biomedical applications, particularly for transdermal drug delivery. A microneedle of approximately 200 μm length with a 40 μm lumen diameter has been successfully formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 matrix array with center-to-center spacing of 750 μm. Furthermore, fluid flow characteristics through these microneedle channels have been modeled with and without the contribution of gravitational forces, using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed on TCAD SILVACO to optimize the design of these microneedles in line with standard Si-fabrication lines.
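
    The ideal (Bernoulli-derived) flow model referred to above can be sketched as follows, with hypothetical pressure and geometry values; note that Bernoulli neglects viscous losses, which are significant at the microscale, so this is an upper-bound estimate rather than the authors' model:

    ```python
    import math

    def lumen_flow(delta_p, diameter, rho=1000.0, height=0.0, g=9.81):
        """Ideal (Bernoulli) flow through a microneedle lumen.

        delta_p: driving pressure (Pa); diameter: lumen diameter (m);
        rho: fluid density (kg/m^3); height: elevation drop (m) whose
        gravitational head adds to the driving pressure.
        Returns (velocity m/s, volumetric flow m^3/s).
        """
        v = math.sqrt(2.0 * (delta_p + rho * g * height) / rho)
        area = math.pi * (diameter / 2.0) ** 2
        return v, area * v

    # 40 um lumen driven by a hypothetical 10 kPa, with and without gravity
    v0, q0 = lumen_flow(10e3, 40e-6)
    v1, q1 = lumen_flow(10e3, 40e-6, height=200e-6)  # 200 um elevation drop
    print(round(v0, 3), q1 > q0)  # 4.472 True
    ```

    Comparing the two calls reproduces the with/without-gravity comparison described in the abstract: the gravitational head adds a small extra driving term.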

  18. Design, modeling and simulation of MEMS-based silicon Microneedles

    Science.gov (United States)

    Amin, F.; Ahmed, S.

    2013-06-01

    The advancement in semiconductor process engineering and nano-scale fabrication technology has made it convenient to transport specific biological fluids into or out of human skin with minimum discomfort. Fluid transdermal delivery systems such as microneedle arrays are one such emerging and exciting Micro-Electro-Mechanical System (MEMS) application, which could lead to totally painless fluid delivery into the skin with controllability and desirable yield. In this study, we aimed to revisit the problem with modeling, design and simulations carried out for MEMS-based silicon hollow out-of-plane microneedle arrays for biomedical applications, particularly for transdermal drug delivery. A microneedle of approximately 200 μm length with a 40 μm lumen diameter has been successfully formed by isotropic and anisotropic etching techniques using the MEMS Pro design tool. These microneedles are arranged in a 2 × 4 matrix array with center-to-center spacing of 750 μm. Furthermore, fluid flow characteristics through these microneedle channels have been modeled with and without the contribution of gravitational forces, using mathematical models derived from the Bernoulli equation. Physical process simulations have also been performed on TCAD SILVACO to optimize the design of these microneedles in line with standard Si-fabrication lines.

  19. Turbulent Simulations of Divertor Detachment Based on BOUT++ Framework

    Science.gov (United States)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto the divertor target plates to achieve acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0-5.0 × 10²³ s⁻¹ using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and impurity screening properties. As the density increases, a gradual change of the divertor operation status is observed, from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread the heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  20. Some results on ethnic conflicts based on evolutionary game simulation

    Science.gov (United States)

    Qin, Jun; Yi, Yunfei; Wu, Hongrun; Liu, Yuhang; Tong, Xiaonian; Zheng, Bojin

    2014-07-01

    The force of ethnic separatism, essentially originating from the negative effects of ethnic identity, is damaging the stability and harmony of multiethnic countries. In order to eliminate the foundation of ethnic separatism and establish harmonious ethnic relationships, some scholars have proposed the viewpoint that ethnic harmony could be promoted by popularizing civic identity. However, this viewpoint has been discussed only from a philosophical perspective and still lacks the support of scientific evidence. Because ethnic groups and ethnic identity are products of evolution, and ethnic identity is a parochial strategy from the perspective of game theory, this paper proposes an evolutionary game simulation model to study the relationship between civic identity and ethnic conflict based on evolutionary game theory. The simulation results indicate that: (1) the ratio of individuals with civic identity has a negative association with the frequency of ethnic conflicts; (2) ethnic conflict will not die out by killing all ethnic members once and for all, nor can it be reduced by forcible pressure, i.e., by forcibly increasing the ratio of individuals with civic identity; (3) the average frequency of conflicts can stay at a low level if civic identity is promoted periodically and persistently.
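
    The qualitative mechanism behind result (1) can be sketched with a toy interaction model (the random pairing rule, group labels and all parameters are illustrative assumptions, not the authors' evolutionary game):

    ```python
    import random

    def conflict_frequency(civic_ratio, agents=1000, rounds=200, seed=1):
        """Toy interaction model: in each round two random agents meet; a
        conflict occurs only when both hold ethnic (parochial) identities
        and belong to different groups."""
        rng = random.Random(seed)
        pop = [("civic" if rng.random() < civic_ratio else "ethnic",
                rng.choice("AB")) for _ in range(agents)]
        conflicts = 0
        for _ in range(rounds):
            (id1, g1), (id2, g2) = rng.sample(pop, 2)
            if id1 == id2 == "ethnic" and g1 != g2:
                conflicts += 1
        return conflicts / rounds

    # a higher civic-identity ratio yields fewer conflicts
    print(conflict_frequency(0.2) > conflict_frequency(0.8))  # True
    ```

    In expectation the conflict rate here scales as (1 - civic_ratio)² times the cross-group meeting probability, which is the negative association the simulations report; the paper's model additionally lets identities evolve under selection.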

  1. Simulated Annealing-Based Krill Herd Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Gai-Ge Wang

    2013-01-01

    Full Text Available Recently, Gandomi and Alavi proposed a novel swarm-intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic, simulated annealing-based krill herd (SKH), is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating each krill's position, so as to enhance reliability and robustness when dealing with optimization problems. The introduced KS operator combines a greedy strategy with the acceptance of a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.

  2. Design and Simulation of an Electrothermal Actuator Based Rotational Drive

    Science.gov (United States)

    Beeson, Sterling; Dallas, Tim

    2008-10-01

    As a participant in the Micro and Nano Device Engineering (MANDE) Research Experience for Undergraduates program at Texas Tech University, I learned how MEMS devices operate and the limits of their operation. Using specialized AutoCAD-based design software and the ANSYS simulation program, I learned the MEMS fabrication process used at Sandia National Labs, the design limitations of this process, the abilities and drawbacks of micro devices, and finally, I redesigned a MEMS device called the Chevron Torsional Ratcheting Actuator (CTRA). Motion is achieved through electrothermal actuation. The chevron (bent-beam) actuators cause a ratcheting motion on top of a hub-less gear so that as voltage is applied the CTRA spins. The voltage applied needs to be pulsed and the frequency of the pulses determine the angular frequency of the device. The main objective was to design electromechanical structures capable of transforming the electrical signals into mechanical motion without overheating. The design was optimized using finite element analysis in ANSYS allowing multi-physics simulations of our model system.

  3. IR characteristic simulation of city scenes based on radiosity model

    Science.gov (United States)

    Xiong, Xixian; Zhou, Fugen; Bai, Xiangzhi; Yu, Xiyu

    2013-09-01

    Reliable modeling of thermal infrared (IR) signatures of real-world city scenes is required for signature management of civil and military platforms. Traditional modeling methods generally assume that scene objects are individual entities during the physical processes occurring in the infrared range. In reality, however, the physical scene involves convective and conductive interactions between objects as well as radiative interactions between objects. A method based on a radiosity model, which describes these complex effects, has been developed to enable an accurate simulation of the radiance distribution of city scenes. Firstly, the physical processes affecting the IR characteristics of city scenes were described. Secondly, heat balance equations were formed by combining the atmospheric conditions, shadow maps and the geometry of the scene. Finally, a finite difference method was used to calculate the kinetic temperature of each object surface. A radiosity model was introduced to describe the scattering of radiation between surface elements in the scene. By synthesizing the radiance distribution of objects in the infrared range, the IR characteristics of the scene were obtained. Real infrared images and model predictions were shown and compared. The results demonstrate that this method can realistically simulate the IR characteristics of city scenes. It effectively displays infrared shadow effects and the radiative interactions between objects in city scenes.
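
    The radiosity step can be sketched as a fixed-point (Jacobi) solve of B_i = E_i + rho_i * sum_j F_ij * B_j for a toy three-surface scene; the emission, reflectivity and form-factor values below are hypothetical, not taken from the paper:

    ```python
    def solve_radiosity(emission, reflectivity, form_factors, iters=200):
        """Jacobi iteration for the radiosity equation
        B_i = E_i + rho_i * sum_j F_ij * B_j."""
        n = len(emission)
        b = list(emission)
        for _ in range(iters):
            b = [emission[i] + reflectivity[i] *
                 sum(form_factors[i][j] * b[j] for j in range(n))
                 for i in range(n)]
        return b

    # hypothetical 3-surface street canyon: two walls and a road
    E = [100.0, 80.0, 20.0]  # self-emission (from the kinetic temperature)
    rho = [0.3, 0.3, 0.5]    # reflectivities in the IR band
    F = [[0.0, 0.4, 0.4],    # form factors between surfaces
         [0.4, 0.0, 0.4],
         [0.4, 0.4, 0.0]]
    B = solve_radiosity(E, rho, F)
    print([round(x, 1) for x in B])  # [120.0, 102.1, 64.4]
    ```

    The iteration converges because each reflectivity-weighted form-factor row sums to less than one; note how the cold road surface (E = 20) ends up noticeably brighter (B ≈ 64) purely through radiation received from the warm walls, which is exactly the inter-object effect the paper's method captures.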

  4. A PC based multi-CPU severe accident simulation trainer

    International Nuclear Information System (INIS)

    Jankowski, M.W.; Bienarz, P.P.; Sartmadjiev, A.D.

    2004-01-01

    MELSIM Severe Accident Simulation Trainer is a personal computer based system being developed by the International Atomic Energy Agency and Risk Management Associates, Inc. for the purpose of training the operators of nuclear power stations. It also serves for evaluating accident management strategies as well as assessing complex interfaces between emergency operating procedures and accident management guidelines. The system is being developed for the Soviet designed WWER-440/Model 213 reactor and it is plant specific. The Bohunice V2 power station in the Slovak Republic has been selected for trial operation of the system. The trainer utilizes several CPUs working simultaneously on different areas of simulation. Detailed plant operation displays are provided on colour monitor mimic screens which show changing plant conditions in approximate real-time. Up to 28 000 curves can be plotted on a separate monitor as the MELSIM program proceeds. These plots proceed concurrently with the program, and time specific segments can be recalled for review. A benchmarking (limited in scope) against well validated thermal-hydraulic codes and selected plant accident data (WWER-440/213 Rovno NPP, Ukraine) has been initiated. Preliminary results are presented and discussed. (author)

  5. Simulation tools for guided wave based structural health monitoring

    Science.gov (United States)

    Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien

    2018-04-01

    Structural Health Monitoring (SHM) is a thematic derived from Non Destructive Evaluation (NDE) based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on the aforementioned setup is available in the literature, but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters, such as temperature or geometrical singularities, making guided wave measurements difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensor positioning or environmental conditions. The first numerical tool is the guided wave module integrated into the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metals) and orthotropic (composites) plate-like structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration in which each embedded sensor acts as both emitter and receiver of guided waves. This tool is under development and ...

  6. Micromechanics based simulation of ductile fracture in structural steels

    Science.gov (United States)

    Yellavajjala, Ravi Kiran

    The broader aim of this research is to develop fundamental understanding of ductile fracture process in structural steels, propose robust computational models to quantify the associated damage, and provide numerical tools to simplify the implementation of these computational models into general finite element framework. Mechanical testing on different geometries of test specimens made of ASTM A992 steels is conducted to experimentally characterize the ductile fracture at different stress states under monotonic and ultra-low cycle fatigue (ULCF) loading. Scanning electron microscopy studies of the fractured surfaces is conducted to decipher the underlying microscopic damage mechanisms that cause fracture in ASTM A992 steels. Detailed micromechanical analyses for monotonic and cyclic loading are conducted to understand the influence of stress triaxiality and Lode parameter on the void growth phase of ductile fracture. Based on monotonic analyses, an uncoupled micromechanical void growth model is proposed to predict ductile fracture. This model is then incorporated in to finite element program as a weakly coupled model to simulate the loss of load carrying capacity in the post microvoid coalescence regime for high triaxialities. Based on the cyclic analyses, an uncoupled micromechanics based cyclic void growth model is developed to predict the ULCF life of ASTM A992 steels subjected to high stress triaxialities. Furthermore, a computational fracture locus for ASTM A992 steels is developed and incorporated in to finite element program as an uncoupled ductile fracture model. This model can be used to predict the ductile fracture initiation under monotonic loading in a wide range of triaxiality and Lode parameters. Finally, a coupled microvoid elongation and dilation based continuum damage model is proposed, implemented, calibrated and validated. This model is capable of simulating the local softening caused by the various phases of ductile fracture process under

  7. Agent-based simulation of a financial market

    Science.gov (United States)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
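    A toy version of this family of models (not the authors' exact mechanism) can be sketched as follows: agents with finite cash and shares submit random buy and sell orders each period, the price moves with the sign of the excess demand, and order impact scales with a running volatility estimate, which is the ingredient that produces volatility clustering in such models. All parameter values are illustrative.

```python
import math
import random

def simulate_market(n_agents=100, periods=500, seed=1):
    rng = random.Random(seed)
    price, prices = 100.0, [100.0]
    cash = [1000.0] * n_agents
    shares = [10] * n_agents
    vol = 0.01                               # running volatility estimate
    for _ in range(periods):
        demand = 0
        for i in range(n_agents):
            if rng.random() < 0.5:           # buy attempt, cash-constrained
                if cash[i] >= price:
                    demand += 1; cash[i] -= price; shares[i] += 1
            else:                            # sell attempt, share-constrained
                if shares[i] > 0:
                    demand -= 1; cash[i] += price; shares[i] -= 1
        # price impact grows with recent volatility -> clustering
        price *= math.exp(vol * demand / n_agents)
        ret = math.log(price / prices[-1])
        vol = 0.9 * vol + 0.1 * abs(ret) + 1e-4
        prices.append(price)
    return prices

prices = simulate_market()
```

The log returns of `prices` can then be examined for the fat tails and volatility clustering the paper reports.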

  8. Atomistic simulation of graphene-based polymer nanocomposites

    International Nuclear Information System (INIS)

    Rissanou, Anastassia N.; Bačová, Petra; Harmandaris, Vagelis

    2016-01-01

    Polymer/graphene nanostructured systems are hybrid materials which have attracted great attention in recent years for both scientific and technological reasons. In the present work, atomistic Molecular Dynamics simulations are performed to study graphene-based polymer nanocomposites composed of pristine, hydrogenated and carboxylated graphene sheets dispersed in polar (PEO) and nonpolar (PE) short polymer matrices (i.e., matrices containing chains of low molecular weight). Our focus is twofold: one aim is the study of the structural and dynamical properties of short polymer chains and the way they are affected by functionalized graphene sheets, while the other is the effect of the polymer matrices on the behavior of the graphene sheets.

  9. Learning Theory Foundations of Simulation-Based Mastery Learning.

    Science.gov (United States)

    McGaghie, William C; Harris, Ilene B

    2018-06-01

    Simulation-based mastery learning (SBML), like all education interventions, has learning theory foundations. Recognition and comprehension of SBML learning theory foundations are essential for thoughtful education program development, research, and scholarship. We begin with a description of SBML followed by a section on the importance of learning theory foundations to shape and direct SBML education and research. We then discuss three principal learning theory conceptual frameworks that are associated with SBML (behavioral, constructivist, and social cognitive) and their contributions to SBML thought and practice. We then discuss how the three learning theory frameworks converge in the course of planning, conducting, and evaluating SBML education programs in the health professions. Convergence of these learning theory frameworks is illustrated by a description of an SBML education and research program in advanced cardiac life support. We conclude with a brief coda.

  10. Simulation-based diagnostics and control for nuclear power plants

    International Nuclear Information System (INIS)

    Lee, J.C.

    1993-01-01

    Advanced simulation-based diagnostics and control guidance systems for the identification and management of off-normal transient events in nuclear power plants are currently under investigation. To date, a great deal of progress has been made in effectively and efficiently combining information obtained through fuzzy pattern recognition and macroscopic mass and energy inventory analysis for use in multiple-failure diagnostics. Work has also begun on the unique problem of diagnostics and surveillance methodologies for advanced passively safe reactor systems utilizing both statistical and fuzzy information. Plans are also being formulated for the development of deterministic optimal control algorithms combined with Monte Carlo incremental learning algorithms to be used for the flexible and efficient control of reactor transients.

  11. Web-Based Modelling and Collaborative Simulation of Declarative Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Marquard, Morten; Shahzad, Muhammad

    2015-01-01

    As a provider of Electronic Case Management solutions to knowledge-intensive businesses and organizations, the Danish company Exformatics has in recent years identified a need for flexible process support in the tools that we provide to our customers. We have addressed this need by adapting DCR Graphs, a formal declarative workflow notation developed at the IT University of Copenhagen. Through close collaboration with academia we first integrated execution support for the notation into our existing tools, by leveraging a cloud-based process engine implementing the DCR formalism. In earlier work we reported on the integration of DCR Graphs as a workflow execution formalism in the existing Exformatics ECM products. In this paper we report on the advances we have made over the last two ... -user discussions on how knowledge workers really work, by enabling collaborative simulation of processes.

  12. Quantum-based Atomistic Simulation of Transition Metals

    International Nuclear Information System (INIS)

    Moriarty, J A; Benedict, L X; Glosli, J N; Hood, R Q; Orlikowski, D A; Patel, M V; Soderlind, P; Streitz, F H; Tang, M; Yang, L H

    2005-01-01

    First-principles generalized pseudopotential theory (GPT) provides a fundamental basis for transferable multi-ion interatomic potentials in d-electron transition metals within density-functional quantum mechanics. In mid-period bcc metals, where multi-ion angular forces are important to structural properties, simplified model GPT or MGPT potentials have been developed based on canonical d bands to allow analytic forms and large-scale atomistic simulations. Robust, advanced-generation MGPT potentials have now been obtained for Ta and Mo and successfully applied to a wide range of structural, thermodynamic, defect and mechanical properties at both ambient and extreme conditions of pressure and temperature. Recent algorithm improvements have also led to a more general matrix representation of MGPT beyond canonical bands allowing increased accuracy and extension to f-electron actinide metals, an order of magnitude increase in computational speed, and the current development of temperature-dependent potentials

  13. Study of Flapping Flight Using Discrete Vortex Method Based Simulations

    Science.gov (United States)

    Devranjan, S.; Jalikop, Shreyas V.; Sreenivas, K. R.

    2013-12-01

    In recent times, research in the area of flapping flight has attracted renewed interest with an endeavor to use this mechanism in Micro Air Vehicles (MAVs). For a sustained, high-endurance flight with larger payload-carrying capacity, we need to identify simple and efficient flapping kinematics. In this paper, we have used flow visualizations and Discrete Vortex Method (DVM) based simulations for the study of flapping flight. Our results highlight that simple flapping kinematics with a down-stroke period (tD) shorter than the upstroke period (tU) would produce a sustained lift. We have identified the optimal asymmetry ratio (Ar = tD/tU) for which flapping wings will produce maximum lift, and find that introducing optimal wing flexibility will further enhance the lift.

  14. Radiography simulation based on exposure buildup factors for multilayer structures

    International Nuclear Information System (INIS)

    Marinkovic, Predrag; Pesic, Milan

    2009-01-01

    Monte Carlo techniques are usually used to study the effect of scattered photons on a radiographic X-ray image. Such an approach is accurate but computationally expensive. On the other hand, exposure buildup factors can be used as an approximate and efficient way to account for the scattering of X-rays. This method uses the known radiography parameters to find the resulting detector exposure due to both scattered and uncollided photons. A model for radiography simulation, based on the X-ray dose buildup factor, is proposed. This model includes non-uniform attenuation in a voxelized object of imaging (patient body tissue). The composition of the patient body is treated as a multilayer structure. Various empirical formulas exist for multilayer structure calculations, and they all obtain multilayer buildup factors by combining single-layer buildup factors. The proposed model is convenient in cases when more exact techniques (like Monte Carlo) are not economical. (author)
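    The buildup-factor idea can be sketched numerically: the detector exposure is the uncollided exposure multiplied by a buildup factor B that accounts for scattered photons. For a stack of layers, one simple approximation is used below, evaluating a single-layer buildup factor at each layer's own optical thickness and multiplying; this is only one of the several empirical combination rules the abstract alludes to, and the linear form of B and all coefficients are illustrative.

```python
import math

def uncollided_transmission(layers):
    """layers: list of (mu, thickness) pairs; returns exp(-sum(mu * t))."""
    return math.exp(-sum(mu * t for mu, t in layers))

def linear_buildup(mfp, a=1.0):
    """Toy single-layer buildup factor B = 1 + a * (mu * t)."""
    return 1.0 + a * mfp

def multilayer_exposure(x0, layers):
    """Detector exposure = incident exposure * attenuation * total buildup."""
    b_total = 1.0
    for mu, t in layers:
        b_total *= linear_buildup(mu * t)   # product combination rule (approx.)
    return x0 * uncollided_transmission(layers) * b_total

# Two tissue-like layers: (attenuation coefficient in 1/cm, thickness in cm).
exposure = multilayer_exposure(1.0, [(0.2, 5.0), (0.15, 2.0)])
```

Because B is always at least 1, the predicted exposure always exceeds the uncollided value, reflecting the extra dose delivered by scattered photons.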

  15. Theory-based transport simulation of tokamaks: density scaling

    International Nuclear Information System (INIS)

    Ghanem, E.S.; Kinsey, J.; Singer, C.; Bateman, G.

    1992-01-01

    There has been a sizeable amount of work in the past few years using theoretically based flux-surface-average transport models to simulate various types of experimental tokamak data. Here we report two such studies, concentrating on the response of the plasma to variation of the line averaged electron density. The first study reported here uses a transport model described by Ghanem et al. to examine the response of global energy confinement time in ohmically heated discharges. The second study reported here uses a closely related and more recent transport model described by Bateman to examine the response of temperature profiles to changes in line-average density in neutral-beam-heated discharges. Work on developing a common theoretical model for these and other scaling experiments is in progress. (author) 5 refs., 2 figs

  16. Capacity Analysis for Parallel Runway through Agent-Based Simulation

    Directory of Open Access Journals (Sweden)

    Yang Peng

    2013-01-01

    Full Text Available Parallel runways are the mainstream configuration at Chinese hub airports. The runway is often the bottleneck of an airport, so the evaluation of its capacity is of great importance to airport management. This study outlines a model, multiagent architecture, implementation approach, and software prototype of a simulation system for evaluating runway capacity. Agent Unified Modeling Language (AUML) is applied to illustrate the arrival and departure procedures of aircraft and to design the agent-based model. The model is evaluated experimentally, and its quality is studied in comparison with models created in SIMMOD and Arena. The results appear highly efficient, so the method can be applied to parallel runway capacity evaluation, and the model offers favorable flexibility and extensibility.
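    A much-reduced sketch of such a capacity evaluation (not the paper's AUML/SIMMOD model): aircraft request a runway at Poisson-distributed times, each of the two parallel runways is blocked for a fixed occupancy time per movement, and throughput is the number of movements served within the simulated hour. Arrival rate and occupancy time are illustrative.

```python
import random

def runway_throughput(arrival_rate=60, occupancy=90.0, horizon=3600.0, seed=7):
    """Movements served in one hour on two independent parallel runways."""
    rng = random.Random(seed)
    t, requests = 0.0, []
    while t < horizon:                        # Poisson arrival process
        t += rng.expovariate(arrival_rate / 3600.0)
        requests.append(t)
    free_at = [0.0, 0.0]                      # next free time of each runway
    served = 0
    for req in requests:
        r = 0 if free_at[0] <= free_at[1] else 1   # earliest-free runway
        start = max(req, free_at[r])
        if start < horizon:
            free_at[r] = start + occupancy    # runway blocked for one movement
            served += 1
    return served

moves = runway_throughput()
```

With a 90 s occupancy per movement, two runways cap out at 80 movements per hour, so sweeping `arrival_rate` upward exposes the saturation capacity.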

  17. Simulation-based algorithms for Markov decision processes

    CERN Document Server

    Chang, Hyeong Soo; Fu, Michael C; Marcus, Steven I

    2013-01-01

    Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, leading to the curse of dimensionality and making practical solution of the resulting models intractable. In other cases, the system of interest is too complex to allow explicit specification of some of the MDP model parameters, but simulation samples are readily available (e.g., for random transitions and costs). For these settings, various sampling and population-based algorithms have been developed to overcome the difficulties of computing an optimal solution in terms of a policy and/or value function. Specific approaches include adaptive sampling, evolutionary policy iteration, evolutionary random policy search, and model reference adaptive search. This substantially enlarged new edition reflects the latest developments in novel ...
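    Adaptive sampling, one of the approaches named above, can be illustrated on the smallest possible case: estimating the optimal action value at a single state by allocating simulation samples according to upper confidence bounds, so the estimate concentrates on the best action without sampling all actions uniformly. The two reward distributions below are hypothetical stand-ins for simulated one-step returns.

```python
import math
import random

def adaptive_sampling(actions, budget=2000, seed=0):
    """UCB-style allocation of a simulation budget across actions."""
    rng = random.Random(seed)
    n = [0] * len(actions)                   # samples drawn per action
    mean = [0.0] * len(actions)              # running reward estimates
    for t in range(1, budget + 1):
        def ucb(a):                          # upper confidence bound
            if n[a] == 0:
                return float("inf")          # sample untried actions first
            return mean[a] + math.sqrt(2 * math.log(t) / n[a])
        a = max(range(len(actions)), key=ucb)
        r = actions[a](rng)                  # one simulated transition/reward
        n[a] += 1
        mean[a] += (r - mean[a]) / n[a]      # incremental mean update
    return mean, n

actions = [lambda rng: rng.gauss(1.0, 1.0),  # action 0: true mean reward 1.0
           lambda rng: rng.gauss(0.5, 1.0)]  # action 1: true mean reward 0.5
mean, n = adaptive_sampling(actions)
```

Most of the budget ends up on the better action, which is exactly the property that makes such allocation schemes attractive when each sample is an expensive simulation run.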

  18. Fundamental Science-Based Simulation of Nuclear Waste Forms

    Energy Technology Data Exchange (ETDEWEB)

    Devanathan, Ramaswami; Gao, Fei; Sun, Xin; Khaleel, Mohammad A.

    2010-10-04

    This report presents a hierarchical multiscale modeling scheme based on two-way information exchange. To account for all essential phenomena in waste forms over geological time scales, the models have to span length scales from nanometer to kilometer and time scales from picoseconds to millennia. A single model cannot cover this wide range and a multi-scale approach that integrates a number of different at-scale models is called for. The approach outlined here involves integration of quantum mechanical calculations, classical molecular dynamics simulations, kinetic Monte Carlo and phase field methods at the mesoscale, and continuum models. The ultimate aim is to provide science-based input in the form of constitutive equations to integrated codes. The atomistic component of this scheme is demonstrated in the promising waste form xenotime. Density functional theory calculations have yielded valuable information about defect formation energies. These data can be used to develop interatomic potentials for molecular dynamics simulations of radiation damage. Potentials developed in the present work show a good match for the equilibrium lattice constants, elastic constants and thermal expansion of xenotime. In novel waste forms, such as xenotime, a considerable amount of data needed to validate the models is not available. Integration of multiscale modeling with experimental work is essential to generate missing data needed to validate the modeling scheme and the individual models. Density functional theory can also be used to fill knowledge gaps. Key challenges lie in the areas of uncertainty quantification, verification and validation, which must be performed at each level of the multiscale model and across scales. The approach used to exchange information between different levels must also be rigorously validated. The outlook for multiscale modeling of waste forms is quite promising.

  19. An Investigation of Computer-based Simulations for School Crises Management.

    Science.gov (United States)

    Degnan, Edward; Bozeman, William

    2001-01-01

    Describes development of a computer-based simulation program for training school personnel in crisis management. Addresses the data collection and analysis involved in developing a simulated event, the systems requirements for simulation, and a case study of application and use of the completed simulation. (Contains 21 references.) (Authors/PKP)

  20. Agent-based modeling and simulation Part 3 : desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
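    The tutorial's spreadsheet shopper model is not reproduced here, but a minimal agent-based shopper sketch conveys the same point: each agent remembers a satisfaction score per store, mostly revisits its best-known store, and occasionally explores, so market shares emerge from individual rules rather than from a top-down equation. All numbers (store qualities, learning rate, exploration rate) are illustrative.

```python
import random

def run_shoppers(n_agents=200, days=100, explore=0.1, seed=3):
    rng = random.Random(seed)
    quality = [0.5, 0.7, 0.9]                 # hidden store quality
    n_stores = len(quality)
    score = [[0.5] * n_stores for _ in range(n_agents)]  # agents' beliefs
    visits = [0] * n_stores
    for _ in range(days):
        for i in range(n_agents):
            if rng.random() < explore:
                s = rng.randrange(n_stores)   # occasionally try a random store
            else:                             # otherwise exploit best-known store
                s = max(range(n_stores), key=lambda k: score[i][k])
            experience = quality[s] + rng.uniform(-0.1, 0.1)
            score[i][s] = 0.8 * score[i][s] + 0.2 * experience  # update belief
            visits[s] += 1
    return visits

visits = run_shoppers()
```

Aggregate visit counts shift toward the highest-quality store even though no agent knows the quality values, a typical emergent outcome of the ABMS style the tutorial describes.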

  1. Simulation-Based Dynamic Passenger Flow Assignment Modelling for a Schedule-Based Transit Network

    Directory of Open Access Journals (Sweden)

    Xiangming Yao

    2017-01-01

    Full Text Available The online operation management and the offline policy evaluation in complex transit networks require an effective dynamic traffic assignment (DTA) method that can capture the temporal-spatial nature of traffic flows. The objective of this work is to propose a simulation-based dynamic passenger assignment framework and models for such applications in the context of schedule-based rail transit systems. In the simulation framework, travellers are regarded as individual agents who are able to obtain complete information on current traffic conditions. A combined route selection model, integrating pretrip route selection with en-route route switching, is established for achieving the dynamic network flow equilibrium status. The train agent is operated strictly according to the timetable, and its capacity limitation is considered. A continuous time-driven simulator based on the proposed framework and models is developed, and its performance is illustrated through the large-scale network of the Beijing subway. The results indicate that more than 0.8 million individual passengers and thousands of trains can be simulated simultaneously at a speed ten times faster than real time. This study provides an efficient approach to analyzing the dynamic demand-supply relationship in large schedule-based transit networks.
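    The supply-side mechanics described above, trains dispatched on a timetable with a hard capacity limit, can be sketched as a toy time-driven loop: passengers accumulate on a platform, a train departs every fixed headway, and anyone who does not fit waits for the next departure. All parameters are illustrative, not Beijing-subway values.

```python
from collections import deque

def simulate_line(arrivals_per_min, capacity=2, headway=5, minutes=30):
    """One station on a single line; returns (boarded, still waiting)."""
    platform = deque()
    boarded = 0
    for t in range(minutes):                  # time-driven simulation loop
        for _ in range(arrivals_per_min):
            platform.append(t)                # passenger joins the platform
        if t % headway == 0:                  # a train departs on the timetable
            space = capacity
            while platform and space > 0:     # board up to the capacity limit
                platform.popleft()
                boarded += 1
                space -= 1
    return boarded, len(platform)

boarded, waiting = simulate_line(arrivals_per_min=1)
```

Even this toy shows the assignment-relevant effect: once demand outruns `capacity / headway`, a residual queue builds up and passengers are pushed to later departures (or, in the full model, to alternative routes).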

  2. Operational characteristic analysis of conduction cooling HTS SMES for Real Time Digital Simulator based power quality enhancement simulation

    International Nuclear Information System (INIS)

    Kim, A.R.; Kim, G.H.; Kim, K.M.; Kim, D.W.; Park, M.; Yu, I.K.; Kim, S.H.; Sim, K.; Sohn, M.H.; Seong, K.C.

    2010-01-01

    This paper analyzes the operational characteristics of conduction-cooled Superconducting Magnetic Energy Storage (SMES) through a real-hardware-based simulation. To analyze the operational characteristics, the authors manufactured a small-scale toroidal-type SMES and implemented a Real Time Digital Simulator (RTDS) based power quality enhancement simulation. The method can consider not only electrical characteristics such as inductance and current but also temperature characteristics by using the real SMES system. In order to prove the effectiveness of the proposed method, a voltage sag compensation simulation has been implemented using the RTDS connected with the High Temperature Superconducting (HTS) model coil and DC/DC converter system, and the simulation results are discussed in detail.

  3. The Geant4-Based ATLAS Fast Electromagnetic Shower Simulation

    CERN Document Server

    Barberio, E; Butler, B; Cheung, S L; Dell'Acqua, A; Di Simone, A; Ehrenfeld, W; Gallas, M V; Glasow, A; Hughes, E; Marshall, Z; Müller, J; Placakyte, R; Rimoldi, A; Savard, P; Tsulaia, V; Waugh, A; Young, C C; 10th ICATPP Conference on Astroparticle, Particle, Space Physics, Detectors and Medical Physics Applications

    2008-01-01

    We present a three-pronged approach to fast electromagnetic shower simulation in ATLAS. Parameterisation is used for high-energy, shower libraries for medium-energy, and an averaged energy deposition for very low-energy particles. We present a comparison between the fast simulation and full simulation in an ATLAS Monte Carlo production.

  4. High tech supply chain simulation based on dynamical systems model

    NARCIS (Netherlands)

    Yuan, X.; Ashayeri, J.

    2013-01-01

    During the last 45 years, system dynamics as a continuous type of simulation has been used for simulating various problems, ranging from economic to engineering and managerial, when limited (historical) information is available. Control theory is another alternative for continuous simulation that ...

  5. Simulated BRDF based on measured surface topography of metal

    Science.gov (United States)

    Yang, Haiyue; Haist, Tobias; Gronle, Marc; Osten, Wolfgang

    2017-06-01

    The radiative reflective properties of a calibration-standard rough surface were simulated by ray tracing and the finite-difference time-domain (FDTD) method. The simulation results have been used to compute the bidirectional reflectance distribution function (BRDF) of metal surfaces and have been compared with experimental measurements. The experimental and simulated results are in good agreement.

  6. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    Science.gov (United States)

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of this SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity in predicting a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas the sensitivity of variations limited to cost and customer service or cost and technical stability decreased (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond traditionally targeted factors. We propose the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
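    The published SVI's exact factors and weighting scheme are not reproduced here, but the general shape of such a value index is easy to sketch: score each candidate simulator on the ranked factors, weight the factors by rank, and take the weighted mean. The factor names, weights, and scores below are hypothetical.

```python
def svi(scores, weights):
    """Weighted-mean value index over ranked purchase factors."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total

factors = ["technical stability", "customer service", "cost", "fidelity"]
weights = [4, 3, 2, 1]                 # rank-based weights (illustrative)
candidates = {                         # 1-5 scores per factor (illustrative)
    "Simulator A": [5, 4, 2, 4],
    "Simulator B": [3, 3, 5, 3],
}
ranked = sorted(candidates,
                key=lambda c: svi(candidates[c], weights),
                reverse=True)
```

Dropping factors from the index (e.g., keeping only cost and customer service) changes the weighted totals and can flip the top choice, which mirrors the sensitivity loss the study observed for its reduced variations.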

  7. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new workflow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  8. A simulation-based analytic model of radio galaxies

    Science.gov (United States)

    Hardcastle, M. J.

    2018-04-01

    I derive and discuss a simple semi-analytical model of the evolution of powerful radio galaxies which is not based on assumptions of self-similar growth, but rather implements some insights about the dynamics and energetics of these systems derived from numerical simulations, and can be applied to arbitrary pressure/density profiles of the host environment. The model can qualitatively and quantitatively reproduce the source dynamics and synchrotron light curves derived from numerical modelling. Approximate corrections for radiative and adiabatic losses allow it to predict the evolution of radio spectral index and of inverse-Compton emission both for active and `remnant' sources after the jet has turned off. Code to implement the model is publicly available. Using a standard model with a light relativistic (electron-positron) jet, subequipartition magnetic fields, and a range of realistic group/cluster environments, I simulate populations of sources and show that the model can reproduce the range of properties of powerful radio sources as well as observed trends in the relationship between jet power and radio luminosity, and predicts their dependence on redshift and environment. I show that the distribution of source lifetimes has a significant effect on both the source length distribution and the fraction of remnant sources expected in observations, and so can in principle be constrained by observations. The remnant fraction is expected to be low even at low redshift and low observing frequency due to the rapid luminosity evolution of remnants, and to tend rapidly to zero at high redshift due to inverse-Compton losses.

  9. Simulation-based optimization of sustainable national energy systems

    International Nuclear Information System (INIS)

    Batas Bjelić, Ilija; Rajaković, Nikola

    2015-01-01

    The goals of the EU2030 energy policy should be achieved cost-effectively by employing the optimal mix of supply and demand side technical measures, including energy efficiency, renewable energy and structural measures. In this paper, the achievement of these goals is modeled by introducing an innovative method of soft-linking of EnergyPLAN with the generic optimization program (GenOpt). This soft-link enables simulation-based optimization, guided with the chosen optimization algorithm, rather than manual adjustments of the decision vectors. In order to obtain EnergyPLAN simulations within the optimization loop of GenOpt, the decision vectors should be chosen and explained in GenOpt for scenarios created in EnergyPLAN. The result of the optimization loop is an optimal national energy master plan (as a case study, energy policy in Serbia was taken), followed with sensitivity analysis of the exogenous assumptions and with focus on the contribution of the smart electricity grid to the achievement of EU2030 goals. It is shown that the increase in the policy-induced total costs of less than 3% is not significant. This general method could be further improved and used worldwide in the optimal planning of sustainable national energy systems. - Highlights: • Innovative method of soft-linking of EnergyPLAN with GenOpt has been introduced. • Optimal national energy master plan has been developed (the case study for Serbia). • Sensitivity analysis on the exogenous world energy and emission price development outlook. • Focus on the contribution of smart energy systems to the EU2030 goals. • Innovative soft-linking methodology could be further improved and used worldwide.
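    The soft-linking pattern described above treats the energy-system simulator as a black box mapping a decision vector to a cost, with the optimizer (GenOpt offers pattern-search algorithms of this kind) driving repeated simulator runs. In the sketch below, a simple quadratic function stands in for one EnergyPLAN scenario evaluation; the decision variables and cost surface are entirely hypothetical.

```python
def simulate(x):
    """Stand-in for one EnergyPLAN run: decision vector -> total cost."""
    wind, solar = x
    return (wind - 0.6) ** 2 + (solar - 0.3) ** 2 + 1.0

def pattern_search(f, x0, step=0.25, tol=1e-4):
    """Coordinate pattern search with mesh refinement (GenOpt-style loop)."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (+step, -step):          # probe each coordinate direction
                trial = list(x)
                trial[i] += d
                ft = f(trial)                 # one black-box simulator call
                if ft < fx:                   # keep any improving move
                    x, fx, improved = trial, ft, True
        if not improved:
            step /= 2                         # refine the mesh and retry
    return x, fx

best, cost = pattern_search(simulate, [0.0, 0.0])
```

Each probe is one full simulator run, so the method's appeal is that it needs no gradients from the simulator, exactly the situation when EnergyPLAN scenarios are evaluated as opaque executables.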

  10. Unexpected collateral effects of simulation-based medical education.

    Science.gov (United States)

    Barsuk, Jeffrey H; Cohen, Elaine R; Feinglass, Joe; McGaghie, William C; Wayne, Diane B

    2011-12-01

    Internal medicine residents who complete simulation-based education (SBE) in central venous catheter (CVC) insertion acquire improved skills that yield better patient care outcomes. The collateral effects of SBE on the skills of residents who have not yet experienced SBE are unknown. In this retrospective, observational study, the authors used a checklist to test the internal jugular and subclavian CVC insertion skills of 102 Northwestern University second- and third-year internal medicine residents before they received simulation training. The authors compared, across consecutive academic years (2007-2008, 2008-2009, 2009-2010), mean pretraining scores and the percent of trainees who met or surpassed a minimum passing score (MPS). Mean internal jugular pretest scores improved from 46.7% (standard deviation = 20.8%) in 2007 to 55.7% (±22.5%) in 2008 and 70.8% (±22.4%) in 2009 (P < .001). Mean subclavian pretest scores changed from 48.3% (±25.5%) in 2007 to 45.6% (±31.0%) in 2008 and 63.6% (±27.3%) in 2009 (P = .04). The percentage of residents who met or surpassed the MPS before training for internal jugular insertion was 7% in 2007, 16% in 2008, and 38% in 2009 (P = .004); for subclavian insertion, the percentage was 11% in 2007, 19% in 2008, and 38% in 2009 (P = .028). SBE for senior residents had an effect on junior trainees, as evidenced by pretraining CVC insertion skill improvement across three consecutive years. SBE for a targeted group of residents has implications for skill acquisition among other trainees.

  11. A calculation method for RF couplers design based on numerical simulation by microwave studio

    International Nuclear Information System (INIS)

    Wang Rong; Pei Yuanji; Jin Kai

    2006-01-01

A numerical simulation method for coupler design is proposed. It is based on the matching procedure for the 2π/3 structure given by Dr. R.L. Kyhl. The Microwave Studio EigenMode Solver is used for the numerical simulation. The simulation of a coupler has been completed with this method, and the simulation data are compared with experimental measurements. The results show that this numerical simulation method is feasible for coupler design. (authors)

  12. Numerical simulation of CICC design based on optimization of ratio of copper to superconductor

    International Nuclear Information System (INIS)

    Jiang Huawei; Li Yuan; Yan Shuailing

    2007-01-01

For cable-in-conduit conductor (CICC) structural design, a numerical simulation of the conductor configuration is proposed based on optimization of the copper-to-superconductor ratio. The simulation outcome is in agreement with the engineering design. (authors)

  13. Battery Performance Modelling and Simulation: a Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project has developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the sun and the earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the future best charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, which is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections for new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg-Marquardt algorithm.

  14. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Sugawara, Akihiro; Kishimoto, Yasuaki

    2003-01-01

    Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasma by simulation and to exchange information and carry out joint research with scientists around the world using the internet. The characteristics of SIMON are the following: 1) reduced simulation load through a trigger sending method; 2) visualization of simulation results and a hierarchical structure of analysis; 3) a reduced number of licenses by using the command line when software is used; 4) improved support for networked use of simulation data output by use of HTML (Hyper Text Markup Language); 5) avoidance of complex built-in work in the client part; and 6) small-sized and portable software. The visualization method for large scale simulation, the remote collaboration system based on HTML, the trigger sending method, the hierarchical analytical method, the introduction into a three-dimensional electromagnetic transport code and the technologies of the SIMON system are explained. (S.Y.)

  15. An Efficient WSN Simulator for GPU-Based Node Performance

    OpenAIRE

    Kang, An Na; Kim, Hyun-Woo; Barolli, Leonard; Jeong, Young-Sik

    2013-01-01

    In a wireless sensor network, when sensors are wrongly placed in an observation region, they can quickly run out of batteries or be disconnected. These incidents may result in huge losses in terms of sensing data from numerous sensors and their costs. For this reason, a number of simulators have been developed as tools for effective design and verification before the actual deployment of sensors. While a number of simulators have been developed, simulation results can be fairly limited a...

  16. Believability in simplifications of large scale physically based simulation

    KAUST Repository

    Han, Donghui; Hsu, Shu-wei; McNamara, Ann; Keyser, John

    2013-01-01

    We verify two hypotheses which are assumed to be true only intuitively in many rigid body simulations. I: In large scale rigid body simulation, viewers may not be able to perceive distortion incurred by an approximated simulation method. II: Fixing objects under a pile of objects does not affect the visual plausibility. Visual plausibility of scenarios simulated with these hypotheses assumed true is measured using subjective ratings from viewers. As expected, analysis of the results supports the truthfulness of the hypotheses under certain simulation environments. However, our analysis discovered four factors which may affect the authenticity of these hypotheses: the number of collisions simulated simultaneously, the homogeneity of colliding object pairs, the distance from the scene under simulation to the camera position, and the simulation method used. We also try to find an objective metric of visual plausibility from eye-tracking data collected from viewers. Analysis of these results indicates that eye-tracking does not present a suitable proxy for measuring plausibility or distinguishing between types of simulations. © 2013 ACM.

  17. Simulation Analysis of SPWM Variable Frequency Speed Based on Simulink

    Directory of Open Access Journals (Sweden)

    Min-Yan DI

    2014-01-01

    Full Text Available This article studies the sinusoidal pulse width modulation (SPWM) variable-frequency speed control system, currently a very active research field, and strengthens research on the simulation model of the speed control system with the MATLAB/Simulink/Power System simulation tools, so that the best simulation approach can be found. We apply it to an actual conveyor belt driven by a variable-frequency motor; when the obtained simulation results are compared with the measured data, we show that the method is practical and effective. The results of our research offer guidance to engineering and technical personnel in the CAD design of asynchronous-motor SPWM variable-voltage variable-frequency (VVVF) drives.
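The comparison step at the heart of SPWM generation can be sketched in a few lines: the gate signal is high wherever a sinusoidal reference exceeds a triangular carrier. The sketch below is a generic illustration under assumed parameter values (50 Hz reference, 2 kHz carrier, modulation index 0.8); it is not taken from the article's Simulink model.

```python
import numpy as np

def spwm(f_ref=50.0, f_carrier=2000.0, m=0.8, t_end=0.04, fs=200000):
    """Sinusoidal PWM: the gate signal is high where the sinusoidal
    reference (modulation index m) exceeds a triangular carrier.
    All parameter values here are illustrative assumptions."""
    t = np.arange(0.0, t_end, 1.0 / fs)
    ref = m * np.sin(2 * np.pi * f_ref * t)
    # Triangle wave in [-1, 1) with frequency f_carrier
    carrier = 2.0 * np.abs(2.0 * ((f_carrier * t) % 1.0) - 1.0) - 1.0
    return t, (ref > carrier).astype(float)
```

Averaged over whole reference periods, the duty cycle of the resulting gate signal is 0.5, since the sine reference averages to zero against the symmetric carrier.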

  18. A MATLAB/Simulink based GUI for the CERES Simulator

    Science.gov (United States)

    Valencia, Luis H.

    1995-01-01

    The Clouds and The Earth's Radiant Energy System (CERES) simulator will allow flight operational familiarity with the CERES instrument prior to launch. It will provide a CERES instrument simulation facility for NASA Langley Research Center, NASA Goddard Space Flight Center, and TRW. One objective of building this simulator is to use it as a testbed for functionality checking of atypical memory uploads and for anomaly investigation. For instance, instrument malfunction due to memory damage requires troubleshooting on a simulator to determine the nature of the problem and to find a solution.

  19. Simulation-based education for building clinical teams

    Directory of Open Access Journals (Sweden)

    Marshall Stuart

    2010-01-01

    Full Text Available Failure to work as an effective team is commonly cited as a cause of adverse events and errors in emergency medicine. Until recently, individual knowledge and skills in managing emergencies were taught without reference to the additional skills required to work as part of a team. Team training courses are now becoming commonplace; however, their strategies and modes of delivery vary. Just as different delivery methods in traditional forms of education may lead to different levels of retention and transfer to the real world, the same is true in team training. As team training becomes more widespread, the effectiveness of different modes of delivery, including the role of simulation-based education, needs to be clearly understood. This review examines the basis of team working in emergency medicine and the components of an effective emergency medical team. Lessons from other domains with more experience in team training are discussed, as well as the variations from these settings that can be observed in medical contexts. Methods and strategies for team training are listed, and experiences in other health care settings as well as emergency medicine are assessed. Finally, best practice guidelines for the development of team training programs in emergency medicine are presented.

  20. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.
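The two-stage (nested) Monte Carlo evaluation of expected information gain mentioned above can be illustrated on a toy linear-Gaussian model where the answer is known in closed form. The model, function name, and parameters below are illustrative assumptions only; the paper itself additionally uses polynomial chaos surrogates and stochastic approximation, which are not reproduced here.

```python
import numpy as np

def eig_nested_mc(design, sigma=1.0, n_outer=2000, n_inner=2000, rng=None):
    """Nested Monte Carlo estimate of expected information gain (EIG)
    for the toy model y = design*theta + noise, with prior theta ~ N(0,1)
    and noise ~ N(0, sigma^2).  EIG(d) = E[log p(y|theta,d) - log p(y|d)]."""
    rng = np.random.default_rng(rng)
    theta = rng.standard_normal(n_outer)               # outer prior draws
    y = design * theta + sigma * rng.standard_normal(n_outer)
    # log-likelihood of each simulated y under its generating theta
    log_lik = (-0.5 * ((y - design * theta) / sigma) ** 2
               - np.log(sigma * np.sqrt(2 * np.pi)))
    # inner loop: marginal likelihood p(y|d) ~ (1/M) sum_m p(y|theta_m, d)
    theta_in = rng.standard_normal(n_inner)
    resid = y[:, None] - design * theta_in[None, :]
    log_p = -0.5 * (resid / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
    log_marg = np.logaddexp.reduce(log_p, axis=1) - np.log(n_inner)
    return np.mean(log_lik - log_marg)
```

For this conjugate model the exact EIG is 0.5*log(1 + d²/σ²), which provides a direct check on the estimator.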

  1. A simulator-based study of in-flight auscultation.

    Science.gov (United States)

    Tourtier, Jean-Pierre; Libert, Nicolas; Clapson, Patrick; Dubourdieu, Stéphane; Jost, Daniel; Tazarourte, Karim; Astaud, Cécil-Emmanuel; Debien, Bruno; Auroy, Yves

    2014-04-01

    The use of a stethoscope is essential to the delivery of continuous, supportive en route care during aeromedical evacuations. We compared the capability of 2 stethoscopes (electronic, Littmann 3000; conventional, Littmann Cardiology III) at detecting pathologic heart and lung sounds aboard a C135, a medical transport aircraft. Sounds were mimicked using a mannequin-based simulator, SimMan. Five practitioners examined the mannequin during a flight, with a variety of abnormalities as follows: crackles, wheezing, right and left lung silence, as well as systolic, diastolic, and Austin Flint murmurs. The comparison of diagnoses (correct or wrong) between the electronic and conventional stethoscopes was performed using a McNemar test. A total of 70 evaluations were performed. For cardiac sounds, the diagnosis was right in 0/15 and 4/15 auscultations, respectively, with the conventional and electronic stethoscopes (McNemar test, P = 0.13). For lung sounds, a right diagnosis was reached with the conventional stethoscope in 10/20 auscultations versus 18/20 with the electronic stethoscope (P = 0.013). Flight practitioners involved in aeromedical evacuation on a C135 plane are better able to perform lung auscultation on a mannequin with this amplified stethoscope than with the traditional one. No benefit was found for heart sounds.
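For reference, the McNemar test used in this study operates on the discordant pairs of a paired comparison. A minimal exact (binomial) version is sketched below; the discordant counts in the usage example are hypothetical, since the abstract reports only per-stethoscope totals.

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact (binomial) two-sided McNemar test p-value for paired binary
    outcomes, given the two discordant-pair counts b and c
    (pairs where the two methods disagree)."""
    n, k = b + c, min(b, c)
    # two-sided p: twice the smaller binomial tail under p = 1/2, capped at 1
    tail = sum(comb(n, j) for j in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Hypothetical example: 8 discordant pairs, all favouring one stethoscope
# mcnemar_exact(0, 8) -> 2/256, about 0.0078
```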

  2. Crystallization from a milk-based revised simulated body fluid

    International Nuclear Information System (INIS)

    Dorozhkin, Sergey V; Dorozhkina, Elena I

    2007-01-01

    A milk-based revised simulated body fluid (milk-rSBF) was prepared by a conventional route, but instead of deionized water, all necessary chemicals were dissolved in whole cow's milk (3.2% fat). In order to accelerate crystallization and increase the amount of precipitates, the influence of milk was studied from condensed solutions equal to four times the ionic concentrations of rSBF (4rSBF). The experiments were performed under physiological conditions (solution pH = 7.35-7.40, temperature 37.0 ± 0.2 °C, duration 7 days) in a constant-composition double-diffusion device, which provided slow crystallization under strictly controlled conditions. Similar experiments with 4rSBF dissolved in deionized water were used as a control. An extra set of experiments with 4rSBF dissolved in deionized water but with an addition of 40 g l⁻¹ bovine serum albumin (BSA) was used as another control. The influence of milk appeared to be similar to that of dissolved BSA: some components of milk (presumably albumins and proteins) were found to co-precipitate with calcium phosphates, which had a strong negative influence on both the crystallinity and the crystal sizes of the precipitates. In addition, both milk and BSA strongly inhibited crystallization of calcium phosphates: the precipitates turned out to contain a minor amount of calcium phosphates and a substantial amount of organic phase.

  3. Simulation-based marginal likelihood for cluster strong lensing cosmology

    Science.gov (United States)

    Killedar, M.; Borgani, S.; Fabjan, D.; Dolag, K.; Granato, G.; Meneghetti, M.; Planelles, S.; Ragone-Figueroa, C.

    2018-01-01

    Comparisons between observed and predicted strong lensing properties of galaxy clusters have been routinely used to claim either tension or consistency with Λ cold dark matter cosmology. However, standard approaches to such cosmological tests are unable to quantify the preference for one cosmology over another. We advocate approximating the relevant Bayes factor using a marginal likelihood that is based on the following summary statistic: the posterior probability distribution function for the parameters of the scaling relation between Einstein radii and cluster mass, α and β. We demonstrate, for the first time, a method of estimating the marginal likelihood using the X-ray selected z > 0.5 Massive Cluster Survey clusters as a case in point and employing both N-body and hydrodynamic simulations of clusters. We investigate the uncertainty in this estimate and consequential ability to compare competing cosmologies, which arises from incomplete descriptions of baryonic processes, discrepancies in cluster selection criteria, redshift distribution and dynamical state. The relation between triaxial cluster masses at various overdensities provides a promising alternative to the strong lensing test.

  4. Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator

    Science.gov (United States)

    Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.

    2017-05-01

    Analysis of research on adaptive individual and team-based training shows that, both in Russia and abroad, individual and team-based training and retraining of AASTM operators usually includes: production training; training in general computer and office equipment skills; and simulator training, including virtual simulators which use computers to simulate real-world manufacturing situations, with, as a rule, the evaluation of AASTM operators' knowledge determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to training and retraining of AASTM operators provides only technical training of operators and testing of their knowledge based on assessing their actions in a simulated environment.

  5. A Simulation of Readiness-Based Sparing Policies

    Science.gov (United States)

    2017-06-01

    variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the...

  6. Mesoscale meteorological model based on radioactive explosion cloud simulation

    International Nuclear Information System (INIS)

    Zheng Yi; Zhang Yan; Ying Chuntong

    2008-01-01

    In order to simulate the radioactive cloud movement and concentration distribution of a nuclear explosion or dirty bomb, the mesoscale meteorological model RAMS was used. Particle size, size-activity distribution and gravitational fallout in the cloud were considered. The results show that the model can simulate the 'mushroom' cloud of an explosion. Three-dimensional flow fields and radioactive concentration fields were obtained. (authors)

  7. Event-based simulation of networks with pulse delayed coupling

    Science.gov (United States)

    Klinshov, Vladimir; Nekorkin, Vladimir

    2017-10-01

    Pulse-mediated interactions are common in networks of different nature. Here we develop a general framework for simulation of networks with pulse delayed coupling. We introduce the discrete map governing the dynamics of such networks and describe the computation algorithm for its numerical simulation.
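The event-based approach described here can be illustrated with a toy version: identical phase oscillators whose emitted pulses advance the other oscillators' phases by a small amount after a fixed propagation delay. Between events the phases evolve freely, so the simulation jumps directly from one event (a firing or a delayed pulse arrival) to the next. The specific map, coupling rule, and parameters below are illustrative assumptions, not the discrete map of the paper.

```python
import heapq

def simulate(n=3, eps=0.05, delay=0.2, t_end=20.0):
    """Event-driven simulation of n identical phase oscillators with pulse
    delayed coupling: each phase grows at unit rate, an oscillator fires on
    reaching 1 (reset to 0), and every pulse advances the other oscillators'
    phases by eps after a fixed delay.  Illustrative toy model."""
    phase = [i / n for i in range(n)]     # staggered initial phases
    t = 0.0
    pulses = []                           # min-heap of (arrival_time, sender)
    spikes = []                           # recorded (time, oscillator index)
    while True:
        # candidate events: earliest natural firing, earliest pulse arrival
        i_fire = min(range(n), key=lambda i: 1.0 - phase[i])
        t_fire = t + (1.0 - phase[i_fire])
        t_pulse = pulses[0][0] if pulses else float("inf")
        t_next = min(t_fire, t_pulse)
        if t_next > t_end:
            break
        dt = t_next - t                   # free evolution at unit rate
        phase = [p + dt for p in phase]
        t = t_next
        if t_pulse <= t_fire:             # a delayed pulse arrives
            _, sender = heapq.heappop(pulses)
            for i in range(n):
                if i != sender:
                    phase[i] = min(phase[i] + eps, 1.0 - 1e-9)
        else:                             # oscillator i_fire emits a spike
            phase[i_fire] = 0.0
            spikes.append((t, i_fire))
            heapq.heappush(pulses, (t + delay, i_fire))
    return spikes
```

Because the state is advanced only at event times, the method avoids the fixed-time-step discretization error of naive integration, which is the main appeal of event-based simulation for pulse-coupled networks.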

  8. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    Science.gov (United States)

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. The discrete-time Markov chain based simulator is implemented in Matlab. The simulator, capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed to be used for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show the capability of reproducing different epidemic model behaviors successfully in a computationally efficient manner.
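To make the discrete-time Markov chain mechanics concrete, here is a sketch of a chain-binomial epidemic step for the SIR reduction of such a model (i.e. with the exposed, quarantined and isolated transitions set to zero, as the abstract describes). Parameter names and values are illustrative assumptions; this is not the MOSES source code.

```python
import numpy as np

def dtmc_sir(n=1000, i0=10, beta=0.3, gamma=0.1, steps=200, rng=None):
    """Discrete-time Markov chain SIR simulation with binomial transitions:
    at each step the number of new infections and recoveries is drawn from
    binomial distributions, so the population counts form a Markov chain."""
    rng = np.random.default_rng(rng)
    s, i, r = n - i0, i0, 0
    history = [(s, i, r)]
    for _ in range(steps):
        p_inf = 1.0 - (1.0 - beta / n) ** i      # per-susceptible infection prob.
        new_i = rng.binomial(s, p_inf)           # S -> I transitions
        new_r = rng.binomial(i, gamma)           # I -> R transitions
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        history.append((s, i, r))
    return history
```

Each realization is one sample path of the chain; the total population is conserved exactly, while S is monotonically non-increasing and R non-decreasing.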

  9. Simulation Based Acquisition for NASA's Office of Exploration Systems

    Science.gov (United States)

    Hale, Joe

    2004-01-01

    In January 2004, President George W. Bush unveiled his vision for NASA to advance U.S. scientific, security, and economic interests through a robust space exploration program. This vision includes the goal to extend human presence across the solar system, starting with a human return to the Moon no later than 2020, in preparation for human exploration of Mars and other destinations. In response to this vision, NASA has created the Office of Exploration Systems (OExS) to develop the innovative technologies, knowledge, and infrastructures to explore and support decisions about human exploration destinations, including the development of a new Crew Exploration Vehicle (CEV). Within the OExS organization, NASA is implementing Simulation Based Acquisition (SBA), a robust Modeling & Simulation (M&S) environment integrated across all acquisition phases and programs/teams, to make the realization of the President's vision more certain. Executed properly, SBA will foster better informed, timelier, and more defensible decisions throughout the acquisition life cycle. By doing so, SBA will improve the quality of NASA systems and speed their development, at less cost and risk than would otherwise be the case. SBA is a comprehensive, Enterprise-wide endeavor that necessitates an evolved culture, a revised spiral acquisition process, and an infrastructure of advanced Information Technology (IT) capabilities. SBA encompasses all project phases (from requirements analysis and concept formulation through design, manufacture, training, and operations), professional disciplines, and activities that can benefit from employing SBA capabilities. SBA capabilities include: developing and assessing system concepts and designs; planning manufacturing, assembly, transport, and launch; training crews, maintainers, launch personnel, and controllers; planning and monitoring missions; responding to emergencies by evaluating effects and exploring solutions; and communicating across the OEx

  10. Integrated development of light armored vehicles based on wargaming simulators

    Science.gov (United States)

    Palmarini, Marc; Rapanotti, John

    2004-08-01

    Vehicles are evolving into vehicle networks through improved sensors, computers and communications. Unless carefully planned, these complex systems can result in excessive crew workload and difficulty in optimizing the use of the vehicle. To overcome these problems, a war-gaming simulator is being developed as a common platform to integrate contributions from three different groups. The simulator, OneSAF, is used to integrate simplified models of technology and natural phenomena from scientists and engineers with tactics and doctrine from the military and analyzed in detail by operations analysts. This approach ensures the modelling of processes known to be important regardless of the level of information available about the system. Vehicle survivability can be improved as well with better sensors, computers and countermeasures to detect and avoid or destroy threats. To improve threat detection and reliability, Defensive Aids Suite (DAS) designs are based on three complementary sensor technologies including: acoustics, visible and infrared optics and radar. Both active armour and softkill countermeasures are considered. In a typical scenario, a search radar, providing continuous hemispherical coverage, detects and classifies the threat and cues a tracking radar. Data from the tracking radar is processed and an explosive grenade is launched to destroy or deflect the threat. The angle of attack and velocity from the search radar can be used by the soft-kill system to carry out an infrared search and track or an illuminated range-gated scan for the threat platform. Upon detection, obscuration, countermanoeuvres and counterfire can be used against the threat. The sensor suite is completed by acoustic detection of muzzle blast and shock waves. Automation and networking at the platoon level contribute to improved vehicle survivability. Sensor data fusion is essential in avoiding catastrophic failure of the DAS. 
The modular DAS components can be used with Light Armoured

  11. Comparison of Flight Simulators Based on Human Motion Perception Metrics

    Science.gov (United States)

    Valente Pais, Ana R.; Correia Gracio, Bruno J.; Kelly, Lon C.; Houck, Jacob A.

    2015-01-01

    In flight simulation, motion filters are used to transform aircraft motion into simulator motion. When looking for the best match between visual and inertial amplitude in a simulator, researchers have found that there is a range of inertial amplitudes, rather than a single inertial value, that is perceived by subjects as optimal. This zone, hereafter referred to as the optimal zone, seems to correlate to the perceptual coherence zones measured in flight simulators. However, no studies were found in which these two zones were compared. This study investigates the relation between the optimal and the coherence zone measurements within and between different simulators. Results show that for the sway axis, the optimal zone lies within the lower part of the coherence zone. In addition, it was found that, whereas the width of the coherence zone depends on the visual amplitude and frequency, the width of the optimal zone remains constant.

  12. Low-level tank waste simulant data base

    International Nuclear Information System (INIS)

    Lokken, R.O.

    1996-04-01

    The majority of defense wastes generated from reprocessing spent N-Reactor fuel at Hanford are stored in underground double-shell tanks (DST) and in older single-shell tanks (SST) in the form of liquids, slurries, sludges, and salt cakes. The Tank Waste Remediation System (TWRS) Program has the responsibility of safely managing and immobilizing these tank wastes for disposal. This report discusses three principal topics: the need for and basis for selecting target or reference LLW simulants; tank waste analyses and simulants that have been defined, developed, and used for the GDP; and activities in support of preparing and characterizing simulants for the current LLW vitrification project. The procedures and the data generated to characterize the LLW vitrification simulants are presented in this report. The final section of this report addresses the applicability of the data to the current program and presents recommendations for additional data needs, including characterization and simulant compositional variability studies.

  13. NMR diffusion simulation based on conditional random walk.

    Science.gov (United States)

    Gudbjartsson, H; Patz, S

    1995-01-01

    The authors introduce here a new, very fast simulation method for free diffusion in a linear magnetic field gradient, which is an extension of the conventional Monte Carlo (MC) method or the convolution method described by Wong et al. (in 12th SMRM, New York, 1993, p.10). In earlier NMR-diffusion simulation methods, such as the finite difference (FD) method, the Monte Carlo method, and the deterministic convolution method, the outcome of the calculations depends on the simulation time step. In the authors' method, however, the results are independent of the time step, whereas in the convolution method the step size has to be adequate for spins to diffuse to adjacent grid points. By always selecting the largest possible time step, the computation time can therefore be reduced. Finally, the authors point out that in simple geometric configurations their simulation algorithm can be used to reduce computation time in the simulation of restricted diffusion.
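The free-diffusion case has a closed-form attenuation, exp(-γ²G²DT³/3) for a constant gradient G applied over time T, which makes a minimal Monte Carlo sketch easy to check. The fixed-time-step walk below is for illustration only; the point of the authors' conditional-random-walk method is precisely to remove this dependence on the step size. Parameter values are illustrative.

```python
import numpy as np

def diffusion_attenuation_mc(D=2e-9, G=0.1, gamma=2.675e8, T=10e-3,
                             n_steps=200, n_spins=20000, rng=None):
    """Monte Carlo estimate of NMR signal attenuation for free diffusion in
    a constant linear gradient G over time T.  Each spin performs a Gaussian
    random walk; its accumulated phase is gamma * G * integral of x(t) dt.
    Analytic result for free diffusion: exp(-gamma^2 G^2 D T^3 / 3)."""
    rng = np.random.default_rng(rng)
    dt = T / n_steps
    steps = rng.standard_normal((n_spins, n_steps)) * np.sqrt(2 * D * dt)
    x = np.cumsum(steps, axis=1)                 # positions after each step
    phase = gamma * G * np.sum(x, axis=1) * dt   # accumulated phase per spin
    return np.abs(np.mean(np.exp(1j * phase)))   # magnitude of mean signal
```

With the default (water-like) diffusion coefficient the attenuation is about exp(-0.48) ≈ 0.62, and the Monte Carlo estimate converges to it as the number of spins and steps grows.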

  14. Hemodynamic effects of microgravity and their ground-based simulations

    Science.gov (United States)

    Lobachik, V. I.; Abrosimov, S. V.; Zhidkov, V. V.; Endeka, D. K.

    Hemodynamic effects of simulated microgravity were investigated in various experiments, using radioactive isotopes, in which 40 healthy men aged 35 to 42 years took part. Blood shifts were evaluated qualitatively and quantitatively. Simulation studies included bedrest, head-down tilt (-5° and -15°), and vertical water immersion. It was found that none of the methods could entirely simulate the hemodynamic effects of microgravity. Subjective sensations varied over a wide range and cannot be used to reliably identify the effects of real and simulated microgravity. Renal fluid excretion in real and simulated microgravity differed in terms of volume and time. The experiments yielded data on the general pattern of circulation with blood displaced to the upper body.

  15. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
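One ingredient named above, Bayesian model averaging, reduces at its core to weighting each candidate model by its (approximate) evidence. The sketch below shows only that generic weighting step under an equal-prior assumption; it is not the authors' full integration with Dempster-Shafer theory.

```python
import numpy as np

def bma_weights(log_evidences):
    """Posterior model probabilities from (approximate) log model evidences,
    assuming equal prior model probabilities -- the core weighting step of
    Bayesian model averaging.  Uses a max-shift for numerical stability."""
    log_evidences = np.asarray(log_evidences, dtype=float)
    w = np.exp(log_evidences - np.max(log_evidences))
    return w / w.sum()

# An averaged prediction is then a weighted sum of per-model predictions:
# y_bma = np.dot(bma_weights(log_ev), per_model_predictions)
```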

  16. Simulation-Based Performance Evaluation of Predictive-Hashing Based Multicast Authentication Protocol

    Directory of Open Access Journals (Sweden)

    Seonho Choi

    2012-12-01

Full Text Available A predictive-hashing based Denial-of-Service (DoS) resistant multicast authentication protocol was proposed based upon predictive-hashing, one-way key chain, erasure codes, and distillation codes techniques [4, 5]. It was claimed that this new scheme should be more resistant to various types of DoS attacks, and its worst-case resource requirements were derived in terms of coarse-level system parameters including CPU times for signature verification and erasure/distillation decoding operations, attack levels, etc. To show the effectiveness of our approach and to analyze exact resource requirements in various attack scenarios with different parameter settings, we designed and implemented an attack simulator which is platform-independent. Various attack scenarios may be created with different attack types and parameters against a receiver equipped with the predictive-hashing based protocol. The design of the simulator is explained, and the simulation results are presented with detailed resource usage statistics. In addition, resistance level to various types of DoS attacks is formulated with a newly defined resistance metric. By comparing these results to those from another approach, PRABS [8], we show that the resistance level of our protocol is greatly enhanced even in the presence of many attack streams.

  17. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  18. Serious games experiment toward agent-based simulation

    Science.gov (United States)

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  19. Simulation based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  20. Physics validation of detector simulation tools for LHC

    International Nuclear Information System (INIS)

    Beringer, J.

    2004-01-01

Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data is available. We give an overview of these validation activities with emphasis on the latest results

  1. Web-Based Distributed Simulation of Aeronautical Propulsion System

    Science.gov (United States)

    Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac

    2001-01-01

An application was developed to allow users to run and view the Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple Information Power Grid (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines and IPG/Globus for job scheduling and remote process invocation. Web server scripting was performed by JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.

  2. Computer based training simulator for Hunterston Nuclear Power Station

    International Nuclear Information System (INIS)

    Bowden, R.S.M.; Hacking, D.

    1978-01-01

    For reasons which are stated, the Hunterston-B nuclear power station automatic control system includes a manual over-ride facility. It is therefore essential for the station engineers to be trained to recognise and control all feasible modes of plant and logic malfunction. A training simulator has been built which consists of a replica of the shutdown monitoring panel in the Central Control Room and is controlled by a mini-computer. This paper highlights the computer aspects of the simulator and relevant derived experience, under the following headings: engineering background; shutdown sequence equipment; simulator equipment; features; software; testing; maintenance. (U.K.)

  3. REVIEW OF FLEXIBLE MANUFACTURING SYSTEM BASED ON MODELING AND SIMULATION

    Directory of Open Access Journals (Sweden)

    SAREN Sanjib Kumar

    2016-05-01

Full Text Available This paper presents a literature survey on the use of simulation tools and methodologies for flexible manufacturing system design and operation problems, an area in which simulation has been widely used for system design and analysis. Over this period, simulation has proven to be an extremely useful analysis and optimization tool, and many articles, papers, and conferences have focused directly on the topic. The paper reviews the use of simulation tools and their methodology in flexible manufacturing systems over the period 1982 to 2015.

  4. Laser-wakefield accelerators for medical phase contrast imaging: Monte Carlo simulations and experimental studies

    Science.gov (United States)

    Cipiccia, S.; Reboredo, D.; Vittoria, Fabio A.; Welsh, G. H.; Grant, P.; Grant, D. W.; Brunetti, E.; Wiggins, S. M.; Olivo, A.; Jaroszynski, D. A.

    2015-05-01

X-ray phase contrast imaging (X-PCi) is a very promising method of dramatically enhancing the contrast of X-ray images of microscopic weakly absorbing objects and soft tissue, which may lead to significant advancement in medical imaging with high resolution and low dose. The interest in X-PCi is giving rise to a demand for effective simulation methods. Monte Carlo codes have proved to be a valuable tool for studying X-PCi, including coherent effects. The laser-plasma wakefield accelerator (LWFA) is a very compact particle accelerator that uses plasma as an accelerating medium. Accelerating gradients in excess of 1 GV/cm can be obtained, which makes LWFAs over a thousand times more compact than conventional accelerators. LWFAs are also sources of brilliant betatron radiation, which are promising for applications including medical imaging. We present a study that explores the potential of LWFA-based betatron sources for medical X-PCi, investigate its resolution limit using numerical simulations based on the FLUKA Monte Carlo code, and present preliminary experimental results.

  5. Plan Validation Using DES and Agent-based Simulation

    National Research Council Canada - National Science Library

    Wong, Teck H; Ong, Kim S

    2008-01-01

    .... This thesis explores the possibility of using a multi-agent system (MAS) to generate the aggressor's air strike plans, which could be coupled with a low resolution Discrete Event Simulation (DES...

  6. Nonlinear genetic-based simulation of soil shear strength parameters

    Indian Academy of Sciences (India)

    stress and the excess pore water pressure. If the pore water pressures are measured during the test, the effective ..... A sedimentation test was carried out throughout a hydrometer ...... a C/C++ Simulation Model of a Waste Incinerator Sci-.

  7. Validation techniques of agent based modelling for geospatial simulations

    OpenAIRE

    Darvishi, M.; Ahmadi, G.

    2014-01-01

One of the most interesting aspects of modelling and simulation study is to describe the real world phenomena that have specific properties; especially those that are in large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases it is impossible. Therefore, miniaturization of world phenomena in the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understand the world. Agent...

  8. Systematic uncertainties on Monte Carlo simulation of lead based ADS

    International Nuclear Information System (INIS)

    Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.

    1999-01-01

    Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainties, both from the nuclear data and by the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options both for the data and for the methodology, evaluating the relevance of the different uncertainties. (author)

  9. SELF-ABSORPTION CORRECTIONS BASED ON MONTE CARLO SIMULATIONS

    Directory of Open Access Journals (Sweden)

    Kamila Johnová

    2016-12-01

Full Text Available The main aim of this article is to demonstrate how Monte Carlo simulations are implemented in our gamma spectrometry laboratory at the Department of Dosimetry and Application of Ionizing Radiation in order to calculate the self-absorption within the samples. A model of a real HPGe detector created for MCNP simulations is presented in this paper. All of the possible parameters which may influence the self-absorption are first discussed theoretically and later described using the calculated results.
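The idea behind a Monte Carlo self-absorption correction like the one above can be illustrated with a toy model (this is not the laboratory's MCNP setup): for a slab sample, the self-absorption factor is the mean photon transmission over uniformly sampled emission depths, which has the closed form (1 − e^(−μt))/(μt) to check against. The attenuation coefficient and thickness below are assumed values for illustration.

```python
import math
import random

def self_absorption_mc(mu, thickness, n=200_000, seed=1):
    """Monte Carlo estimate of the mean transmission exp(-mu*x) for
    emission depths x sampled uniformly within a slab of given thickness."""
    rng = random.Random(seed)
    total = sum(math.exp(-mu * rng.uniform(0.0, thickness)) for _ in range(n))
    return total / n

def self_absorption_analytic(mu, thickness):
    """Closed-form slab result: (1 - exp(-mu*t)) / (mu*t)."""
    return (1.0 - math.exp(-mu * thickness)) / (mu * thickness)

mu, t = 0.2, 3.0   # assumed attenuation coefficient (1/cm) and thickness (cm)
mc = self_absorption_mc(mu, t)
exact = self_absorption_analytic(mu, t)
print(mc, exact)   # the Monte Carlo estimate converges to the analytic value
```

A full detector model adds geometry, detector response and photon scattering, but the correction factor reported per sample is conceptually this mean transmission.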

  10. SEAWAT-based simulation of axisymmetric heat transport.

    Science.gov (United States)

    Vandenbohede, Alexander; Louwyck, Andy; Vlamynck, Nele

    2014-01-01

Simulation of heat transport has its applications in geothermal exploitation of aquifers and the analysis of temperature dependent chemical reactions. Under homogeneous conditions and in the absence of a regional hydraulic gradient, groundwater flow and heat transport from or to a well exhibit radial symmetry, and governing equations are reduced by one dimension (1D), which increases computational efficiency considerably. Solute transport codes can simulate heat transport, and input parameters may be modified such that the Cartesian geometry can handle radial flow. In this article, SEAWAT is evaluated as a simulator for heat transport under radial flow conditions. The 1971 1D analytical solution of Gelhar and Collins is used to compare axisymmetric transport with retardation (i.e., as a result of thermal equilibrium between fluid and solid) and a large diffusion (conduction). It is shown that an axisymmetric simulation compares well with a fully three-dimensional (3D) simulation of an aquifer thermal energy storage system. The influence of grid discretization, solver parameters, and advection solution is illustrated. Because of the high diffusion needed to simulate conduction, the convergence criterion for heat transport must be set much smaller (10^-10) than for solute transport (10^-6). Grid discretization should be considered carefully, in particular the subdivision of the screen interval. On the other hand, different methods to calculate the pumping or injection rate distribution over different nodes of a multilayer well lead to small differences only. © 2013, National Ground Water Association.
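The heat-as-solute analogy this record relies on maps thermal properties onto solute-transport parameters. A sketch of the commonly used conversions follows; the property values are illustrative assumptions, not taken from the article:

```python
def heat_transport_analogs(theta, rho_b, c_solid, k_bulk,
                           rho_w=1000.0, c_w=4186.0):
    """Map thermal properties to solute-transport analogs for a
    heat-as-solute simulation (SEAWAT-style formulation).

    theta   : porosity (-)
    rho_b   : bulk density (kg/m^3)
    c_solid : specific heat of the solid (J/kg/K)
    k_bulk  : bulk thermal conductivity (W/m/K)
    rho_w   : water density (kg/m^3)
    c_w     : specific heat of water (J/kg/K)
    """
    # Thermal "distribution coefficient": solid/fluid heat capacity ratio
    kd_temp = c_solid / (rho_w * c_w)                 # m^3/kg
    # Thermal retardation, the analog of sorption retardation
    retardation = 1.0 + (rho_b / theta) * kd_temp     # dimensionless
    # Conduction enters as a large "diffusion" coefficient
    d_temp = k_bulk / (theta * rho_w * c_w)           # m^2/s
    return kd_temp, retardation, d_temp

# Assumed aquifer properties, for illustration only
kd, R, D = heat_transport_analogs(theta=0.3, rho_b=1700.0,
                                  c_solid=800.0, k_bulk=2.5)
print(kd, R, D)
```

The large d_temp produced by typical conductivities is exactly why the abstract notes that the transport convergence criterion must be tightened from 10^-6 to 10^-10.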

  11. A review of virtual reality based training simulators for orthopaedic surgery

    OpenAIRE

    Vaughan, Neil; Dubey, Venketesh N.; Wainwright, Tom; Middleton, Robert

    2015-01-01

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total 11 total hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture tr...

  12. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Science.gov (United States)

    Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred

    2018-01-01

    Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
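The two goodness-of-fit measures used above to judge the calibration, root mean square error and Nash–Sutcliffe efficiency, are straightforward to compute. A minimal sketch with made-up daily GPP values (not flux tower data):

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better
    than predicting the observed mean, negative is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

# Illustrative values only (g C m^-2 day^-1)
obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 1.9, 3.2, 3.9]
print(rmse(obs, sim), nash_sutcliffe(obs, sim))
```

In a Bayesian calibration these metrics are evaluated on simulator output drawn from the posterior parameter distribution, so improvement after calibration shows up as lower RMSE and NSE closer to 1.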

  13. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Directory of Open Access Journals (Sweden)

    R. Raj

    2018-01-01

Full Text Available Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash–Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.

  14. Simulation in the Internet age: the place of web-based simulation in nursing education. An integrative review.

    Science.gov (United States)

    Cant, Robyn P; Cooper, Simon J

    2014-12-01

    The objective of this article was to review the literature on utilisation and place of Web-based simulation within nursing education. Web-based simulation combines electronic multimedia options with a central video or virtual world to produce interactive learning activities mediated by the learner. An integrative review. A search was conducted of healthcare databases between 2000 and 2014 and of Internet sources for hosted simulation programs in nursing. Eighteen primary programs were identified for inclusion. A strategy for integrative review was adopted in which studies were identified, filtered, classified, analysed and compared. Of 18 programs, two game-based programs were identified which represented a 'virtual world' in which students could simultaneously or individually immerse themselves in a character role-play. However, most programs (n=10) taught an aspect of procedural patient care using multimedia (e.g. video, audio, graphics, quiz, text, memo). Time-limited sequences, feedback and reflective activities were often incorporated. Other studies (n=8) taught interpersonal communication skills or technical skills for equipment use. Descriptive study outcomes indicated ease of program use, strong satisfaction with learning and appreciation of program accessibility. Additionally, four studies reported significant improvements in knowledge post-intervention. Web-based simulation is highly acceptable to students and appears to provide learning benefits that align with other simulation approaches and it augments face-to-face teaching. Web-based simulation is likely to have a major place in nursing curricula in the next decade, yet further research is necessary to objectively evaluate learner outcomes and to justify its use. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Fluid, solid and fluid-structure interaction simulations on patient-based abdominal aortic aneurysm models.

    Science.gov (United States)

    Kelly, Sinead; O'Rourke, Malachy

    2012-04-01

    This article describes the use of fluid, solid and fluid-structure interaction simulations on three patient-based abdominal aortic aneurysm geometries. All simulations were carried out using OpenFOAM, which uses the finite volume method to solve both fluid and solid equations. Initially a fluid-only simulation was carried out on a single patient-based geometry and results from this simulation were compared with experimental results. There was good qualitative and quantitative agreement between the experimental and numerical results, suggesting that OpenFOAM is capable of predicting the main features of unsteady flow through a complex patient-based abdominal aortic aneurysm geometry. The intraluminal thrombus and arterial wall were then included, and solid stress and fluid-structure interaction simulations were performed on this, and two other patient-based abdominal aortic aneurysm geometries. It was found that the solid stress simulations resulted in an under-estimation of the maximum stress by up to 5.9% when compared with the fluid-structure interaction simulations. In the fluid-structure interaction simulations, flow induced pressure within the aneurysm was found to be up to 4.8% higher than the value of peak systolic pressure imposed in the solid stress simulations, which is likely to be the cause of the variation in the stress results. In comparing the results from the initial fluid-only simulation with results from the fluid-structure interaction simulation on the same patient, it was found that wall shear stress values varied by up to 35% between the two simulation methods. It was concluded that solid stress simulations are adequate to predict the maximum stress in an aneurysm wall, while fluid-structure interaction simulations should be performed if accurate prediction of the fluid wall shear stress is necessary. Therefore, the decision to perform fluid-structure interaction simulations should be based on the particular variables of interest in a given

  16. Simulation-Based Testing of Pager Interruptions During Laparoscopic Cholecystectomy.

    Science.gov (United States)

    Sujka, Joseph A; Safcsak, Karen; Bhullar, Indermeet S; Havron, William S

    2018-01-30

    To determine if pager interruptions affect operative time, safety, or complications and management of pager issues during a simulated laparoscopic cholecystectomy. Twelve surgery resident volunteers were tested on a Simbionix Lap Mentor II simulator. Each resident performed 6 randomized simulated laparoscopic cholecystectomies; 3 with pager interruptions (INT) and 3 without pager interruptions (NO-INT). The pager interruptions were sent in the form of standardized patient vignettes and timed to distract the resident during dissection of the critical view of safety and clipping of the cystic duct. The residents were graded on a pass/fail scale for eliciting appropriate patient history and management of the pager issue. Data was extracted from the simulator for the following endpoints: operative time, safety metrics, and incidence of operative complications. The Mann-Whitney U test and contingency table analysis were used to compare the 2 groups (INT vs. NO-INT). Level I trauma center; Simulation laboratory. Twelve general surgery residents. There was no significant difference between the 2 groups in any of the operative endpoints as measured by the simulator. However, in the INT group, only 25% of the time did the surgery residents both adequately address the issue and provide effective patient management in response to the pager interruption. Pager interruptions did not affect operative time, safety, or complications during the simulated procedure. However, there were significant failures in the appropriate evaluations and management of pager issues. Consideration for diversion of patient care issues to fellow residents not operating to improve quality and safety of patient care outside the operating room requires further study. Copyright © 2018. Published by Elsevier Inc.

  17. Learning curves and long-term outcome of simulation-based thoracentesis training for medical students

    Science.gov (United States)

    2011-01-01

Background Simulation-based medical education has been widely used in medical skills training; however, the effectiveness and long-term outcome of simulation-based training in thoracentesis requires further investigation. The purpose of this study was to assess the learning curve of simulation-based thoracentesis training, study skills retention and transfer of knowledge to a clinical setting following simulation-based education intervention in thoracentesis procedures. Methods Fifty-two medical students were enrolled in this study. Each participant performed five supervised trials on the simulator. Participant's performance was assessed by performance score (PS), procedure time (PT), and participant's confidence (PC). Learning curves for each variable were generated. Long-term outcome of the training was measured by the retesting and clinical performance evaluation 6 months and 1 year, respectively, after initial training on the simulator. Results Significant improvements in PS, PT, and PC were noted among the first 3 to 4 test trials (p < 0.05). Clinical competency in thoracentesis was improved in participants who received simulation training relative to that of first-year medical residents without such experience (p < 0.05). Simulation-based thoracentesis training can significantly improve an individual's performance. The saturation of learning from the simulator can be achieved after four practice sessions. Simulation-based training can assist in long-term retention of skills and can be partially transferred to clinical practice. PMID:21696584

  18. Simulation-based point-of-care ultrasound training

    DEFF Research Database (Denmark)

    Jensen, J K; Dyre, L; Jørgensen, M E

    2018-01-01

    before being performed on actual patients. The aim of this study was to investigate the learning curves for novices training the FAST protocol on a virtual-reality simulator. METHODS: Ultrasound novices (N = 25) were instructed to complete a FAST training program on a virtual-reality ultrasound simulator....... Participants were instructed to continue training until they reached a previously established mastery learning level, which corresponds to the performance level of a group of ultrasound experts. Performance scores and time used during each FAST examination were used to determine participants' learning curves....... RESULTS: The participants attained the mastery learning level within a median of three (range two to four) attempts corresponding to a median of 1 h 46 min (range 1 h 2 min to 3 h 37 min) of simulation training. The ultrasound novices' examination speed improved significantly with training, and continued...

  19. Hybrid method based on embedded coupled simulation of vortex particles in grid based solution

    Science.gov (United States)

    Kornev, Nikolai

    2017-09-01

The paper presents a novel hybrid approach developed to improve the resolution of concentrated vortices in computational fluid mechanics. The method is based on a combination of a grid-based method and the grid-free computational vortex method (CVM). The large-scale flow structures are simulated on the grid whereas the concentrated structures are modeled using CVM. Due to this combination, the advantages of both methods are strengthened whereas the disadvantages are diminished. The procedure for separating small concentrated vortices from the large-scale ones is based on the LES filtering idea. The flow dynamics is governed by two coupled transport equations taking two-way interaction between large and fine structures into account. The fine structures are mapped back to the grid if their size grows due to diffusion. Algorithmic aspects of the hybrid method are discussed. Advantages of the new approach are illustrated on some simple two-dimensional canonical flows containing concentrated vortices.

  20. Measurement and simulation of the neutron detection efficiency with a Pb-scintillating fiber calorimeter

    Energy Technology Data Exchange (ETDEWEB)

Anelli, M; Bertolucci, S; Curceanu, C; Giovannella, S; Happacher, F; Iliescu, M; Martini, M; Miscetti, S [Laboratori Nazionali di Frascati, INFN (Italy); Battistoni, G [Sezione INFN di Milano (Italy); Bini, C; De Zorzi, G; Di Domenico, A; Gauzzi, P [Universita degli Studi 'La Sapienza' e Sezione INFN di Roma (Italy); Branchini, P; Di Micco, B; Nguyen, F; Passeri, A [Universita degli Studi 'Roma Tre' e Sezione INFN di Roma Tre (Italy); Ferrari, A [Fondazione CNAO, Milano (Italy); Prokofiev, A [Svedberg Laboratory, Uppsala University (Sweden); Fiore, S, E-mail: matteo.martino@inf.infn.i

    2009-04-01

    We have measured the overall detection efficiency of a small prototype of the KLOE Pb-scintillating fiber calorimeter to neutrons in the kinetic energy range [5, 175] MeV. The measurement has been done in a dedicated test beam at the neutron beam facility of The Svedberg Laboratory (TSL), Uppsala. Measurements of the neutron detection efficiency of a NE110 scintillator provided a reference calibration. At the lowest trigger threshold, the overall calorimeter efficiency ranges from 28% to 33%. This value largely exceeds the estimated ~8% expected if the response were proportional only to the scintillator equivalent thickness. A detailed simulation of the calorimeter and of the TSL beam line has been performed with the FLUKA Monte Carlo code. The simulated response of the detector to neutrons is presented together with the first data-to-Monte Carlo comparison. The results show an overall neutron efficiency of about 35%. The reasons for such an efficiency enhancement, in comparison with typical scintillator-based neutron counters, are explained, opening the road to a novel neutron detector.

  1. Model for Analyzing Human Communication Network Based on Agent-Based Simulation

    Science.gov (United States)

    Matsuyama, Shinako; Terano, Takao

    This paper discusses dynamic properties of human communication networks, which appear as a result of information exchanges among people. We propose agent-based simulation (ABS) to examine implicit mechanisms behind the dynamics. The ABS enables us to reveal the characteristics and the differences of the networks regarding specific communication groups. We perform experiments on the ABS with activity data from a questionnaire survey and with virtual data which is different from the activity data. We compare the difference between them and show the effectiveness of the ABS through the experiments.

  2. Simulation Tools for Power Electronics Courses Based on Java Technologies

    Science.gov (United States)

    Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.

    2010-01-01

    This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…

  3. Simulaser, a graphical laser simulator based on Matlab Simulink

    CSIR Research Space (South Africa)

    Jacobs, Cobus

    2016-07-01

    Full Text Available We present a single-element plane-wave laser rate equation model and its implementation as a graphical laser simulation library using Matlab Simulink. Simulink’s graphical interface and vector capabilities provide a unique layer of abstraction...

  4. Simulation-based optimization for product and process design

    NARCIS (Netherlands)

    Driessen, L.

    2006-01-01

    The design of products and processes has gradually shifted from a purely physical process towards a process that heavily relies on computer simulations (virtual prototyping). To optimize this virtual design process in terms of speed and final product quality, statistical methods and mathematical

  5. A simulation based engineering method to support HAZOP studies

    DEFF Research Database (Denmark)

    Enemark-Rasmussen, Rasmus; Cameron, David; Angelo, Per Bagge

    2012-01-01

    the conventional HAZOP procedure. The method systematically generates failure scenarios by considering process equipment deviations with pre-defined failure modes. The effect of failure scenarios is then evaluated using dynamic simulations; in this study the K-Spice® software was used. The consequences of each failure...

  6. Physics-based simulation models for EBSD: advances and challenges

    Science.gov (United States)

    Winkelmann, A.; Nolze, G.; Vos, M.; Salvat-Pujol, F.; Werner, W. S. M.

    2016-02-01

    EBSD has evolved into an effective tool for microstructure investigations in the scanning electron microscope. The purpose of this contribution is to give an overview of various simulation approaches for EBSD Kikuchi patterns and to discuss some of the underlying physical mechanisms.

  7. The Simulation-Based Assessment of Pediatric Rapid Response Teams.

    Science.gov (United States)

    Fehr, James J; McBride, Mary E; Boulet, John R; Murray, David J

    2017-09-01

    To create scenarios of simulated decompensating pediatric patients to train pediatric rapid response teams (RRTs) and to determine whether the scenario scores provide a valid assessment of RRT performance with the hypothesis that RRTs led by intensivists-in-training would be better prepared to manage the scenarios than teams led by nurse practitioners. A set of 10 simulated scenarios was designed for the training and assessment of pediatric RRTs. Pediatric RRTs, comprising a pediatric intensive care unit (PICU) registered nurse and respiratory therapist, led by a PICU intensivist-in-training or a pediatric nurse practitioner, managed 7 simulated acutely decompensating patients. Two raters evaluated the scenario performances and psychometric analyses of the scenarios were performed. The teams readily managed scenarios such as supraventricular tachycardia and opioid overdose but had difficulty with more complicated scenarios such as aortic coarctation or head injury. The management of any particular scenario was reasonably predictive of overall team performance. The teams led by the PICU intensivists-in-training outperformed the teams led by the pediatric nurse practitioners. Simulation provides a method for RRTs to develop decision-making skills in managing decompensating pediatric patients. The multiple scenario assessment provided a moderately reliable team score. The greater scores achieved by PICU intensivist-in-training-led teams provides some evidence to support the validity of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Metamodel-based robust simulation-optimization : An overview

    NARCIS (Netherlands)

    Dellino, G.; Meloni, C.; Kleijnen, J.P.C.; Dellino, Gabriella; Meloni, Carlo

    2015-01-01

    Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a "robust" methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world but replaces his statistical techniques by

  9. Micro-simulation based analysis of railway lines robustness

    DEFF Research Database (Denmark)

    Cerreto, Fabrizio

    2015-01-01

    train course that can represent the total train delay along the line. Dispatching measures in case of disturbance are excluded by applying First-In First-Out rule in simulations. The benefits of modifications to the track infrastructure, the timetable, and the signaling system, in terms of consecutive...

  10. Simulation

    CERN Document Server

    Ross, Sheldon

    2006-01-01

    Ross's Simulation, Fourth Edition introduces aspiring and practicing actuaries, engineers, computer scientists and others to the practical aspects of constructing computerized simulation studies to analyze and interpret real phenomena. Readers learn to apply results of these analyses to problems in a wide variety of fields to obtain effective, accurate solutions and make predictions about future outcomes. This text explains how a computer can be used to generate random numbers, and how to use these random numbers to generate the behavior of a stochastic model over time. It presents the statist

  11. Simulation-Based Performance Assessment: An Innovative Approach to Exploring Understanding of Physical Science Concepts

    Science.gov (United States)

    Gale, Jessica; Wind, Stefanie; Koval, Jayma; Dagosta, Joseph; Ryan, Mike; Usselman, Marion

    2016-01-01

    This paper illustrates the use of simulation-based performance assessment (PA) methodology in a recent study of eighth-grade students' understanding of physical science concepts. A set of four simulation-based PA tasks were iteratively developed to assess student understanding of an array of physical science concepts, including net force,…

  12. Discovery Learning, Representation, and Explanation within a Computer-Based Simulation: Finding the Right Mix

    Science.gov (United States)

    Rieber, Lloyd P.; Tzeng, Shyh-Chii; Tribble, Kelly

    2004-01-01

    The purpose of this research was to explore how adult users interact and learn during an interactive computer-based simulation supplemented with brief multimedia explanations of the content. A total of 52 college students interacted with a computer-based simulation of Newton's laws of motion in which they had control over the motion of a simple…

  13. Developing Clinical Competency in Crisis Event Management: An Integrated Simulation Problem-Based Learning Activity

    Science.gov (United States)

    Liaw, S. Y.; Chen, F. G.; Klainin, P.; Brammer, J.; O'Brien, A.; Samarasekera, D. D.

    2010-01-01

    This study aimed to evaluate the integration of a simulation based learning activity on nursing students' clinical crisis management performance in a problem-based learning (PBL) curriculum. It was hypothesized that the clinical performance of first year nursing students who participated in a simulated learning activity during the PBL session…

  14. Motivational Effect of Web-Based Simulation Game in Teaching Operations Management

    Science.gov (United States)

    Nguyen, Tung Nhu

    2015-01-01

    Motivational effects during a simulated educational game should be studied because a general concern of lecturers is motivating students and increasing their knowledge. Given advances in internet technology, traditional short in-class games are being substituted with long web-based games. To maximize the benefits of web-based simulation games, a…

  15. Toward Simulating Realistic Pursuit-Evasion Using a Roadmap-Based Approach

    KAUST Repository

    Rodriguez, Samuel; Denny, Jory; Zourntos, Takis; Amato, Nancy M.

    2010-01-01

    In this work, we describe an approach for modeling and simulating group behaviors for pursuit-evasion that uses a graph-based representation of the environment and integrates multi-agent simulation with roadmap-based path planning. We demonstrate

  16. Simulation-based training for nurses: Systematic review and meta-analysis.

    Science.gov (United States)

    Hegland, Pål A; Aarlie, Hege; Strømme, Hilde; Jamtvedt, Gro

    2017-07-01

    Simulation-based training is a widespread strategy to improve health-care quality. However, its effect on registered nurses has previously not been established in systematic reviews. The aim of this systematic review is to evaluate the effect of simulation-based training on nurses' skills and knowledge. We searched CDSR, DARE, HTA, CENTRAL, CINAHL, MEDLINE, Embase, ERIC, and SveMed+ for randomised controlled trials (RCT) evaluating the effect of simulation-based training among nurses. Searches were completed in December 2016. Two reviewers independently screened abstracts and full-text, extracted data, and assessed risk of bias. We compared simulation-based training to other learning strategies, high-fidelity simulation to other simulation strategies, and different organisation of simulation training. Data were analysed through meta-analysis and narrative syntheses. GRADE was used to assess the quality of evidence. Fifteen RCTs met the inclusion criteria. For the comparison of simulation-based training to other learning strategies on nurses' skills, six studies in the meta-analysis showed a significant, but small effect in favour of simulation (SMD -1.09, CI -1.72 to -0.47). There was large heterogeneity (I² = 85%). For the other comparisons, there was large between-study variation in results. The quality of evidence for all comparisons was graded as low. The effect of simulation-based training varies substantially between studies. Our meta-analysis showed a significant effect of simulation training compared to other learning strategies, but the quality of evidence was low, indicating uncertainty. Other comparisons showed inconsistency in results. Based on our findings, simulation training appears to be an effective strategy to improve nurses' skills, but further good-quality RCTs with adequate sample sizes are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A review of virtual reality based training simulators for orthopaedic surgery.

    Science.gov (United States)

    Vaughan, Neil; Dubey, Venketesh N; Wainwright, Thomas W; Middleton, Robert G

    2016-02-01

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total 11 hip replacements pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are being increasingly validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement and orthopaedic surgery in general lag behind other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high fidelity hip replacement and resurfacing training simulator. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. Towards a Game-Based Periscope Simulator for Submarine Officers Tactical Training

    Science.gov (United States)

    2016-06-01

    Master's thesis, June 2016. ...career to learn and practice these skills. Following an instructional system design process, this thesis developed a 3D, game-based periscope tactical... Results of this thesis support the use of game-based simulation as training tools and that feedback type could be tailored to individuals based

  19. Simulation based design strategy for EMC compliance of components in hybrid vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Maass, Uwe; Ndip, Ivan; Hoene, Eckard; Guttowski, Stephan [Fraunhofer-Institut fuer Zuverlaessigkeit und Mikrointegration (IZM), Berlin (Germany); Tschoban, Christian; Lang, Klaus-Dieter [Technische Univ. Berlin (Germany)

    2012-11-01

    The design of components for the power train of hybrid vehicles needs to take into account EMC compliance standards related to hazardous electromagnetic fields. Using a simulation-based design strategy allows for virtual EMC tests in parallel to the mechanical/electrical power design and thus reduces (re-)design time and costs. Taking as an example a high-voltage battery for a hybrid vehicle, the emitted magnetic fields outside the battery are examined. The simulation strategy is based on 3D EM simulations using a full-wave and an eddy current solver. The simulation models are based on the actual CAD data from the mechanical construction, resulting in a high geometrical aspect ratio. The impact of simulation-specific aspects such as boundary conditions and excitation is given. It was found that using field simulations it is possible to identify noise sources and coupling paths as well as aid the construction of the battery. (orig.)

  20. Using Simulation-Based Medical Education to Meet the Competency Requirements for the Single Accreditation System.

    Science.gov (United States)

    Riley, Bernadette

    2015-08-01

    Simulation-based medical education can provide medical training in a nonjudgmental, patient-safe, and effective environment. Although simulation has been a relatively new addition to medical education, the aeronautical, judicial, and military fields have used simulation training for hundreds of years, with positive outcomes. Simulation-based medical education can be used in a variety of settings, such as hospitals, outpatient clinics, medical schools, and simulation training centers. As the author describes in the present article, residencies currently accredited by the American Osteopathic Association can use a simulation-based medical education curriculum to meet training requirements of the 6 competencies identified by the Accreditation Council for Graduate Medical Education. The author also provides specific guidance on providing training and assessment in the professionalism competency.

  1. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass

    Directory of Open Access Journals (Sweden)

    Jesús M. Sánchez

    2016-08-01

    Full Text Available Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones with an agent-based social simulator and indoor tracking services.

  2. Current Status of Simulation-Based Training in Graduate Medical Education.

    Science.gov (United States)

    Willis, Ross E; Van Sickle, Kent R

    2015-08-01

    The use of simulation in Graduate Medical Education has evolved significantly over time, particularly during the past decade. The applications of simulation include introductory and basic technical skills, more advanced technical skills, and nontechnical skills, and simulation is gaining acceptance in high-stakes assessments. Simulation-based training has also brought about paradigm shifts in the medical and surgical education arenas and has borne new and exciting national and local consortia that will ensure that the scope and impact of simulation will continue to broaden. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. A General Simulator for Acid-Base Titrations

    Science.gov (United States)

    de Levie, Robert

    1999-07-01

    General formal expressions are provided to facilitate the automatic computer calculation of acid-base titration curves of arbitrary mixtures of acids, bases, and salts, without and with activity corrections based on the Davies equation. Explicit relations are also given for the buffer strength of mixtures of acids, bases, and salts.
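The style of "general formal expressions" for titration curves can be illustrated in the simplest case, a strong acid titrated with a strong base, by inverting the charge balance to give titrant volume as an explicit function of pH. This is a minimal sketch with an illustrative function name; activity corrections via the Davies equation are omitted:

```python
KW = 1e-14  # ion product of water at 25 degrees C

def titrant_volume(ph, ca, va, cb):
    """Volume of strong base (concentration cb) needed to bring a volume
    va of strong acid (concentration ca) to a given pH, obtained by
    inverting the charge balance [H+] - [OH-] = (ca*va - cb*vb)/(va + vb).
    """
    h = 10.0 ** (-ph)
    delta = h - KW / h                    # proton excess [H+] - [OH-]
    return va * (ca - delta) / (cb + delta)

# At the equivalence point (pH 7 for a strong acid/strong base pair),
# the charge balance gives vb = va * ca / cb exactly.
vb_eq = titrant_volume(7.0, ca=0.1, va=25.0, cb=0.1)
```

Sweeping `ph` over a range and plotting `ph` against `titrant_volume(ph, ...)` reproduces the full titration curve without any root finding, which is what makes this inverted form convenient for automatic computation.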

  4. Accurate lithography simulation model based on convolutional neural networks

    Science.gov (United States)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to calculate an entire chip in realistic time, a compact resist model is commonly used; the model is established for faster calculation. To obtain an accurate compact resist model, it is necessary to fit a complicated non-linear model function. However, it is difficult to choose an appropriate function manually because there are many options. This paper proposes a new compact resist model using CNN (Convolutional Neural Networks), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show the CNN model can reduce CD prediction errors by 70% compared with the conventional model.

  5. CASTING IMPROVEMENT BASED ON METAHEURISTIC OPTIMIZATION AND NUMERICAL SIMULATION

    Directory of Open Access Journals (Sweden)

    Radomir Radiša

    2017-12-01

    Full Text Available This paper presents the use of metaheuristic optimization techniques to support the improvement of casting process. Genetic algorithm (GA, Ant Colony Optimization (ACO, Simulated annealing (SA and Particle Swarm Optimization (PSO have been considered as optimization tools to define the geometry of the casting part’s feeder. The proposed methodology has been demonstrated in the design of the feeder for casting Pelton turbine bucket. The results of the optimization are dimensional characteristics of the feeder, and the best result from all the implemented optimization processes has been adopted. Numerical simulation has been used to verify the validity of the presented design methodology and the feeding system optimization in the casting system of the Pelton turbine bucket.
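Of the four metaheuristics listed, simulated annealing is the quickest to sketch. In the minimal version below, a toy one-dimensional objective stands in for the casting-simulation cost of a feeder dimension; the function name, cooling schedule, and all parameter values are illustrative choices, not the paper's:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=1):
    """Minimise 'cost' over one design variable (e.g. a feeder diameter).
    Classic SA loop: random perturbation, Metropolis acceptance,
    geometric cooling."""
    rng = random.Random(seed)
    x, t = x0, t0
    c = cost(x0)
    best_x, best_c = x, c
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        cc = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cc < c or rng.random() < math.exp(-(cc - c) / max(t, 1e-12)):
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = x, c
        t *= cooling
    return best_x, best_c

# Toy stand-in objective with its minimum at x = 3.
x, c = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=0.0)
```

In the paper's setting, `cost` would wrap a numerical casting simulation evaluating shrinkage or feeding adequacy for a candidate feeder geometry.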

  6. Coniferous Canopy BRF Simulation Based on 3-D Realistic Scene

    Science.gov (United States)

    Wang, Xin-yun; Guo, Zhi-feng; Qin, Wen-han; Sun, Guo-qing

    2011-01-01

    It is difficult for computer simulation methods to study the radiation regime at large scale. A simplified coniferous model was investigated in the present study. It makes computer simulation methods such as L-systems and the radiosity-graphics combined method (RGM) more powerful in remote sensing of heterogeneous coniferous forests over a large-scale region. L-systems is applied to render 3-D coniferous forest scenarios, and the RGM model is used to calculate the BRF (bidirectional reflectance factor) in the visible and near-infrared regions. Results in this study show that in most cases both agreed well; at both the tree and forest level, the results are also good.

  7. Simulation of Cell Dielectric Properties Based on COMSOL

    Directory of Open Access Journals (Sweden)

    Shudong Li

    2018-03-01

    Full Text Available The dielectric properties of cells can be observed by injecting a low-amplitude current at different frequencies (1 MHz to 100 MHz). The simulation study was carried out on the COMSOL Multiphysics software platform. The electric field and the cell model are created with prior information. Simulation verifies that at low frequencies the region of interest (ROI) behaves resistively, while the electrical signal cannot pass through the cell membrane due to its capacitor-like properties. As the excitation frequency increases, the ROI behaves more capacitively: more current flows through the cell membrane and the current density increases. The research on cell dielectric properties provides an auxiliary method to diagnose the status of the cell.
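The frequency behaviour described (the membrane blocking current at low frequency and passing it at high frequency) can be reproduced with a lumped-element stand-in for the finite-element model: the transcellular path (cytoplasm resistance in series with membrane capacitance) in parallel with the extracellular medium. Component values below are illustrative only, not from the paper:

```python
import numpy as np

def cell_impedance(freqs, r_medium=1e4, r_cyto=5e3, c_mem=1e-9):
    """Complex impedance of a minimal cell-in-suspension circuit:
    (r_cyto in series with the membrane capacitance c_mem) in parallel
    with the extracellular medium resistance r_medium."""
    w = 2.0 * np.pi * np.asarray(freqs, dtype=float)
    z_mem = 1.0 / (1j * w * c_mem)   # membrane behaves as a capacitor
    z_cell = r_cyto + z_mem          # transcellular branch
    return r_medium * z_cell / (r_medium + z_cell)

z_low = cell_impedance([1e3])[0]    # membrane blocks: |Z| approaches r_medium
z_high = cell_impedance([1e8])[0]   # membrane shorted: |Z| -> r_medium || r_cyto
```

The monotonic fall of |Z| between these limits is the dispersion the simulation probes over 1-100 MHz.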

  8. The Evaluation of ERP Sandtable Simulation Based on AHP

    Science.gov (United States)

    Xu, Lan

    Due to the trend of globalization, many enterprises have extended their business to operate globally. Enterprise resource planning is a powerful management system providing the best business resource information. This paper applies the theory of AHP and presents an ERP sandtable simulation evaluation to show how to make decisions using AHP. This method lets enterprises adequately consider the factors influencing their operation, including feedback and dependence among the factors.
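As a sketch of the AHP step, priority weights can be approximated by the row geometric mean of a reciprocal pairwise-comparison matrix, a common shortcut for the principal eigenvector. The three criteria and the judgments below are invented for illustration:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate AHP priority weights from a reciprocal
    pairwise-comparison matrix via the row geometric mean,
    normalised to sum to one."""
    a = np.asarray(pairwise, dtype=float)
    gm = np.prod(a, axis=1) ** (1.0 / a.shape[1])
    return gm / gm.sum()

# Three criteria: the first is judged 3x as important as the second
# and 5x as important as the third (a perfectly consistent matrix).
w = ahp_priorities([[1.0,   3.0,   5.0],
                    [1/3.0, 1.0,   5/3.0],
                    [1/5.0, 3/5.0, 1.0]])
```

For a perfectly consistent matrix like this one, the geometric-mean weights coincide with the eigenvector solution, giving weights proportional to 15 : 5 : 3.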

  9. Emergency Evacuation of Hazardous Chemical Accidents Based on Diffusion Simulation

    OpenAIRE

    Jiang-Hua Zhang; Hai-Yue Liu; Rui Zhu; Yang Liu

    2017-01-01

    The recent rapid development of information technology, such as sensing technology, communications technology, and database, allows us to use simulation experiments for analyzing serious accidents caused by hazardous chemicals. Due to the toxicity and diffusion of hazardous chemicals, these accidents often lead to not only severe consequences and economic losses, but also traffic jams at the same time. Emergency evacuation after hazardous chemical accidents is an effective means to reduce the...

  10. Voronoi Based Nanocrystalline Generation Algorithm for Atomistic Simulations

    Science.gov (United States)

    2016-12-22

    shown by the screen shot in Fig. 4. First, a 10-nm grain structure is created in a 15- × 15- × 15-nm simulation cell. Here, each grain contains an...configuration file saved as Cu_NC_centroids.config. The nanocrystal_builder.py script is invoked a second time to demonstrate the Config Mode in the lower... output_name = raw_input('Input desired output basename:\
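The core of a Voronoi-based nanocrystal builder is assigning each atom to its nearest grain seed under periodic boundary conditions. A minimal sketch (the function name, seed positions, and cell dimensions are illustrative, not taken from the report's script):

```python
import numpy as np

def voronoi_grain_ids(atom_positions, seeds, box):
    """Assign each atom the index of its nearest grain seed, using the
    minimum-image convention so grains wrap across periodic boundaries
    (the Voronoi-tessellation rule that carves a cell into grains)."""
    atoms = np.asarray(atom_positions, dtype=float)
    seeds = np.asarray(seeds, dtype=float)
    box = np.asarray(box, dtype=float)
    d = atoms[:, None, :] - seeds[None, :, :]
    d -= box * np.round(d / box)             # minimum-image displacement
    return np.argmin((d ** 2).sum(axis=2), axis=1)

# Two seeds in a 10 x 10 x 10 cell; the atom near the origin corner
# belongs to the seed at the far corner via the periodic boundary.
ids = voronoi_grain_ids([[0.5, 0.5, 0.5], [6.0, 6.0, 6.0]],
                        seeds=[[9.5, 9.5, 9.5], [5.0, 5.0, 5.0]],
                        box=[10.0, 10.0, 10.0])
```

Filling each Voronoi region with an independently oriented crystal lattice then yields the nanocrystalline starting configuration for atomistic simulation.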

  11. Risk of portfolio with simulated returns based on copula model

    Science.gov (United States)

    Razak, Ruzanna Ab; Ismail, Noriszura

    2015-02-01

    The commonly used tool for measuring the risk of a portfolio with equally weighted stocks is the variance-covariance method. Under extreme circumstances, this method leads to significant underestimation of actual risk due to its assumption of multivariate normality for the joint distribution of stocks. The purpose of this research is to compare the actual risk of a portfolio with the simulated risk of a portfolio in which the joint distribution of two return series is predetermined. The data used are daily stock prices from the ASEAN market for the period January 2000 to December 2012. The copula approach is applied to capture the time-varying dependence among the return series. The results show that the chosen copula families are not suitable to represent the dependence structures of each bivariate return pair, except for the Philippines-Thailand pair, where the t copula distribution appears to be the appropriate choice to depict its dependence. Assuming that the t copula distribution is the joint distribution of each paired series, simulated returns are generated and value-at-risk (VaR) is then applied to evaluate the risk of each portfolio consisting of two simulated return series. The VaR estimates were found to be symmetrical due to the simulation of returns via the elliptical copula-GARCH approach. By comparison, it is found that the actual risks are underestimated for all pairs of portfolios except for Philippines-Thailand. This study shows that disregarding the non-normal dependence structure of two series results in underestimation of the actual risk of the portfolio.
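The final step of such a pipeline, reading VaR off simulated portfolio returns, can be sketched as follows. A Gaussian copula with standard-normal margins stands in here for the paper's t-copula-GARCH model, and the correlation, sample size, and confidence level are illustrative:

```python
import numpy as np

def simulate_portfolio_var(rho=0.6, n=100_000, alpha=0.05, seed=7):
    """Draw dependent standard-normal return pairs (Gaussian-copula
    stand-in), form an equally weighted two-asset portfolio, and report
    VaR as the positive loss at the lower alpha-quantile."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho],
                    [rho, 1.0]])
    returns = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    port = returns.mean(axis=1)          # equally weighted portfolio
    return -np.quantile(port, alpha)     # VaR as a positive number

var95 = simulate_portfolio_var()
```

With these margins the 95% VaR is analytically about 1.645 times the portfolio standard deviation sqrt((2 + 2*rho)/4), which the empirical quantile should approach for large `n`; swapping in t-copula draws and GARCH-filtered margins recovers the paper's setup.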

  12. Simulation-based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison

    2012-11-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. This is particularly true in pediatric cardiology, due to the wide variation in anatomy observed in congenital heart disease patients. While medical imaging provides increasingly detailed anatomical information, clinicians currently have limited knowledge of important fluid mechanical parameters. Treatment decisions are therefore often made using anatomical information alone, despite the known links between fluid mechanics and disease progression. Patient-specific simulations now offer the means to provide this missing information, and, more importantly, to perform in-silico testing of new surgical designs at no risk to the patient. In this talk, we will outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We will then present new methodology for coupling optimization with simulation and uncertainty quantification to customize treatments for individual patients. Finally, we will present examples in pediatric cardiology that illustrate the potential impact of these tools in the clinical setting.

  13. Simulation of computed tomography dose based on voxel phantom

    Science.gov (United States)

    Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun

    2017-01-01

    Computed Tomography (CT) is one of the preferred and most valuable imaging tools in diagnostic radiology, providing high-quality cross-sectional images of the body. It still delivers higher radiation doses to patients compared to other radiological procedures. The Monte Carlo method is appropriate for estimating the radiation dose during CT examinations. A simulation of the Computed Tomography Dose Index (CTDI) phantom was developed in this paper. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against the reported measured values. The results demonstrate a good agreement between the calculated and the measured doses. From different CT exam simulations using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, whereas the effective dose values for abdomen and pelvis scans are very close. The lowest effective dose resulted from the head scan. Although the dose in CT is related to various parameters, such as tube current, exposure time, beam energy, slice thickness and patient size, this study demonstrates that MC simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and can also be a valuable technique for the design and optimization of the CT x-ray source.
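The Monte Carlo principle such a dose calculation relies on can be shown in miniature: sample exponential photon free paths and count the fraction that interact (and so deposit dose) inside a slab. The attenuation coefficient and thickness below are illustrative, not values from the paper:

```python
import numpy as np

def interaction_fraction(mu, thickness, n=200_000, seed=0):
    """Toy Monte Carlo for photon attenuation: free path lengths are
    exponentially distributed with mean 1/mu, so the fraction of photons
    interacting within the slab estimates 1 - exp(-mu * thickness)."""
    rng = np.random.default_rng(seed)
    paths = rng.exponential(1.0 / mu, size=n)
    return float(np.mean(paths < thickness))

# Illustrative soft-tissue-like value, mu = 0.02 per mm, 100 mm deep,
# so the analytic answer is 1 - exp(-2), about 0.865.
frac = interaction_fraction(mu=0.02, thickness=100.0)
```

A production code like the one in the paper layers on top of this: voxel-dependent attenuation coefficients, interaction physics (photoelectric, Compton), and energy-deposition tallies per organ.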

  14. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    Science.gov (United States)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  15. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to

  16. [The virtual reality simulation research of China Mechanical Virtual Human based on the Creator/Vega].

    Science.gov (United States)

    Wei, Gaofeng; Tang, Gang; Fu, Zengliang; Sun, Qiuming; Tian, Feng

    2010-10-01

    The China Mechanical Virtual Human (CMVH) is a human musculoskeletal biomechanical simulation platform based on China Visible Human slice images, and it has considerable practical application value. This paper introduces the method used to construct the CMVH 3D models. A simulation system solution based on Creator/Vega is then proposed to handle the complexity and very large data volume of these models. Finally, combined with MFC technology, the CMVH simulation system is developed and a running simulation scene is presented. This work provides a new approach to virtual reality applications of CMVH.

  17. iCrowd: agent-based behavior modeling and crowd simulator

    Science.gov (United States)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform that facilitates the implementation of Agent-Based Simulation solutions applicable in several domains of knowledge, such as: (i) crowd behavior simulation during indoor and outdoor evacuation; (ii) non-player character AI for game-oriented applications and gamification activities; (iii) vessel traffic modeling and simulation for maritime security and surveillance applications; (iv) urban and highway traffic and transportation simulations; (v) social behavior simulation and modeling.
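The separation the abstract describes between a simulation kernel and the agent behavior layers on top of it can be illustrated with a minimal sketch; all class and function names below are hypothetical and not taken from the iCrowd codebase.

```python
import math
from dataclasses import dataclass

# Minimal agent-based evacuation sketch. The Agent.step method plays the
# role of a "behavior layer"; simulate() is a trivial simulation kernel.
# All names and parameters are illustrative.

@dataclass
class Agent:
    x: float
    y: float
    speed: float = 1.0

    def step(self, exit_x: float, exit_y: float, dt: float) -> None:
        # Behavior: head straight toward the exit at constant speed,
        # clamping the step so the agent never overshoots the target.
        dx, dy = exit_x - self.x, exit_y - self.y
        dist = math.hypot(dx, dy)
        if dist > 1e-9:
            s = min(self.speed * dt, dist) / dist
            self.x += dx * s
            self.y += dy * s

def simulate(agents, exit_pos=(0.0, 0.0), dt=0.1, steps=100):
    # Kernel: advance every agent for a fixed number of time steps.
    ex, ey = exit_pos
    for _ in range(steps):
        for a in agents:
            a.step(ex, ey, dt)
    return agents

agents = simulate([Agent(5.0, 0.0), Agent(0.0, 3.0)])
print(all(math.hypot(a.x, a.y) < 1e-6 for a in agents))  # True: all reached exit
```

A real engine of the kind described would swap in richer behavior layers (collision avoidance, routing, panic models) behind the same kernel interface.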

  18. Internet-based simulation of resource requirement of buildings; Internetbasierte Simulation des Ressourcenbedarfs von Bauwerken

    Energy Technology Data Exchange (ETDEWEB)

    Neuberg, F.; Rank, E. [Technische Univ. Muenchen, Lehrstuhl fuer Bauinformatik, Muenchen (Germany); Ekkerlein, C.; Faulstich, M. [Technische Univ. Muenchen, Lehrstuhl fuer Wasserguete- und Abfallwirtschaft, Garching (Germany)

    2002-12-01

    Due to the long life cycle of a building, the expenses for usage and maintenance are very high compared to those for construction alone. About one third of the total energy demand in Germany is used for heating, air conditioning and hot water supply in buildings. In addition, the building wastes destined for disposal represent a quantitatively enormous material flow. As the most significant decisions are made in the early planning stages, it is very important to enable planners to estimate the energy demand, resource requirements and ecological impact of different scenarios at this stage. Nowadays more and more architects and engineers use CAD software, such as the Architectural Desktop from Autodesk, supporting the generation of three-dimensional product models. One prominent model, now well established in the building industry, is defined by the Industry Foundation Classes (IFC). This model is also the starting point of our resource-oriented design system. To obtain a basis for a comparative assessment of different design variants, it is first necessary to augment the product model with information on the resource consumption of the materials used. Therefore, a database server accessible via the Internet is being developed within our research project, providing a service for planners to extend their product model with the information necessary for life cycle assessment (LCA) and building energy simulation (EnEV). The data exchange format for the property sets is ifcXML. In our presentation the general software concept together with first simulation tools will be discussed. (orig.)
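The augmentation step described above, attaching per-material resource data to product-model elements before comparing design variants, might be sketched as follows; the database values, element data and property names are all hypothetical stand-ins, not the project's actual ifcXML schema.

```python
# Illustrative sketch: augment simplified product-model elements with
# per-material LCA data from a (here, hard-coded) database, then compute
# a total for comparing design variants. All values are stand-ins.

lca_db = {  # stand-in for the Internet-accessible LCA database
    "concrete": {"embodied_energy_MJ_per_kg": 1.1, "co2_kg_per_kg": 0.16},
    "steel":    {"embodied_energy_MJ_per_kg": 20.1, "co2_kg_per_kg": 1.46},
}

elements = [  # simplified product-model elements (e.g. derived from IFC)
    {"id": "wall-01", "material": "concrete", "mass_kg": 12000},
    {"id": "beam-07", "material": "steel",    "mass_kg": 850},
]

def augment(elements, db):
    # Attach an LCA property set to each element and return the total
    # embodied energy of the design variant, in MJ.
    for el in elements:
        el["pset_lca"] = db[el["material"]]
    return sum(el["mass_kg"] * el["pset_lca"]["embodied_energy_MJ_per_kg"]
               for el in elements)

total = augment(elements, lca_db)
print(round(total))  # 30285
```

Running the same computation over several design variants gives the comparative assessment basis the abstract refers to.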

  19. Passive heat transfer in a turbulent channel flow simulation using large eddy simulation based on the lattice Boltzmann method framework

    Energy Technology Data Exchange (ETDEWEB)

    Wu Hong [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, Beihang University, Beijing 100191 (China); Wang Jiao, E-mail: wangjiao@sjp.buaa.edu.cn [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, Beihang University, Beijing 100191 (China); Tao Zhi [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, Beihang University, Beijing 100191 (China)

    2011-12-15

    Highlights: ► A double MRT-LBM is used to study heat transfer in turbulent channel flow. ► The turbulent Prandtl number is modeled by a dynamic subgrid scale model. ► Temperature gradients are calculated from the non-equilibrium temperature distribution moments. - Abstract: In this paper, a large eddy simulation based on the lattice Boltzmann framework is carried out to simulate the heat transfer in a turbulent channel flow, in which the temperature can be regarded as a passive scalar. A double multiple relaxation time (DMRT) thermal lattice Boltzmann model is employed. While applying DMRT, a multiple relaxation time D3Q19 model is used to simulate the flow field, and a multiple relaxation time D3Q7 model is used to simulate the temperature field. The dynamic subgrid stress model, in which the turbulent eddy viscosity and the turbulent Prandtl number are dynamically computed, is integrated to describe the subgrid effect. Not only the strain rate but also the temperature gradient is calculated locally from the non-equilibrium moments. The Reynolds number based on the shear velocity and channel half height is 180. The molecular Prandtl numbers are set to 0.025 and 0.71. Statistical quantities, such as the average velocity, average temperature, Reynolds stress, root mean square (RMS) velocity fluctuations, RMS temperature and turbulent heat flux, are obtained and compared with the available data. The results demonstrate the reliability of DMRT-LES in studying turbulence.
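The collision-streaming cycle at the heart of any lattice Boltzmann scheme can be shown with a far simpler model than the MRT D3Q19/D3Q7 pair used in the paper: a single-relaxation-time (BGK) D1Q3 lattice for a passive scalar diffusing in one dimension. All parameter values below are illustrative.

```python
import numpy as np

# Minimal BGK lattice Boltzmann sketch for a passive scalar (D1Q3),
# a simplified stand-in for the multiple-relaxation-time D3Q7
# temperature model in the abstract. Parameter values are illustrative.

N, tau, steps = 64, 0.8, 200
w = np.array([4/6, 1/6, 1/6])        # D1Q3 lattice weights
c = np.array([0, 1, -1])             # discrete velocities

T0 = np.zeros(N)
T0[N // 2] = 1.0                     # initial hot spot
f = w[:, None] * T0[None, :]         # initialize distributions at equilibrium

for _ in range(steps):
    T = f.sum(axis=0)                # scalar field = zeroth moment
    feq = w[:, None] * T[None, :]    # equilibrium (zero advection velocity)
    f += (feq - f) / tau             # BGK collision (single relaxation time)
    for i in range(3):
        f[i] = np.roll(f[i], c[i])   # streaming with periodic boundaries

T = f.sum(axis=0)
print(abs(T.sum() - 1.0) < 1e-9)     # True: the scalar is conserved
```

The MRT models of the paper replace the single relaxation rate `1/tau` with a matrix of rates acting on the distribution moments, which is what allows quantities such as the non-equilibrium temperature gradient to be extracted locally.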

  20. Efficient graph-based dynamic load-balancing for parallel large-scale agent-based traffic simulation

    NARCIS (Netherlands)

    Xu, Y.; Cai, W.; Aydt, H.; Lees, M.; Tolk, A.; Diallo, S.Y.; Ryzhov, I.O.; Yilmaz, L.; Buckley, S.; Miller, J.A.

    2014-01-01

    One of the issues of parallelizing large-scale agent-based traffic simulations is partitioning and load-balancing. Traffic simulations are dynamic applications where the distribution of workload in the spatial domain constantly changes. Dynamic load-balancing at run-time has shown better efficiency
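As a loose illustration of the rebalancing idea (not the paper's graph-based algorithm), the following sketch treats spatial regions as workload-weighted items and reassigns them across worker processes with a greedy longest-processing-time heuristic; all names and numbers are hypothetical.

```python
import heapq

# Hypothetical sketch: regions of the simulated road network are weighted
# by current workload (e.g. vehicle count) and reassigned, heaviest first,
# always to the least-loaded worker (LPT heuristic). Illustrative only.

def balance(region_loads, n_workers):
    """Assign regions to workers so the maximum worker load stays small."""
    heap = [(0, i, []) for i in range(n_workers)]  # (load, worker id, regions)
    heapq.heapify(heap)
    for region, load in sorted(region_loads.items(), key=lambda kv: -kv[1]):
        total, wid, regions = heapq.heappop(heap)  # least-loaded worker
        regions.append(region)
        heapq.heappush(heap, (total + load, wid, regions))
    return {wid: (total, regions) for total, wid, regions in heap}

loads = {"A": 8, "B": 7, "C": 6, "D": 5, "E": 4}
plan = balance(loads, 2)
print(sorted(t for t, _ in plan.values()))  # [13, 17]
```

In a dynamic setting the loads change every few simulation steps, so such a rebalancing pass would be rerun periodically, with the graph structure additionally used to keep adjacent regions on the same worker and limit communication.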