WorldWideScience

Sample records for mc simulation method

  1. Biasing transition rate method based on direct MC simulation for probabilistic safety assessment

    Institute of Scientific and Technical Information of China (English)

    Xiao-Lei Pan; Jia-Qun Wang; Run Yuan; Fang Wang; Han-Qing Lin; Li-Qin Hu; Jin Wang

    2017-01-01

    Direct Monte Carlo (MC) simulation is a powerful probabilistic safety assessment method that accounts for the dynamics of the system, but it is not efficient at simulating rare events. A biasing transition rate method based on direct MC simulation is proposed in this paper to solve the problem. The method biases the transition rates of the components by adding virtual components to them in series, increasing the occurrence probability of the rare event and hence decreasing the variance of the MC estimator. Several cases are used to benchmark the method. The results show that it is effective at modeling system failure and collects evidence of rare events more efficiently than direct MC simulation; performance is greatly improved by the biasing transition rate method.
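
    As a rough illustration of the idea (a sketch, not the authors' implementation), the following biases a single component's exponential failure rate upward and corrects each sampled history with the likelihood ratio, so the rare-event estimate stays unbiased while its variance drops. The rates and mission time are hypothetical.

    ```python
    import math
    import random

    def biased_failure_prob(lam, lam_b, t_mission, n_samples, seed=1):
        """Estimate P(failure before t_mission) for an exponential failure
        rate `lam` by sampling from a biased (larger) rate `lam_b` and
        re-weighting each sample with the likelihood ratio."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_samples):
            t = rng.expovariate(lam_b)   # sample from the biased density
            if t < t_mission:            # the rare event under `lam`
                # likelihood ratio of true vs biased exponential densities
                total += (lam * math.exp(-lam * t)) / (lam_b * math.exp(-lam_b * t))
        return total / n_samples

    # Rare event: lam = 1e-5 per hour, 100 h mission -> p ~ 1e-3
    print(biased_failure_prob(1e-5, 1e-2, 100.0, 100_000))
    ```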

  2. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Yican; Song, Jing; Zheng, Huaqing; Sun, Guangyao; Hao, Lijuan; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • The newly developed CAD-based Monte Carlo program SuperMC for integrated simulation of nuclear systems makes use of a hybrid MC-deterministic method and advanced computer technologies. SuperMC is designed to perform transport calculations of various types of particles; depletion and activation calculations including isotope burn-up, material activation and shutdown dose; and multi-physics coupling calculations including thermo-hydraulics, fuel performance and structural mechanics. Bi-directional automatic conversion between general CAD models and calculation models with physical settings is well supported. Results and the simulation process can be visualized with dynamic 3D datasets and geometry models. Continuous-energy cross section, burnup, activation, irradiation damage and material data are used to support the multi-process simulation. An advanced cloud computing framework makes computation- and storage-intensive simulation available as a network service to support design optimization and assessment. The modular design and generic interfaces promote flexible manipulation and coupling of external solvers. • The advanced methods newly developed and incorporated in SuperMC are introduced, including the hybrid MC-deterministic transport method, particle physical interaction treatment method, multi-physics coupling calculation method, automatic geometry modeling and processing method, intelligent data analysis and visualization method, elastic cloud computing technology and parallel calculation method. • The functions of SuperMC 2.1, integrating automatic modeling, neutron and photon transport calculation, and visualization of results and process, are introduced. It has been validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model. - Abstract: The Monte Carlo (MC) method has distinct advantages for simulating complicated nuclear systems and is envisioned as a routine

  3. Integration of OpenMC methods into MAMMOTH and Serpent

    Energy Technology Data Exchange (ETDEWEB)

    Kerby, Leslie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Idaho State Univ., Idaho Falls, ID (United States); DeHart, Mark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Tumulak, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States); Univ. of Michigan, Ann Arbor, MI (United States)

    2016-09-01

    OpenMC, a Monte Carlo particle transport simulation code focused on neutron criticality calculations, contains several methods we wish to emulate in MAMMOTH and Serpent. First, research coupling OpenMC and the Multiphysics Object-Oriented Simulation Environment (MOOSE) has shown promising results. Second, the utilization of Functional Expansion Tallies (FETs) allows for a more efficient passing of multiphysics data between OpenMC and MOOSE. Both of these capabilities have been preliminarily implemented into Serpent. Results are discussed and future work recommended.
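
    FETs represent a tally as coefficients of an orthogonal polynomial expansion instead of histogram bins, which is what makes the multiphysics data transfer compact. Below is a minimal, generic Legendre FET sketch, not OpenMC's or Serpent's implementation; event positions are assumed mapped to [-1, 1] and the sampled flux shape is made up.

    ```python
    import numpy as np
    from numpy.polynomial import legendre as leg

    def fet_coefficients(samples, order):
        """Estimate Legendre functional-expansion-tally coefficients from
        sampled event positions mapped to [-1, 1]."""
        coeffs = []
        for n in range(order + 1):
            basis = leg.Legendre.basis(n)
            # (2n+1)/2 normalizes the Legendre basis on [-1, 1]
            coeffs.append((2 * n + 1) / 2.0 * basis(samples).mean())
        return np.array(coeffs)

    # Hypothetical flux shape ~ 1 - x**2, sampled by rejection
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200_000)
    keep = rng.uniform(0, 1, x.size) < (1 - x**2)
    a = fet_coefficients(x[keep], order=4)
    print(leg.Legendre(a)(0.0))   # reconstructed density near x = 0, ~0.75
    ```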

  4. Developing an interface between MCNP and McStas for simulation of neutron moderators

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Lauritzen, Bent; Nonbøl, Erik

    2012-01-01

    Simulations of target-moderator-reflector systems at spallation sources are conventionally carried out using MCNP/X, whereas simulations of neutron transport and instrument performance are carried out by neutron ray-tracing codes such as McStas. The coupling between the two simulation suites typically consists of providing analytical fits of MCNP/X neutron spectra to McStas. This method is generally successful, but as will be discussed in this paper, there are limitations, and a more direct coupling between MCNP/X and McStas could allow for more accurate simulations of e.g. complex moderator geometries, interference between beamlines, as well as shielding requirements along the neutron guides. In this paper different possible interfaces between McStas and MCNP/X are discussed and first preliminary performance results are shown.

  5. Validation of the intrinsic spatial efficiency method for non-cylindrical homogeneous sources using MC simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz-Ramírez, Pablo, E-mail: rapeitor@ug.uchile.cl; Ruiz, Andrés [Departamento de Física, Facultad de Ciencias, Universidad de Chile (Chile)

    2016-07-07

    Monte Carlo simulation of gamma spectroscopy systems is common practice these days, most often with the MCNP and Geant4 codes. The intrinsic spatial efficiency method is a general and absolute method to determine the absolute efficiency of a spectroscopy system for any extended source, but it had only been demonstrated experimentally for cylindrical sources. Given the difficulty of preparing sources of arbitrary shape, the simplest way to extend the validation is to simulate the spectroscopy system and the source. In this work we present the validation of the intrinsic spatial efficiency method for sources with different geometries and for photons with an energy of 661.65 keV. The simulation does not consider matrix effects (self-attenuation), so these results are only preliminary. The MC simulation is carried out using the FLUKA code and the absolute efficiency of the detector is determined using two methods: the statistical count of the Full Energy Peak (FEP) area (the traditional method) and the intrinsic spatial efficiency method. The results show total agreement between the absolute efficiencies determined by the two methods, with a relative bias of less than 1% in all cases.
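
    For reference, the traditional method mentioned above amounts to dividing the net full-energy-peak counts by the number of photons emitted by the source; a minimal sketch with hypothetical source values:

    ```python
    def fep_absolute_efficiency(net_peak_counts, activity_bq, live_time_s,
                                emission_prob):
        """Traditional full-energy-peak (FEP) absolute efficiency: detected
        peak counts divided by photons emitted by the source."""
        emitted = activity_bq * live_time_s * emission_prob
        return net_peak_counts / emitted

    # Hypothetical Cs-137 source: 661.65 keV line, emission probability ~0.851
    print(fep_absolute_efficiency(12_500, 5_000, 600, 0.851))
    ```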

  6. Interfacing MCNPX and McStas for simulation of neutron transport

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Lauritzen, Bent; Nonbøl, Erik

    2013-01-01

    Simulations of target-moderator-reflector systems at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX[1] or FLUKA[2, 3], whereas simulations of neutron transport from the moderator and the instrument response are performed by neutron ray-tracing codes such as McStas[4, 5, 6, 7]. The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations, as it e.g. does not allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve ... geometries, backgrounds, interference between beam-lines as well as shielding requirements along the neutron guides....

  7. Methods for Monte Carlo simulations of biomacromolecules.

    Science.gov (United States)

    Vitalis, Andreas; Pappu, Rohit V

    2009-01-01

    The state-of-the-art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple movesets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules, is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse-graining strategies.

  8. New developments in the McStas neutron instrument simulation package

    International Nuclear Information System (INIS)

    Willendrup, P K; Knudsen, E B; Klinkby, E; Nielsen, T; Farhi, E; Filges, U; Lefmann, K

    2014-01-01

    The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors, short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  9. Monte Carlo simulations of neutron-scattering instruments using McStas

    DEFF Research Database (Denmark)

    Nielsen, K.; Lefmann, K.

    2000-01-01

    Monte Carlo simulations have become an essential tool for improving the performance of neutron-scattering instruments, since the level of sophistication in instrument design is defeating purely analytical methods. The program McStas, being developed at Risø National Laboratory, includes...

  10. An improved Monte Carlo (MC) dose simulation for charged particle cancer therapy

    Energy Technology Data Exchange (ETDEWEB)

    Ying, C. K. [Advanced Medical and Dental Institute, AMDI, Universiti Sains Malaysia, Penang, Malaysia and School of Medical Sciences, Universiti Sains Malaysia, Kota Bharu (Malaysia); Kamil, W. A. [Advanced Medical and Dental Institute, AMDI, Universiti Sains Malaysia, Penang, Malaysia and Radiology Department, Hospital USM, Kota Bharu (Malaysia); Shuaib, I. L. [Advanced Medical and Dental Institute, AMDI, Universiti Sains Malaysia, Penang (Malaysia); Matsufuji, Naruhiro [Research Centre of Charged Particle Therapy, National Institute of Radiological Sciences, NIRS, Chiba (Japan)

    2014-02-12

    Heavy-particle therapies such as carbon-ion therapy are increasingly popular because of the physical characteristics of charged particles and the minimal side effects for patients. An effective treatment requires high-precision dose calculation. In this work, a Geant4-based Monte Carlo simulation was used to calculate radiation transport and dose distribution, with the same setup as the treatment room of the Heavy Ion Medical Accelerator (HIMAC). A carbon-ion beam at the isocentric gantry nozzle with a therapeutic energy of 290 MeV/u was simulated, and experimental work was carried out at the National Institute of Radiological Sciences (NIRS), Chiba, Japan, using HIMAC to confirm the accuracy and quality of the dose distributions obtained by MC methods. The Geant4-based simulated dose distributions were verified against measurements of the Bragg peak and the spread-out Bragg peak (SOBP). The verification shows that the simulated Bragg peak depth-dose and SOBP distributions are in good agreement with measurements. Overall, the study showed that Geant4-based simulation can be fully applied in the heavy-ion therapy field; further work is needed to refine and improve the Geant4 MC simulations.

  11. An improved Monte Carlo (MC) dose simulation for charged particle cancer therapy

    International Nuclear Information System (INIS)

    Ying, C. K.; Kamil, W. A.; Shuaib, I. L.; Matsufuji, Naruhiro

    2014-01-01

    Heavy-particle therapies such as carbon-ion therapy are increasingly popular because of the physical characteristics of charged particles and the minimal side effects for patients. An effective treatment requires high-precision dose calculation. In this work, a Geant4-based Monte Carlo simulation was used to calculate radiation transport and dose distribution, with the same setup as the treatment room of the Heavy Ion Medical Accelerator (HIMAC). A carbon-ion beam at the isocentric gantry nozzle with a therapeutic energy of 290 MeV/u was simulated, and experimental work was carried out at the National Institute of Radiological Sciences (NIRS), Chiba, Japan, using HIMAC to confirm the accuracy and quality of the dose distributions obtained by MC methods. The Geant4-based simulated dose distributions were verified against measurements of the Bragg peak and the spread-out Bragg peak (SOBP). The verification shows that the simulated Bragg peak depth-dose and SOBP distributions are in good agreement with measurements. Overall, the study showed that Geant4-based simulation can be fully applied in the heavy-ion therapy field; further work is needed to refine and improve the Geant4 MC simulations.

  12. An improved Monte Carlo (MC) dose simulation for charged particle cancer therapy

    International Nuclear Information System (INIS)

    Ying, C.K.; Kamil, W.A.; Shuaib, I.L.; Ying, C.K.; Kamil, W.A.

    2013-01-01

    Full-text: Heavy-particle therapies such as carbon-ion therapy are increasingly popular because of the physical characteristics of charged particles and the minimal side effects for patients. An effective treatment requires high-precision dose calculation. In this work, a Geant4-based Monte Carlo simulation was used to calculate radiation transport and dose distribution, with the same setup as the treatment room of the Heavy Ion Medical Accelerator (HIMAC). A carbon-ion beam at the isocentric gantry nozzle with a therapeutic energy of 290 MeV/u was simulated, and experimental work was carried out at the National Institute of Radiological Sciences (NIRS), Chiba, Japan, using HIMAC to confirm the accuracy and quality of the dose distributions obtained by MC methods. The Geant4-based simulated dose distributions were verified against measurements of the Bragg peak and the spread-out Bragg peak (SOBP). The verification shows that the simulated Bragg peak depth-dose and SOBP distributions are in good agreement with measurements. Overall, the study showed that Geant4-based simulation can be fully applied in the heavy-ion therapy field; further work is needed to refine and improve the Geant4 MC simulations. (author)

  13. Virtual reality-based simulation system for nuclear and radiation safety SuperMC/RVIS

    Energy Technology Data Exchange (ETDEWEB)

    He, T.; Hu, L.; Long, P.; Shang, L.; Zhou, S.; Yang, Q.; Zhao, J.; Song, J.; Yu, S.; Cheng, M.; Hao, L., E-mail: liqin.hu@fds.org.cn [Chinese Academy of Sciences, Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Hefei, Anhu (China)

    2015-07-01

    Suggested work scenarios in a radiation environment need to be iteratively optimized according to the ALARA principle. Based on Virtual Reality (VR) technology and a high-precision whole-body computational voxel phantom, a virtual reality-based simulation system for nuclear and radiation safety, SuperMC/RVIS, has been developed for organ dose assessment and ALARA evaluation of work scenarios in radiation environments. The system architecture, ALARA evaluation strategy, advanced visualization methods and virtual reality technology used in SuperMC/RVIS are described. A case is presented to show its dose assessment and interactive simulation capabilities. (author)

  14. Virtual reality-based simulation system for nuclear and radiation safety SuperMC/RVIS

    International Nuclear Information System (INIS)

    He, T.; Hu, L.; Long, P.; Shang, L.; Zhou, S.; Yang, Q.; Zhao, J.; Song, J.; Yu, S.; Cheng, M.; Hao, L.

    2015-01-01

    Suggested work scenarios in a radiation environment need to be iteratively optimized according to the ALARA principle. Based on Virtual Reality (VR) technology and a high-precision whole-body computational voxel phantom, a virtual reality-based simulation system for nuclear and radiation safety, SuperMC/RVIS, has been developed for organ dose assessment and ALARA evaluation of work scenarios in radiation environments. The system architecture, ALARA evaluation strategy, advanced visualization methods and virtual reality technology used in SuperMC/RVIS are described. A case is presented to show its dose assessment and interactive simulation capabilities. (author)

  15. CAD-based Monte Carlo program for integrated simulation of nuclear system SuperMC

    International Nuclear Information System (INIS)

    Wu, Y.; Song, J.; Zheng, H.; Sun, G.; Hao, L.; Long, P.; Hu, L.

    2013-01-01

    SuperMC is a Computer-Aided Design (CAD) based Monte Carlo (MC) program for integrated simulation of nuclear systems developed by the FDS Team (China), making use of a hybrid MC-deterministic method and advanced computer technologies. The design aims, architecture and main methodology of SuperMC are presented in this paper. The treatment of multi-physics processes and the use of advanced computer technologies such as automatic geometry modeling, intelligent data analysis and visualization, high-performance parallel computing and cloud computing contribute to the efficiency of the code. SuperMC 2.1, the latest version of the code for neutron, photon and coupled neutron-photon transport calculations, has been developed and validated using a series of benchmark cases such as the fusion reactor ITER model and the fast reactor BN-600 model.

  16. McStas 1.1. A freeware package for neutron Monte Carlo ray-tracing simulations

    International Nuclear Information System (INIS)

    Lefmann, K.; Nielsen, K.

    1999-01-01

    Neutron simulation is becoming an indispensable tool for neutron instrument design. At Risø National Laboratory, a user-friendly, versatile, and fast simulation package, McStas, has been developed, which may be freely downloaded from our website. An instrument is described in the McStas meta-language and is composed of elements from the McStas component library, which is under constant development and debugging by both the users and us. The McStas front- and back-ends take care of performing the simulations and displaying their results, respectively. McStas 1.1 facilitates detailed simulations of complicated triple-axis instruments like the Risø RITA spectrometer, and it is equally well equipped for time-of-flight spectrometers. At ECNS'99, a brief tutorial of McStas including a few on-line demonstrations is presented. Further, results from the latest simulation work in the growing McStas user group are presented and the future of the project is discussed. (author)

  17. CloudMC: a cloud computing application for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-01-01

    This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice. (note)
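
    The reported scaling can be checked against Amdahl's law directly. The sketch below backs out the serial fraction implied by the 37× speedup on 64 instances; the extrapolation to 128 instances is only indicative, since the paper notes that the non-parallelizable fraction itself grows with the number of instances.

    ```python
    def amdahl_speedup(serial_fraction, n):
        """Amdahl's law: speedup on n instances given the serial fraction."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

    def serial_fraction_from_speedup(speedup, n):
        """Invert Amdahl's law to back out the serial fraction."""
        return (n / speedup - 1.0) / (n - 1.0)

    # Reported CloudMC result: 37x speedup on 64 instances
    s = serial_fraction_from_speedup(37.0, 64)
    print(f"implied serial fraction ~ {s:.3f}")          # ~ 0.012
    print(f"predicted speedup on 128: {amdahl_speedup(s, 128):.1f}x")
    ```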

  18. CloudMC: a cloud computing application for Monte Carlo simulation.

    Science.gov (United States)

    Miras, H; Jiménez, R; Miras, C; Gomà, C

    2013-04-21

    This work presents CloudMC, a cloud computing application, developed in Windows Azure®, the platform of the Microsoft® cloud, for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based; the simulations just need to be of the form: input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.

  19. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.

    2015-01-07

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. Crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability, but it is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, which has the advantage of being asymptotically optimal for arbitrary RVs. The wide applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this feature is rarely satisfied by variance reduction algorithms, whose performance is typically proven only under restrictive assumptions. The method also offers good efficiency, illustrated by selected simulation results comparing its performance with that of an algorithm based on a conditional MC technique.
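
    As a minimal sketch of the hazard-rate-twisting idea for the special case of i.i.d. exponential variates, where twisting the constant hazard by a factor (1 - theta) has a closed form, the following fixes the twisting parameter by hand; choosing it well for arbitrary RVs is precisely the paper's contribution.

    ```python
    import math
    import random

    def is_tail_prob(lam, n, gamma, theta, n_runs, seed=1):
        """Estimate P(X1 + ... + Xn > gamma) for i.i.d. exponential(lam) RVs
        by hazard-rate twisting: each variate is drawn with hazard
        lam*(1-theta) (heavier tail) and re-weighted by the likelihood ratio
        f(x)/f_theta(x) = exp(-lam*theta*x) / (1-theta)."""
        rng = random.Random(seed)
        est = 0.0
        for _ in range(n_runs):
            xs = [rng.expovariate(lam * (1.0 - theta)) for _ in range(n)]
            if sum(xs) > gamma:
                est += math.prod(math.exp(-lam * theta * x) / (1.0 - theta)
                                 for x in xs)
        return est / n_runs

    # Rare event: P(sum of 5 exp(1) variates > 30), roughly 4e-9
    print(is_tail_prob(1.0, 5, 30.0, theta=0.8, n_runs=200_000))
    ```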

  20. An Efficient Simulation Method for Rare Events

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Closed-form expressions for the sum distribution do not generally exist, which has led to an increasing interest in simulation approaches. Crude Monte Carlo (MC) simulation is the standard technique for the estimation of this type of probability, but it is computationally expensive, especially when dealing with rare events. Variance reduction techniques are alternative approaches that can improve the computational efficiency of naive MC simulations. We propose an Importance Sampling (IS) simulation technique based on the well-known hazard rate twisting approach, which has the advantage of being asymptotically optimal for arbitrary RVs. The wide applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this feature is rarely satisfied by variance reduction algorithms, whose performance is typically proven only under restrictive assumptions. The method also offers good efficiency, illustrated by selected simulation results comparing its performance with that of an algorithm based on a conditional MC technique.

  1. McStas

    DEFF Research Database (Denmark)

    Willendrup, Peter Kjær; Farhi, Emmanuel; Bergbäck Knudsen, Erik

    2014-01-01

    The McStas neutron ray-tracing simulation package is a collaboration between Risø DTU, ILL, University of Copenhagen and PSI. During its lifetime, McStas has evolved to become the world-leading software in the area of neutron scattering simulations for instrument design, optimisation and virtual experiments. McStas is being actively used for the design update of the European Spallation Source (ESS) in Lund. This paper includes an introduction to the McStas package and recent and ongoing simulation projects. Further, new features in releases McStas 1.12c and 2.0 are discussed.

  2. Research on Monte Carlo simulation method of industry CT system

    International Nuclear Information System (INIS)

    Li Junli; Zeng Zhi; Qui Rui; Wu Zhen; Li Chunyan

    2010-01-01

    There are a series of radiation physics problems in the design and production of industry CT systems (ICTS), including limiting quality index analysis and the effects of scattering, detector efficiency and crosstalk on the system. Usually the Monte Carlo (MC) method is applied to solve these problems, but most of the events involved have very low probability, so direct simulation is difficult and existing MC methods and programs cannot meet the needs. To overcome these difficulties, particle flux point auto-important sampling (PFPAIS) is introduced on the basis of auto-important sampling. On the basis of PFPAIS, a dedicated ICTS simulation method, MCCT, is realized. Compared with existing MC methods, MCCT is shown to simulate the ICTS more exactly and effectively. Furthermore, the effects of various disturbances on the ICTS are simulated and analyzed by MCCT. To some extent, MCCT can guide research on the radiation physics problems in ICTS. (author)

  3. McStas 1.1: A tool for building neutron Monte Carlo simulations

    DEFF Research Database (Denmark)

    Lefmann, K.; Nielsen, K.; Tennant, D.A.

    2000-01-01

    McStas is a project to develop general tools for the creation of simulations of neutron scattering experiments. In this paper, we briefly introduce McStas and describe a particular application of the program: the Monte Carlo calculation of the resolution function of a standard triple-axis neutron...

  4. GPM GROUND VALIDATION SATELLITE SIMULATED ORBITS MC3E V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation Satellite Simulated Orbits MC3E dataset is available in the Orbital database, which takes into account the atmospheric profiles, the...

  5. Simulating Controlled Radical Polymerizations with mcPolymer—A Monte Carlo Approach

    Directory of Open Access Journals (Sweden)

    Georg Drache

    2012-07-01

    Full Text Available Utilizing model calculations may lead to a better understanding of the complex kinetics of controlled radical polymerization. We developed a universal simulation tool (mcPolymer) based on the widely used Monte Carlo simulation technique. This article focuses on the software architecture of the program, including its data management and optimization approaches. We are able to simulate polymer chains as individual objects, allowing us to gain more detailed microstructural information about the polymeric products. For all given examples of controlled radical polymerization (nitroxide-mediated radical polymerization (NMRP) homo- and copolymerization, atom transfer radical polymerization (ATRP), reversible addition fragmentation chain transfer polymerization (RAFT)), we present detailed performance analyses demonstrating the influence of the system size, concentrations of reactants, and the peculiarities of the data. Different possibilities are illustrated for finding an adequate balance between precision, memory consumption, and computation time of the simulation. Due to its flexible software architecture, the application of mcPolymer is not limited to controlled radical polymerization, but can be adjusted in a straightforward manner to further polymerization models.
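
    A toy illustration of the chains-as-objects idea (not mcPolymer's architecture or API): a Gillespie-style MC with only propagation and combination termination, tracking every chain's length individually. Rate constants and species counts are arbitrary.

    ```python
    import random

    def simulate_chains(n_mono, n_rad, k_p, k_t, seed=1):
        """Toy MC of radical chain growth with each live chain stored as an
        individual object (its length). Two channels: propagation (rate
        k_p * M * R) and pairwise combination termination
        (rate k_t * R * (R - 1) / 2)."""
        rng = random.Random(seed)
        monomer = n_mono
        chains = [1] * n_rad            # live chains, one entry per radical
        dead = []
        while monomer > 0 and len(chains) > 1:
            r_prop = k_p * monomer * len(chains)
            r_term = k_t * len(chains) * (len(chains) - 1) / 2.0
            if rng.random() < r_prop / (r_prop + r_term):
                chains[rng.randrange(len(chains))] += 1   # add one monomer
                monomer -= 1
            else:                        # combination termination
                i, j = rng.sample(range(len(chains)), 2)
                dead.append(chains[i] + chains[j])
                for k in sorted((i, j), reverse=True):
                    del chains[k]
        dead.extend(chains)
        return dead

    lengths = simulate_chains(n_mono=50_000, n_rad=100, k_p=1e-3, k_t=1e-2)
    print(sum(lengths) / len(lengths))   # number-average chain length
    ```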

  6. Simulation of streamflow in the McTier Creek watershed, South Carolina

    Science.gov (United States)

    Feaster, Toby D.; Golden, Heather E.; Odom, Kenneth R.; Lowery, Mark A.; Conrads, Paul; Bradley, Paul M.

    2010-01-01

    The McTier Creek watershed is located in the Sand Hills ecoregion of South Carolina and is a small catchment within the Edisto River Basin. Two watershed hydrology models were applied to the McTier Creek watershed as part of a larger scientific investigation to expand the understanding of relations among hydrologic, geochemical, and ecological processes that affect fish-tissue mercury concentrations within the Edisto River Basin. The two models are the topography-based hydrological model (TOPMODEL) and the grid-based mercury model (GBMM). TOPMODEL uses the variable-source area concept for simulating streamflow, and GBMM uses a spatially explicit modified curve-number approach for simulating streamflow. The hydrologic output from TOPMODEL can be used explicitly to simulate the transport of mercury in separate applications, whereas the hydrology output from GBMM is used implicitly in the simulation of mercury fate and transport in GBMM. The modeling efforts were a collaboration between the U.S. Geological Survey and the U.S. Environmental Protection Agency, National Exposure Research Laboratory. Calibrations of TOPMODEL and GBMM were done independently while using the same meteorological data and the same period of record of observed data. Two U.S. Geological Survey streamflow-gaging stations were available for comparison of observed daily mean flow with simulated daily mean flow: station 02172300, McTier Creek near Monetta, South Carolina, and station 02172305, McTier Creek near New Holland, South Carolina. The period of record at the Monetta gage covers a broad range of hydrologic conditions, including a drought and a significant wet period. Calibrating the models under these extreme conditions along with the normal flow conditions included in the record enhances the robustness of the two models. Several quantitative assessments of the goodness of fit between model simulations and the observed daily mean flows were done. These included the Nash-Sutcliffe coefficient

  7. Qualification test of few group constants generated from an MC method by the two-step neutronics analysis system McCARD/MASTER

    International Nuclear Information System (INIS)

    Park, Ho Jin; Shim, Hyung Jin; Joo, Han Gyu; Kim, Chang Hyo

    2011-01-01

    The purpose of this paper is to examine the qualification of few-group constants estimated by the Seoul National University Monte Carlo particle transport analysis code McCARD in terms of core neutronics analyses, and thus to validate the McCARD method as a few-group constant generator. Two-step core neutronics analyses are conducted for a mini PWR and a realistic PWR by the McCARD/MASTER code system, in which McCARD is used as an MC group constant generation code and MASTER as a diffusion core analysis code. The two-step calculations of the effective multiplication factors and assembly power distributions of the two PWR cores by McCARD/MASTER are compared with reference McCARD calculations. By showing excellent agreement between McCARD/MASTER and the reference MC core neutronics analyses for the two PWRs, it is concluded that the MC method implemented in McCARD can generate few-group constants that are well qualified for high-accuracy two-step core neutronics calculations. (author)

  8. OpenMC In Situ Source Convergence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldrich, Garrett Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Univ. of California, Davis, CA (United States); Dutta, Soumya [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); The Ohio State Univ., Columbus, OH (United States); Woodring, Jonathan Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-07

    We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation for a user-settable, fixed number of steps, after which convergence is assumed to have been achieved. We instead implement a method to detect convergence, using a stochastic oscillator to identify convergence of source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect when the proper source distribution has been reached and begin tallying results for the full simulation. Our method ensures that tallying does not start too early, from a user setting too optimistic a parameter, or too late, from too conservative a setting.
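
    The entropy diagnostic tallies the Shannon entropy of the fission-source sites binned on a mesh at each batch. The sketch below computes that quantity and applies a crude plateau test, a simplification of the stochastic-oscillator criterion the paper implements; the entropy trace is synthetic.

    ```python
    import numpy as np

    def shannon_entropy(counts):
        """Shannon entropy of a binned fission-source distribution."""
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    def looks_converged(entropies, window=20, tol=0.01):
        """Crude plateau test: the entropy trace is flat if the last
        `window` batches stay within `tol` of their mean."""
        if len(entropies) < window:
            return False
        tail = np.asarray(entropies[-window:])
        return np.abs(tail - tail.mean()).max() < tol

    print(shannon_entropy(np.array([10, 20, 30, 40])))
    # Hypothetical entropy trace: rises, then flattens after ~40 batches
    trace = [4.0 + 1.5 * (1 - np.exp(-b / 10)) for b in range(80)]
    print(looks_converged(trace))
    ```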

  9. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    Science.gov (United States)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction at a plain site in Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on the most common prediction methods (such as indicator Kriging and CoKriging) are implemented in the spMC package. Other, more advanced methods are also available for simulation, e.g. path methods and Bayesian procedures that exploit maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computation via OpenMP constructs. A final analysis of computational efficiency compares the simulation/prediction algorithms using different numbers of CPU cores, considering the example data set of the case study included in the package.
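
    The basic quantity spMC estimates, the empirical one-step transition matrix of a categorical sequence, is easy to illustrate. The sketch below is in Python rather than R and uses a made-up facies log.

    ```python
    import numpy as np

    def transition_probabilities(sequence, n_states):
        """Empirical one-step transition matrix of a categorical
        (lithology) sequence along a borehole."""
        counts = np.zeros((n_states, n_states))
        for a, b in zip(sequence[:-1], sequence[1:]):
            counts[a, b] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts),
                         where=rows > 0)

    # Hypothetical log with 3 facies: 0 = clay, 1 = silt, 2 = sand
    log = [0, 0, 1, 1, 1, 2, 2, 1, 0, 0, 0, 1, 2, 2, 2, 1, 1, 0]
    print(transition_probabilities(log, 3))
    ```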

  10. New developments in the McStas neutron instrument simulation package

    DEFF Research Database (Denmark)

    Willendrup, Peter Kjær; Bergbäck Knudsen, Erik; Klinkby, Esben Bryndt

    2014-01-01

    The McStas neutron ray-tracing software package is extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces and a new method for estimating beam losses and background from neutron optics.

  11. Combined Monte Carlo and path-integral method for simulated library of time-resolved reflectance curves from layered tissue models

    Science.gov (United States)

    Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann

    2009-02-01

    Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
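
    A sketch of the scaling step as described: the zero-absorption curve is multiplied by a Beer-Lambert factor whose exponent weights each layer's absorption coefficient by the fraction of time the average path spends in that layer. The reflectance shape, layer fractions and optical coefficients below are stand-ins, not values from the paper.

    ```python
    import numpy as np

    def scale_reflectance(r_zero_abs, t, mu_a_layers, time_fractions, v):
        """Scale a zero-absorption time-resolved reflectance curve with a
        weighted Beer-Lambert factor. `time_fractions[i](t)` is the
        fraction of time the average photon path spends in layer i."""
        atten = np.zeros_like(t)
        for mu_a, frac in zip(mu_a_layers, time_fractions):
            atten += mu_a * frac(t) * v * t   # path length in layer i
        return r_zero_abs * np.exp(-atten)

    # Hypothetical two-layer model: speed of light in tissue, n ~ 1.4
    v = 3e10 / 1.4                             # cm/s
    t = np.linspace(0.05e-9, 2e-9, 50)         # seconds
    r0 = t**-1.5 * np.exp(-0.5e-9 / t)         # stand-in diffusion-like shape
    frac_top = lambda t: np.clip(0.2e-9 / t, 0.0, 1.0)  # shallow layer early
    fracs = [frac_top, lambda t: 1.0 - frac_top(t)]
    print(scale_reflectance(r0, t, [0.05, 0.15], fracs, v)[:3])
    ```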

  12. The OpenMC Monte Carlo particle transport code

    International Nuclear Information System (INIS)

    Romano, Paul K.; Forget, Benoit

    2013-01-01

    Highlights: ► An open source Monte Carlo particle transport code, OpenMC, has been developed. ► Solid geometry and continuous-energy physics allow high-fidelity simulations. ► Development has focused on high performance and modern I/O techniques. ► OpenMC is capable of scaling up to hundreds of thousands of processors. ► Results on a variety of benchmark problems agree with MCNP5. -- Abstract: A new Monte Carlo code called OpenMC is currently under development at the Massachusetts Institute of Technology as a tool for simulation on high-performance computing platforms. Given that many legacy codes do not scale well on existing and future parallel computer architectures, OpenMC has been developed from scratch with a focus on high performance scalable algorithms as well as modern software design practices. The present work describes the methods used in the OpenMC code and demonstrates the performance and accuracy of the code on a variety of problems.

  13. Simulating vegetation response to climate change in the Blue Mountains with MC2 dynamic global vegetation model

    Directory of Open Access Journals (Sweden)

    John B. Kim

    2018-04-01

    Full Text Available Warming temperatures are projected to greatly alter many forests in the Pacific Northwest. MC2 is a dynamic global vegetation model: a climate-aware, process-based, gridded vegetation model. We calibrated and ran MC2 simulations for the Blue Mountains Ecoregion, Oregon, USA, at 30 arc-second spatial resolution, calibrating MC2 with the best available spatial datasets from land managers. We ran future simulations using climate projections from four global circulation models (GCMs) under representative concentration pathway 8.5. Under this scenario, forest productivity is projected to increase as the growing season lengthens, and fire occurrence is projected to increase steeply throughout the century, with burned area peaking early- to mid-century. Subalpine forests are projected to disappear, and the coniferous forests to contract by 32.8%. Large portions of the dry and mesic forests are projected to convert to woodlands, unless precipitation were to increase. Low levels of change are projected for the Umatilla National Forest, consistently across the four GCMs. For the Wallowa-Whitman and Malheur National Forests, projected forest conversions vary more across the four GCM-based simulations, reflecting high levels of uncertainty arising from climate. In simulations based on three of the four GCMs, sharply increased fire activity decreases forest carbon stocks by mid-century, and the fire activity catalyzes widespread biome shift across the study area. We document the full cycle of a structured approach to calibrating and running MC2 for transparency and to serve as a template for applications of MC2. Keywords: Climate change, Regional change, Simulation, Calibration, Forests, Fire, Dynamic global vegetation model

  14. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.

    2015-06-08

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  15. A fast simulation method for the Log-normal sum distribution using a hazard rate twisting technique

    KAUST Repository

    Rached, Nadhir B.; Benkhelifa, Fatma; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The probability density function of the sum of Log-normally distributed random variables (RVs) is a well-known challenging problem. For instance, an analytical closed-form expression of the Log-normal sum distribution does not exist and is still an open problem. A crude Monte Carlo (MC) simulation is of course an alternative approach. However, this technique is computationally expensive especially when dealing with rare events (i.e. events with very small probabilities). Importance Sampling (IS) is a method that improves the computational efficiency of MC simulations. In this paper, we develop an efficient IS method for the estimation of the Complementary Cumulative Distribution Function (CCDF) of the sum of independent and not identically distributed Log-normal RVs. This technique is based on constructing a sampling distribution via twisting the hazard rate of the original probability measure. Our main result is that the estimation of the CCDF is asymptotically optimal using the proposed IS hazard rate twisting technique. We also offer some selected simulation results illustrating the considerable computational gain of the IS method compared to the naive MC simulation approach.

  16. Diffusion approximation-based simulation of stochastic ion channels: which method to use?

    Directory of Open Access Journals (Sweden)

    Danilo ePezo

    2014-11-01

    Full Text Available To study the effects of stochastic ion channel fluctuations on neural dynamics, several numerical implementation methods have been proposed. Gillespie's method for Markov chain (MC) simulation is highly accurate, yet it becomes computationally intensive in the regime of high channel numbers. Many recent works aim to speed up simulation using the Langevin-based diffusion approximation (DA). Under this common theoretical approach, each implementation differs in how it handles various numerical difficulties, such as bounding of state variables to [0,1]. Here we review and test a set of the most recently published DA implementations (Dangerfield et al., 2012; Linaro et al., 2011; Huang et al., 2013a; Orio and Soudry, 2012; Schmandt and Galán, 2012; Goldwyn et al., 2011; Güler, 2013), comparing all of them in a set of numerical simulations that assess numerical accuracy and computational efficiency on three different models: the original Hodgkin and Huxley model, a model with faster sodium channels, and a multi-compartmental model inspired by granular cells. We conclude that for low channel numbers (usually below 1000 per simulated compartment) one should use MC, which is both the most accurate and the fastest method. For higher channel numbers, we recommend using the method by Orio and Soudry (2012), possibly combined with the method by Schmandt and Galán (2012) for increased speed and slightly reduced accuracy. Consequently, MC modelling may be the best method for detailed multicompartment neuron models, in which a model neuron with many thousands of channels is segmented into many compartments with a few hundred channels.
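
    For concreteness, here is a minimal Langevin/DA update for a single channel population with naive truncation to [0, 1], one of several bounding strategies on which the reviewed implementations differ; the rate constants and channel count are illustrative.

    ```python
    import numpy as np

    def da_step(n, alpha, beta, n_chan, dt, rng):
        """One Langevin (diffusion-approximation) update of an open-channel
        fraction, with naive truncation to [0, 1]."""
        drift = alpha * (1.0 - n) - beta * n
        diff = np.sqrt(max(alpha * (1.0 - n) + beta * n, 0.0) / n_chan)
        n_new = n + drift * dt + diff * np.sqrt(dt) * rng.standard_normal()
        return min(max(n_new, 0.0), 1.0)

    rng = np.random.default_rng(0)
    n, trace = 0.3, []
    for _ in range(10_000):              # fixed voltage: alpha, beta constant
        n = da_step(n, alpha=0.1, beta=0.125, n_chan=500, dt=0.01, rng=rng)
        trace.append(n)
    print(np.mean(trace))   # ~ alpha/(alpha+beta) = 0.444 for many channels
    ```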

  17. ERSN-OpenMC, a Java-based GUI for OpenMC Monte Carlo code

    Directory of Open Access Journals (Sweden)

    Jaafar EL Bakkali

    2016-07-01

    Full Text Available OpenMC is a new Monte Carlo particle transport simulation code focused on solving two types of neutronics problems: k-eigenvalue criticality fission source problems and external fixed fission source problems. OpenMC does not have a graphical user interface, so we provide one with our Java-based application, ERSN-OpenMC. The main feature of this application is to give users an easy-to-use and flexible graphical interface to build better and faster simulations, with less effort and great reliability. Additionally, this graphical tool was developed with several features, such as the ability to automate the building process of the OpenMC code and related libraries; users are also given the freedom to customize their installation of this Monte Carlo code. A full description of the ERSN-OpenMC application is presented in this paper.

  18. MC Sensor—A Novel Method for Measurement of Muscle Tension

    Directory of Open Access Journals (Sweden)

    Sašo Tomažič

    2011-09-01

    Full Text Available This paper presents a new muscle contraction (MC) sensor. The MC sensor is based on a novel principle whereby muscle tension is measured during muscle contractions. During the measurement, the sensor is fixed on the skin surface above the muscle, while the sensor tip applies pressure and causes an indentation of the skin, the intermediate layer directly above the muscle, and the muscle itself. The force on the sensor tip is then measured. This force is roughly proportional to the tension of the muscle. The measurement is non-invasive and selective. Selectivity of MC measurement refers to the specific muscle or part of the muscle that is being measured and is limited by the size of the sensor tip. The sensor is relatively small and light, so that measurements can be performed while the measured subject performs different activities. Test measurements with the MC sensor on the biceps brachii muscle under isometric conditions (elbow angle 90°) showed a high individual linear correlation between the isometric force and MC signal amplitudes (0.97 ≤ r ≤ 1). The measurements also revealed a strong correlation between the MC and electromyogram (EMG) signals, as well as good dynamic behaviour by the MC sensor. We believe that this MC sensor, when fully tested, will be a useful device for muscle mechanics diagnostics and will be complementary to existing methods.

  19. BER Performance Simulation of Generalized MC DS-CDMA System with Time-Limited Blackman Chip Waveform

    Directory of Open Access Journals (Sweden)

    I. Develi

    2010-09-01

    Full Text Available Multiple access interference encountered in multicarrier direct-sequence code division multiple access (MC DS-CDMA) is the most important difficulty; it depends mainly on the correlation properties of the spreading sequences as well as the shape of the chip waveforms employed. In this paper, the bit error rate (BER) performance of a generalized MC DS-CDMA system that employs a time-limited Blackman chip waveform is presented for Nakagami-m fading channels. Simulation results show that the use of the Blackman chip waveform can improve the BER performance of the generalized MC DS-CDMA system, compared to the performance achieved with the time-limited chip waveforms in the literature.

  20. Precipitation kinetics in binary Fe–Cu and ternary Fe–Cu–Ni alloys via kMC method

    Directory of Open Access Journals (Sweden)

    Yi Wang

    2017-08-01

    Full Text Available The precipitation kinetics of coherent Cu-rich precipitates (CRPs) in binary Fe–Cu and ternary Fe–Cu–Ni alloys during thermal aging was modelled by the kinetic Monte Carlo (kMC) method. Good agreement in the precipitation kinetics of Fe–Cu was found between the simulation and experimental results, as observed by means of the advancement factor and cluster number density. This agreement was obtained owing to the correct description of the fast cluster mobility. The simulation results indicate that the effects of Ni are two-fold: Ni promotes the nucleation of Cu clusters, while the precipitation kinetics appears to be delayed by Ni addition during the coarsening stage. The apparent delayed precipitation kinetics is revealed to be related to the cluster mobility, which is reduced by Ni addition. The reduction of cluster mobility weakens as the CRP sizes increase. The results provide a perspective on the effects of solute elements on Cu precipitation kinetics through consideration of the non-conventional cluster growth mechanism, and kMC is verified to be a powerful approach for this problem.
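
    A minimal residence-time (BKL) kMC step, the algorithmic core of such simulations: select an event with probability proportional to its rate, then advance the clock by an exponential waiting time with the total rate. The event list below is a hypothetical placeholder for vacancy-exchange moves with environment-dependent rates.

    ```python
    import math
    import random

    def kmc_step(rates, rng):
        """One kinetic Monte Carlo (BKL/residence-time) step: choose an
        event with probability proportional to its rate, then advance time
        by an exponential waiting time with the total rate."""
        total = sum(rates)
        pick, acc = rng.random() * total, 0.0
        for i, r in enumerate(rates):
            acc += r
            if pick < acc:
                return i, -math.log(rng.random()) / total
        return len(rates) - 1, -math.log(rng.random()) / total

    rng = random.Random(42)
    # Hypothetical event list: vacancy exchanges with Fe, Cu, Ni neighbours
    rates = [1.0e6, 4.0e7, 2.5e6]
    event, dt = kmc_step(rates, rng)
    print(event, dt)
    ```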

  1. Verification of SuperMC with ITER C-Lite neutronic model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Shu [School of Nuclear Science and Technology, University of Science and Technology of China, Hefei, Anhui, 230027 (China); Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); Yu, Shengpeng [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China); He, Peng, E-mail: peng.he@fds.org.cn [Key Laboratory of Neutronics and Radiation Safety, Institute of Nuclear Energy Safety Technology, Chinese Academy of Sciences, Hefei, Anhui, 230031 (China)

    2016-12-15

    Highlights: • Verification of the SuperMC Monte Carlo transport code with the ITER C-Lite model. • Modeling of the ITER C-Lite model using the latest SuperMC/MCAM. • All calculated quantities are in good agreement with MCNP. • Efficient variance reduction methods are adopted to accelerate the calculation. - Abstract: In pursuit of accurate and high-fidelity simulation, the reference model of ITER is becoming more and more detailed and complicated. Due to the complexity in geometry and the thick shielding of the reference model, accurate modeling and precise simulation of fusion neutronics are very challenging. Facing these difficulties, SuperMC, the Monte Carlo simulation software system developed by the FDS Team, has optimized its CAD interface for the automatic conversion of more complicated models and increased its calculation efficiency with advanced variance reduction methods. To demonstrate its capabilities of automatic modeling, neutron/photon coupled simulation and visual analysis for the ITER facility, numerical benchmarks using the ITER C-Lite neutronic model were performed. The nuclear heating in the divertor and inboard toroidal field (TF) coils and a global neutron flux map were evaluated. All the calculated nuclear heating is compared with the results of the MCNP code and good consistency between the two codes is shown. Using the global variance reduction methods in SuperMC, the average speed-up is 292 times for the calculation of inboard TF coil nuclear heating, and 91 times for the calculation of the global flux map, compared with the analog run. These tests show that SuperMC is suitable for the design and analysis of the ITER facility.

  2. The Simulation of the Recharging Method Based on Solar Radiation for an Implantable Biosensor.

    Science.gov (United States)

    Li, Yun; Song, Yong; Kong, Xianyue; Li, Maoyuan; Zhao, Yufei; Hao, Qun; Gao, Tianxin

    2016-09-10

    A method of recharging implantable biosensors based on solar radiation is proposed. First, models of the proposed method are developed. Second, the recharging processes based on solar radiation are simulated using the Monte Carlo (MC) method, and the energy distributions of sunlight within the different layers of human skin are obtained and discussed. Finally, the simulation results are verified experimentally, indicating that the proposed method can contribute to achieving a low-cost, convenient and safe way of recharging implantable biosensors.
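
    A toy sketch of the layered-skin MC idea (absorption only, straight-line paths, made-up optical coefficients); the simulation in the paper also treats scattering and the solar spectrum.

    ```python
    import random

    def layer_absorption(mu_a, thickness, n_photons=100_000, seed=7):
        """Toy 1-D MC: photons travel straight down through skin layers
        with exponentially distributed free paths; tallies the fraction
        absorbed in each layer."""
        rng = random.Random(seed)
        absorbed = [0] * len(mu_a)
        for _ in range(n_photons):
            for i, (mu, d) in enumerate(zip(mu_a, thickness)):
                if rng.expovariate(mu) < d:   # absorbed within this layer
                    absorbed[i] += 1
                    break                      # photon history ends here
        return [c / n_photons for c in absorbed]

    # Hypothetical layer data: epidermis, dermis, subcutis (mu_a in 1/cm)
    print(layer_absorption(mu_a=[2.0, 1.5, 1.0], thickness=[0.01, 0.2, 0.3]))
    ```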

  3. Development of the McGill simulator for endoscopic sinus surgery: a new high-fidelity virtual reality simulator for endoscopic sinus surgery.

    Science.gov (United States)

    Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A

    2014-01-01

    The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.

  4. Hydrologic Process Parameterization of Electrical Resistivity Imaging of Solute Plumes Using POD McMC

    Science.gov (United States)

    Awatey, M. T.; Irving, J.; Oware, E. K.

    2016-12-01

    Markov chain Monte Carlo (McMC) inversion frameworks are becoming increasingly popular in geophysics due to their ability to recover multiple equally plausible geologic features that honor the limited noisy measurements. Standard McMC methods, however, become computationally intractable with increasing dimensionality of the problem, for example, when working with spatially distributed geophysical parameter fields. We present a McMC approach based on a sparse proper orthogonal decomposition (POD) model parameterization that implicitly incorporates the physics of the underlying process. First, we generate training images (TIs) via Monte Carlo simulations of the target process constrained to a conceptual model. We then apply POD to construct basis vectors from the TIs. A small number of basis vectors can represent most of the variability in the TIs, leading to dimensionality reduction. A projection of the starting model into the reduced basis space generates the starting POD coefficients. At each iteration, only coefficients within a specified sampling window are resimulated assuming a Gaussian prior. The sampling window grows at a specified rate as the iterations progress, starting from the coefficients corresponding to the highest-ranked basis vectors and moving toward those of the least informative ones. We found this gradual growth of the sampling window to be more stable than resampling all the coefficients right from the first iteration. We demonstrate the performance of the algorithm with both synthetic and lab-scale electrical resistivity imaging of saline tracer experiments, employing the same set of basis vectors for all inversions. We consider two scenarios of unimodal and bimodal plumes. The unimodal plume is consistent with the hypothesis underlying the generation of the TIs, whereas bimodality in plume morphology was not theorized. We show that uncertainty quantification using McMC can proceed in the reduced dimensionality space while accounting for the
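
    The POD step described above can be made concrete with a short sketch (the grid size, number of TIs and the 99% energy cutoff below are hypothetical choices, not the study's settings): build the basis from the TI ensemble with an SVD and project a starting model onto the truncated coefficients.

        import numpy as np

        rng = np.random.default_rng(0)
        n_cells, n_ti = 2500, 400                   # hypothetical grid size and TI count
        tis = rng.standard_normal((n_cells, n_ti))  # stand-in for MC-simulated plume TIs

        mean = tis.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(tis - mean, full_matrices=False)

        # Keep the basis vectors explaining ~99% of the TI variability
        energy = np.cumsum(s**2) / np.sum(s**2)
        k = int(np.searchsorted(energy, 0.99)) + 1
        basis = U[:, :k]                            # reduced POD basis

        model = tis[:, [0]]                         # starting model (here: the first TI)
        coeffs = basis.T @ (model - mean)           # starting POD coefficients
        reconstruction = mean + basis @ coeffs
        print(k, np.linalg.norm(model - reconstruction))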

  5. Designing new guides and instruments using McStas

    CERN Document Server

    Farhi, E; Wildes, A R; Ghosh, R; Lefmann, K

    2002-01-01

    With the increasing complexity of modern neutron-scattering instruments, the need for powerful tools to optimize their geometry and physical performance (flux, resolution, divergence, etc.) has become essential. As the usual analytical methods reach their limit of validity in the description of fine effects, the use of Monte Carlo simulations, which can handle such effects, has become widespread. The McStas program was developed at Risø National Laboratory in order to provide neutron scattering instrument scientists with an efficient and flexible tool for building Monte Carlo simulations of guides, neutron optics and instruments. To date, the McStas package has been extensively used at the Institut Laue-Langevin, Grenoble, France, for various studies including cold and thermal guides with ballistic geometry, diffractometers, triple-axis, backscattering and time-of-flight spectrometers. In this paper, we present some simulation results concerning different guide geometries that may be used in the future at th...

  6. MC/DC and Toggle Coverage Measurement Tool for FBD Program Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Jung, Se Jin; Kim, Jae Yeob; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2016-05-15

    The functional verification of an FBD program can be carried out with various techniques such as testing and simulation. Simulation is preferable for verifying an FBD program because it also replicates the operation of the PLC, which executes repeatedly, based on its scan time, as long as the controlled system is running; likewise, the simulation operates continuously and sequentially. Although engineers try to verify the functionality exhaustively, it is difficult to find residual errors in the design. Even if 100% functional coverage is accomplished, code coverage may reach only 50%, which might indicate that the scenarios are missing some key features of the design. Unfortunately, errors and bugs are often found at the missing points. To assure a high quality of functional verification, code coverage is as important as functional coverage. We developed a pair of tools, 'FBDSim' and 'FBDCover', for FBD simulation and coverage measurement. 'FBDSim' automatically runs a set of FBD simulation scenarios. While it simulates the FBD program, it calculates the MC/DC and toggle coverage and identifies unstimulated points. After the FBD simulation is done, 'FBDCover' reads the coverage results and shows the coverage graphically and the uncovered points in a tree view. The coverages and uncovered points can help engineers improve the quality of simulation. We treat both coverages only briefly here, but they are handled in a concrete and rigorous manner.
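
    The record gives no implementation details for the coverage measurement itself; a minimal sketch of what toggle coverage could look like (the signal names and traces below are hypothetical) is to check, per Boolean signal, whether both a 0-to-1 and a 1-to-0 transition were observed over the simulated scan cycles.

        def toggle_coverage(traces):
            """traces: dict mapping signal name -> list of 0/1 values, one per scan cycle."""
            covered, uncovered = [], []
            for name, values in traces.items():
                pairs = set(zip(values, values[1:]))
                if (0, 1) in pairs and (1, 0) in pairs:
                    covered.append(name)
                else:
                    uncovered.append(name)       # report unstimulated points
            return len(covered) / len(traces), uncovered

        # Hypothetical simulation traces for three FBD signals
        traces = {"PUMP_ON":   [0, 1, 1, 0, 1],
                  "VALVE_CMD": [0, 0, 1, 1, 1],  # never toggles back to 0
                  "ALARM":     [0, 0, 0, 0, 0]}  # never stimulated
        ratio, missed = toggle_coverage(traces)
        print(f"toggle coverage: {ratio:.0%}, uncovered: {missed}")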

  7. McStas and Mantid integration

    DEFF Research Database (Denmark)

    Nielsen, T. R.; Markvardsen, A. J.; Willendrup, Peter Kjær

    2015-01-01

    McStas and Mantid are two well-established software frameworks within the neutron scattering community. McStas has been primarily used for simulating the neutron transport mechanisms in instruments, while Mantid has been primarily used for data reduction. We report here the status of our work don...

  8. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by
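
    The goMC source is not part of the record, but the portability argument is easy to demonstrate independently: the same OpenCL host code can enumerate and target GPU and CPU devices alike. A minimal, hypothetical sketch using the pyopencl bindings (not goMC code):

        import numpy as np
        import pyopencl as cl

        # List every OpenCL platform/device visible on this machine (GPU or CPU)
        for platform in cl.get_platforms():
            for device in platform.get_devices():
                print(platform.name, "->", device.name,
                      cl.device_type.to_string(device.type))

        # Run the same trivial kernel on whatever device is available
        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        x = np.arange(8, dtype=np.float32)
        buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                        hostbuf=x)
        prg = cl.Program(ctx, """
        __kernel void square(__global float *x) {
            int i = get_global_id(0);
            x[i] *= x[i];
        }
        """).build()
        prg.square(queue, x.shape, None, buf)
        cl.enqueue_copy(queue, x, buf)
        print(x)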

  9. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    Science.gov (United States)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by

  10. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    International Nuclear Information System (INIS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-01-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphic processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia’s CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL based cross-platform MC dose engine named goMC with coupled photon–electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783–97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in EGSnrc MC package and PENELOPE’s random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogenous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48–0.53% for the electron beam cases and 0.15–0.17% for the photon beam cases. In terms of efficiency, goMC was ∼4–16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was

  11. A Table-Based Random Sampling Simulation for Bioluminescence Tomography

    Directory of Open Access Journals (Sweden)

    Xiaomeng Zhang

    2006-01-01

    Full Text Available As a popular simulation of photon propagation in turbid media, the main problem of the Monte Carlo (MC) method is its cumbersome computation. In this work a table-based random sampling simulation (TBRS) is proposed. The key idea of TBRS is to simplify multiple scattering steps into a single-step process through random table querying, thus greatly reducing the computational complexity of the conventional MC algorithm and expediting the computation. The TBRS simulation is a fast equivalent of the conventional MC simulation of photon propagation. It retains the merits of flexibility and accuracy of the conventional MC method and adapts well to complex geometric media and various source shapes. Both MC simulations were conducted in a homogeneous medium in our work. We also present a reconstruction approach to estimate the position of the fluorescent source, based on trial and error, as a validation of the TBRS algorithm. Good agreement is found between the conventional MC simulation and the TBRS simulation.
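
    The TBRS tables themselves are not given in the record; the sketch below illustrates only the general idea under toy assumptions: run a conventional multi-step MC once to build an empirical displacement table, then replace the multi-step walk by a single random table query.

        import math
        import random

        def multi_step_displacement(mu_t=10.0, n_steps=20):
            """Conventional MC: net radial displacement after n isotropic scattering steps."""
            x = y = z = 0.0
            for _ in range(n_steps):
                s = -math.log(random.random()) / mu_t
                cos_t = 2.0 * random.random() - 1.0
                phi = 2.0 * math.pi * random.random()
                sin_t = math.sqrt(1.0 - cos_t * cos_t)
                x += s * sin_t * math.cos(phi)
                y += s * sin_t * math.sin(phi)
                z += s * cos_t
            return math.sqrt(x * x + y * y + z * z)

        # Build the lookup table once (expensive); sorted so percentiles can be read off
        table = sorted(multi_step_displacement() for _ in range(100000))

        def tbrs_sample():
            return table[random.randrange(len(table))]  # one query replaces 20 steps

        print(sum(tbrs_sample() for _ in range(10)) / 10)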

  12. Dynamic bounds coupled with Monte Carlo simulations

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2011-01-01

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper

  13. Human reliability-based MC and A methods for evaluating the effectiveness of protecting nuclear material - 59379

    International Nuclear Information System (INIS)

    Duran, Felicia A.; Wyss, Gregory D.

    2012-01-01

    Material control and accountability (MC and A) operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. MC and A activities, from monitoring to inventory measurements, provide critical information about target materials and define security elements that are useful against insider threats. However, these activities have been difficult to characterize in ways that are compatible with the path analysis methods that are used to systematically evaluate the effectiveness of a site's protection system. The path analysis methodology focuses on a systematic, quantitative evaluation of the physical protection component of the system for potential external threats, and often calculates the probability that the physical protection system (PPS) is effective (PE) in defeating an adversary who uses that attack pathway. In previous work, Dawson and Hester observed that many MC and A activities can be considered a type of sensor system with alarm and assessment capabilities that provide recurring opportunities for 'detecting' the status of critical items. This work has extended that characterization of MC and A activities as probabilistic sensors that are interwoven within each protection layer of the PPS. In addition, MC and A activities have similar characteristics to operator tasks performed in a nuclear power plant (NPP) in that the reliability of these activities depends significantly on human performance. Many of the procedures involve human performance in checking for anomalous conditions. Further characterization of MC and A activities as operational procedures that check the status of critical assets provides a basis for applying human reliability analysis (HRA) models and methods to determine probabilities of detection for MC and A protection elements. This paper will discuss the application of HRA methods used in nuclear power plant probabilistic risk assessments to define detection

  14. Stochastic approach to municipal solid waste landfill life based on the contaminant transit time modeling using the Monte Carlo (MC) simulation

    International Nuclear Information System (INIS)

    Bieda, Bogusław

    2013-01-01

    The paper is concerned with the application and benefits of MC simulation proposed for estimating the life of a modern municipal solid waste (MSW) landfill. The software Crystal Ball® (CB), a simulation program that helps analyze the uncertainties associated with Microsoft® Excel models by MC simulation, was proposed to calculate the transit time of contaminants in porous media. The transport of contaminants in soil is represented by the one-dimensional (1D) form of the advection-dispersion equation (ADE). The computer program CONTRANS, written in the MATLAB language, is the foundation for simulating and estimating the thickness of the landfill compacted clay liner. In order to simplify the task of determining the uncertainty of parameters by MC simulation, the parameters corresponding to the expression Z2 taken from this program were used for the study. The tested parameters are: hydraulic gradient (HG), hydraulic conductivity (HC), porosity (POROS), liner thickness (TH) and diffusion coefficient (EDC). The principal output report provided by CB and presented in the study consists of the frequency chart, percentiles summary and statistics summary. Additional CB options provide a sensitivity analysis with tornado diagrams. The data that was used include available published figures as well as data concerning the Mittal Steel Poland (MSP) S.A. plant in Kraków, Poland. This paper discusses the results and shows that the presented approach is applicable to any MSW landfill compacted clay liner thickness design. -- Highlights: ► Numerical simulation of waste in porous media is proposed. ► Statistical outputs based on correct assumptions about probability distributions are presented. ► The benefits of a MC simulation are examined. ► The uniform probability distribution is studied. ► I report a useful tool applied to determine the life of a modern MSW landfill.
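
    The exact expression Z2 from CONTRANS is not reproduced in the record. Purely to illustrate the MC parameter-uncertainty step, the sketch below propagates uniform distributions of four of the tested parameters through a simplified advection-only transit time t = TH * POROS / (HC * HG), ignoring dispersion via EDC; both this simplification and all parameter ranges are assumptions, not the study's model or the MSP site data.

        import random
        import statistics

        def transit_time_years(hg, hc, poros, th):
            # Advective transit time through a clay liner: t = TH * POROS / (HC * HG)
            seconds = th * poros / (hc * hg)
            return seconds / (365.25 * 24 * 3600)

        trials = []
        for _ in range(10000):                    # 10,000 trials, as in the study
            hg = random.uniform(0.1, 0.3)         # hydraulic gradient [-]
            hc = random.uniform(1e-8, 1e-7)       # hydraulic conductivity [cm/s]
            poros = random.uniform(0.3, 0.5)      # porosity [-]
            th = random.uniform(60.0, 120.0)      # liner thickness [cm]
            trials.append(transit_time_years(hg, hc, poros, th))

        trials.sort()
        print("median [years]:", statistics.median(trials))
        print("5th/95th percentiles [years]:", trials[500], trials[9500])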

  15. Stochastic approach to municipal solid waste landfill life based on the contaminant transit time modeling using the Monte Carlo (MC) simulation

    Energy Technology Data Exchange (ETDEWEB)

    Bieda, Boguslaw, E-mail: bbieda@zarz.agh.edu.pl

    2013-01-01

    The paper is concerned with the application and benefits of MC simulation proposed for estimating the life of a modern municipal solid waste (MSW) landfill. The software Crystal Ball® (CB), a simulation program that helps analyze the uncertainties associated with Microsoft® Excel models by MC simulation, was proposed to calculate the transit time of contaminants in porous media. The transport of contaminants in soil is represented by the one-dimensional (1D) form of the advection-dispersion equation (ADE). The computer program CONTRANS, written in the MATLAB language, is the foundation for simulating and estimating the thickness of the landfill compacted clay liner. In order to simplify the task of determining the uncertainty of parameters by MC simulation, the parameters corresponding to the expression Z2 taken from this program were used for the study. The tested parameters are: hydraulic gradient (HG), hydraulic conductivity (HC), porosity (POROS), liner thickness (TH) and diffusion coefficient (EDC). The principal output report provided by CB and presented in the study consists of the frequency chart, percentiles summary and statistics summary. Additional CB options provide a sensitivity analysis with tornado diagrams. The data that was used include available published figures as well as data concerning the Mittal Steel Poland (MSP) S.A. plant in Kraków, Poland. This paper discusses the results and shows that the presented approach is applicable to any MSW landfill compacted clay liner thickness design. -- Highlights: ► Numerical simulation of waste in porous media is proposed. ► Statistical outputs based on correct assumptions about probability distributions are presented. ► The benefits of a MC simulation are examined. ► The uniform probability distribution is studied. ► I report a useful tool applied to determine the life of a

  16. iFit: a new data analysis framework. Applications for data reduction and optimization of neutron scattering instrument simulations with McStas

    DEFF Research Database (Denmark)

    Farhi, E.; Y., Debab,; Willendrup, Peter Kjær

    2014-01-01

    and noisy problems. These optimizers can then be used to fit models onto data objects, and optimize McStas instrument simulations. As an application, we propose a methodology to analyse neutron scattering measurements in a pure Monte Carlo optimization procedure using McStas and iFit. As opposed...

  17. VERA Pin and Fuel Assembly Depletion Benchmark Calculations by McCARD and DeCART

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin; Cho, Jin Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Monte Carlo (MC) codes have been developed and used to simulate neutron transport ever since the MC method was devised in the Manhattan project. Solving the neutron transport problem with the MC method is simple and straightforward to understand. Because there are few essential approximations for the six-dimensional phase space of a neutron (location, energy, and direction) in MC calculations, highly accurate solutions can be obtained. In this work, the VERA pin and fuel assembly (FA) depletion benchmark calculations are performed to examine the depletion capability of the newly generated DeCART multi-group cross section library. To obtain the reference solutions, MC depletion calculations are conducted using McCARD. Moreover, to scrutinize the effect of stochastic uncertainty propagation, uncertainty propagation analyses are performed using a sensitivity and uncertainty (S/U) analysis method and a stochastic sampling (S.S) method. It is still expensive and challenging to perform a depletion analysis with a MC code; nevertheless, many studies have been conducted to utilize the benefits of the MC method. In this study, McCARD MC and DeCART MOC transport calculations are performed for the VERA pin and FA depletion benchmarks. The DeCART depletion calculations, conducted to examine the depletion capability of the newly generated multi-group cross section library, give excellent agreement with the McCARD reference solutions. From the McCARD results, it is observed that the MC depletion results depend on how the burnup interval is split. First, to quantify the effect of stochastic uncertainty propagation at 40 DTS, the uncertainty propagation analyses are performed using the S/U and S.S. methods.

  18. Diffusion approximation-based simulation of stochastic ion channels: which method to use?

    Science.gov (United States)

    Pezo, Danilo; Soudry, Daniel; Orio, Patricio

    2014-01-01

    To study the effects of stochastic ion channel fluctuations on neural dynamics, several numerical implementation methods have been proposed. Gillespie's method for Markov Chains (MC) simulation is highly accurate, yet it becomes computationally intensive in the regime of a high number of channels. Many recent works aim to speed simulation time using the Langevin-based Diffusion Approximation (DA). Under this common theoretical approach, each implementation differs in how it handles various numerical difficulties—such as bounding of state variables to [0,1]. Here we review and test a set of the most recently published DA implementations (Goldwyn et al., 2011; Linaro et al., 2011; Dangerfield et al., 2012; Orio and Soudry, 2012; Schmandt and Galán, 2012; Güler, 2013; Huang et al., 2013a), comparing all of them in a set of numerical simulations that assess numerical accuracy and computational efficiency on three different models: (1) the original Hodgkin and Huxley model, (2) a model with faster sodium channels, and (3) a multi-compartmental model inspired in granular cells. We conclude that for a low number of channels (usually below 1000 per simulated compartment) one should use MC—which is the fastest and most accurate method. For a high number of channels, we recommend using the method by Orio and Soudry (2012), possibly combined with the method by Schmandt and Galán (2012) for increased speed and slightly reduced accuracy. Consequently, MC modeling may be the best method for detailed multicompartment neuron models—in which a model neuron with many thousands of channels is segmented into many compartments with a few hundred channels. PMID:25404914
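
    For orientation, the exact MC approach recommended above for low channel counts can be written in a few lines; the sketch below simulates N independent two-state (closed/open) channels with Gillespie's algorithm, using hypothetical opening/closing rates rather than Hodgkin-Huxley kinetics.

        import math
        import random

        def gillespie_two_state(n_channels=100, alpha=0.5, beta=1.0, t_end=10.0):
            """Exact MC of N channels, closed <-> open, with rates alpha/beta [1/ms]."""
            t, n_open = 0.0, 0
            trajectory = [(t, n_open)]
            while t < t_end:
                rate_open = alpha * (n_channels - n_open)  # total closed->open propensity
                rate_close = beta * n_open                 # total open->closed propensity
                total = rate_open + rate_close
                t += -math.log(random.random()) / total    # exponential waiting time
                if random.random() < rate_open / total:
                    n_open += 1
                else:
                    n_open -= 1
                trajectory.append((t, n_open))
            return trajectory

        traj = gillespie_two_state()
        print("final open fraction:", traj[-1][1] / 100)   # fluctuates near alpha/(alpha+beta)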

  19. McXtrace

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Prodi, Andrea; Baltser, Jana

    2013-01-01

    to the standard X-ray simulation software SHADOW. McXtrace is open source, licensed under the General Public License, and does not require the user to have access to any proprietary software for its operation. The structure of the software is described in detail, and various examples are given to showcase...

  20. A single-column particle-resolved model for simulating the vertical distribution of aerosol mixing state: WRF-PartMC-MOSAIC-SCM v1.0

    Science.gov (United States)

    Curtis, Jeffrey H.; Riemer, Nicole; West, Matthew

    2017-11-01

    The PartMC-MOSAIC particle-resolved aerosol model was previously developed to predict the aerosol mixing state as it evolves in the atmosphere. However, the modeling framework was limited to a zero-dimensional box model approach without resolving spatial gradients in aerosol concentrations. This paper presents the development of stochastic particle methods to simulate turbulent diffusion and dry deposition of aerosol particles in a vertical column within the planetary boundary layer. The new model, WRF-PartMC-MOSAIC-SCM, resolves the vertical distribution of aerosol mixing state. We verified the new algorithms with analytical solutions for idealized test cases and illustrate the capabilities with results from a 2-day urban scenario that shows the evolution of black carbon mixing state in a vertical column.
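
    The WRF-PartMC-MOSAIC-SCM algorithms themselves are not reproduced in the record; as a rough sketch of the stochastic-particle idea for a vertical column, the code below moves particles by Brownian increments for turbulent diffusion (constant eddy diffusivity), reflects them at the column top, and applies a crude dry-deposition rule at the surface. All numbers are placeholders.

        import math
        import random

        def step_column(z_particles, K=10.0, dt=1.0, z_top=1000.0, p_dep=0.01):
            """One time step: Brownian displacement, reflecting top, depositing surface."""
            sigma = math.sqrt(2.0 * K * dt)     # diffusion step standard deviation [m]
            survivors = []
            for z in z_particles:
                z += random.gauss(0.0, sigma)
                if z > z_top:
                    z = 2.0 * z_top - z         # reflect at the boundary-layer top
                if z < 0.0:
                    z = -z                      # hit the ground: reflect, then...
                    if random.random() < p_dep: # ...dry-deposit with a fixed probability
                        continue
                survivors.append(z)
            return survivors

        particles = [random.uniform(0.0, 1000.0) for _ in range(2000)]
        for _ in range(1800):                   # half an hour of 1 s steps
            particles = step_column(particles)
        print("particles remaining aloft:", len(particles))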

  1. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    Science.gov (United States)

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method that makes use of recently developed energy-discriminating photon-counting detectors (PCDs). This technique enables measurements in isolated high-energy ranges, in which the dominant interaction between the x-rays and the sample is incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy-resolved radiation scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a single CdTe PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where incoherent scattering interactions become prevalent (>50 keV).

  2. McSUB V2.0, an upgraded version of the Monte Carlo library McSUB with inclusion of weight factors

    International Nuclear Information System (INIS)

    Hoek, M.

    1991-02-01

    The Monte Carlo library McSUB, which was described in an earlier report, has been upgraded to McSUB V2.0. McSUB V2.0 can be used to simulate neutron transport in a medium which is a mixture of hydrogen and carbon or a mixture of deuterium and carbon. The implemented neutron energy interval is 0.1-20 MeV and the library can be used to simulate elastic and inelastic scattering. Inelastic scattering with carbon takes into account the four lowest excited states of the carbon nucleus. McSUB V2.0 is downward compatible with McSUB except for the layout of the parameter file, which now contains more variables. The major upgrade has been the inclusion of routines using weight factors, which has sped up the old version considerably. McSUB V2.0 also makes biasing techniques possible: it is now possible, e.g., to let a neutron scatter with a selected nucleus, followed by a biased scattering direction. (au)
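
    The McSUB routines are not listed in the record; the sketch below shows only the generic weight-factor bookkeeping that such biasing relies on. A scattering cosine is drawn from a biased PDF (here forcing most samples into the forward hemisphere, an arbitrary choice), and the particle weight is multiplied by the ratio of true to biased densities so tallies stay unbiased.

        import random

        def sample_mu_biased(forward_fraction=0.8):
            """Biased sampling of the scattering cosine mu for an isotropic law."""
            if random.random() < forward_fraction:
                mu = random.uniform(0.0, 1.0)            # forced forward hemisphere
                pdf_biased = forward_fraction            # density on [0, 1]
            else:
                mu = random.uniform(-1.0, 0.0)
                pdf_biased = 1.0 - forward_fraction      # density on [-1, 0]
            pdf_true = 0.5                               # isotropic density on [-1, 1]
            return mu, pdf_true / pdf_biased             # (cosine, weight factor)

        # The weighted estimate of any tally remains unbiased, e.g. P(mu > 0):
        n, tally = 100000, 0.0
        for _ in range(n):
            mu, w = sample_mu_biased()
            if mu > 0.0:
                tally += w
        print("P(mu > 0) estimate:", tally / n)          # ~0.5, as for isotropic scattering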

  3. Monte Carlo simulation of the Tomotherapy treatment unit in the static mode using MC HAMMER, a Monte Carlo tool dedicated to Tomotherapy

    International Nuclear Information System (INIS)

    Sterpin, E; Tomsej, M; Cravens, B; Salvat, F; Ruchala, K; Olivera, G H; Vynckier, S

    2007-01-01

    Helical tomotherapy (HT) is designed to deliver highly modulated IMRT treatments. The concept of HT provides new challenges in MC simulation, because simultaneous movement of the gantry, the couch and the multi-leaf collimator (MLC) must be simulated accurately. However, before accounting for gantry, couch movement and multileaf collimator configurations, high accuracy must be achieved while simulating open static fields (1 x 40, 2.5 x 40 and 5 x 40 cm 2 ). This is performed using MC HAMMER, which is a graphical user interface allowing MC simulation using PENELOPE for various configurations of HT. Since the geometry of the different elements and materials involved in the beam generation are precisely known and defined, the only parameters that need to be tuned on are therefore electron source spot size and electron energy. Beyond the build up region, good agreement (2%/1mm) is achieved for all the field sizes between measurements (ion chamber) and simulations with an electron source energy set to 5.5 MeV. The electron source spot size is modelled as a gaussian distribution with full width half maximum equal to 1.4 mm. This value was chosen to match measured and calculated penumbras in the longitudinal direction

  4. On Analyzing LDPC Codes over Multiantenna MC-CDMA System

    Directory of Open Access Journals (Sweden)

    S. Suresh Kumar

    2014-01-01

    Full Text Available The multiantenna multicarrier code-division multiple access (MC-CDMA) technique has been attracting much attention for designing future broadband wireless systems. In addition, the low-density parity-check (LDPC) code, a promising near-optimal error correction code, is also being widely considered for next generation communication systems. In this paper, we propose a simple method to construct a regular quasi-cyclic low-density parity-check (QC-LDPC) code to improve the transmission performance over a precoded MC-CDMA system with limited feedback. Simulation results show that the coding gain of the proposed QC-LDPC codes is larger than that of Reed-Solomon codes, and that the performance of the multiantenna MC-CDMA system can be greatly improved by these QC-LDPC codes when the data rate is high.
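
    The paper's specific construction is not reproduced in the record; as a generic illustration (the exponent matrix below is an arbitrary example), a regular QC-LDPC parity-check matrix can be assembled from circulant permutation matrices, i.e. cyclic column shifts of the identity.

        import numpy as np

        def circulant(p, shift):
            """p x p identity matrix cyclically shifted by 'shift' columns."""
            return np.roll(np.eye(p, dtype=int), shift, axis=1)

        def qc_ldpc_parity(p, shifts):
            """Block parity-check matrix: each entry of 'shifts' selects a circulant."""
            return np.block([[circulant(p, s) for s in row] for row in shifts])

        shifts = [[0, 1, 2, 4],        # arbitrary 2 x 4 exponent matrix
                  [0, 2, 4, 8]]
        H = qc_ldpc_parity(8, shifts)  # 16 x 32 regular QC-LDPC parity-check matrix
        print(H.shape, "column weight:", H.sum(axis=0)[0], "row weight:", H.sum(axis=1)[0])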

  5. Sensibility of vagina reconstructed by McIndoe method in Mayer-Küster-Rokitansky-Hauser syndrome

    Directory of Open Access Journals (Sweden)

    Vesanović Svetlana

    2008-01-01

    Full Text Available Background/Aim. Congenital absence of the vagina is an anomaly present in Mayer-Küster-Rokitansky-Hauser syndrome. Treatment of this anomaly includes nonoperative and operative procedures. The McIndoe procedure uses a split skin graft by Thiersch. The aim of this study was to determine the sensibility (touch, warmth, cold) of a vagina reconstructed by the McIndoe method in Mayer-Küster-Rokitansky-Hauser syndrome and compare it with the normal vagina. Methods. A total of 21 females with a vagina reconstructed by the McIndoe method and 21 females with a normal vagina were observed. All subjects were divided into groups and subgroups (according to age). Sensibility to touch, warmth and cold was examined, applying Von Frey's esthesiometer and a thermoesthesiometer for warmth and cold in three regions of the vagina (entrance, middle wall, bottom). The number of positive answers was registered by touching each mucosa region for five seconds, five times. Results. The obtained results showed that patients with a vagina reconstructed by the McIndoe method felt touch at the middle part of the wall and in the bottom of the vagina better than those with a normal one. The former also felt warmth at the middle part of the wall and cold in the bottom of the vagina better than the patients with a normal vagina. The other results showed no difference in sensibility between the reconstructed and the normal vagina. Conclusion. Various types of sensibility (touch, warmth, cold) are better or the same in vaginas reconstructed by the McIndoe method in comparison with normal ones. This could be explained by the fact that skin grafts are capable of recovering sensibility.

  6. A backward Monte Carlo method for efficient computation of runaway probabilities in runaway electron simulation

    Science.gov (United States)

    Zhang, Guannan; Del-Castillo-Negrete, Diego

    2017-10-01

    Kinetic descriptions of runaway electrons (RE) are usually based on the bounce-averaged Fokker-Planck model that determines the PDFs of RE. Despite the simplification involved, the Fokker-Planck equation can rarely be solved analytically, and direct numerical approaches (e.g., continuum and particle-based Monte Carlo (MC)) can be time consuming, especially in the computation of asymptotic-type observables including the runaway probability, the slowing-down and runaway mean times, and the energy limit probability. Here we present a novel backward MC approach to these problems based on backward stochastic differential equations (BSDEs). The BSDE model can simultaneously describe the PDF of RE and the runaway probabilities by means of the well-known Feynman-Kac theory. The key ingredient of the backward MC algorithm is to place all the particles in a runaway state and simulate them backward from the terminal time to the initial time. As such, our approach can provide much faster convergence than brute-force MC methods, which significantly reduces the number of particles required to achieve a prescribed accuracy. Moreover, our algorithm can be parallelized as easily as a direct MC code, which paves the way for conducting large-scale RE simulation. This work is supported by DOE FES and ASCR under the Contract Numbers ERKJ320 and ERAT377.
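
    The BSDE machinery is beyond a short snippet, but the target quantity, a runaway probability in the Feynman-Kac sense, can be written down with a brute-force forward estimator, which is exactly the expensive baseline the backward method is designed to beat. The drift, diffusion and threshold below are toy placeholders, not a runaway-electron model.

        import math
        import random

        def runaway_probability(x0, t_end=5.0, dt=0.01, x_run=3.0, n_samples=20000):
            """P(X reaches x >= x_run by t_end | X_0 = x0) for a toy diffusion
            dX = a(X) dt + b dW, estimated by Euler-Maruyama forward sampling."""
            a = lambda x: 0.2 * x      # toy accelerating drift
            b = 0.5                    # toy diffusion amplitude
            hits = 0
            for _ in range(n_samples):
                x, t = x0, 0.0
                while t < t_end:
                    x += a(x) * dt + b * math.sqrt(dt) * random.gauss(0.0, 1.0)
                    t += dt
                    if x >= x_run:
                        hits += 1
                        break
            return hits / n_samples

        print(runaway_probability(1.0))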

  7. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) to the steel process chain: case study.

    Science.gov (United States)

    Bieda, Bogusław

    2014-05-15

    The purpose of the paper is to present the results of the application of a stochastic approach based on Monte Carlo (MC) simulation to the life cycle inventory (LCI) data of the Mittal Steel Poland (MSP) complex in Kraków, Poland. In order to assess the uncertainty, the software Crystal Ball® (CB), which works with Microsoft® Excel spreadsheet models, is used. The framework of the study was originally carried out for 2005. The total production of steel, coke, pig iron, sinter, slabs from continuous steel casting (CSC), sheets from the hot rolling mill (HRM) and blast furnace gas, collected in 2005 from MSP, was analyzed and used for MC simulation of the LCI model. In order to describe the random nature of all main products used in this study, the normal distribution has been applied. The results of the simulation (10,000 trials) performed with the use of CB consist of frequency charts and statistical reports. The results of this study can be used as the first step in performing a full LCA analysis in the steel industry. Further, it is concluded that the stochastic approach is a powerful method for quantifying parameter uncertainty in LCA/LCI studies and that it can be applied to any steel plant. The results obtained from this study can help practitioners and decision-makers in steel production management. Copyright © 2013 Elsevier B.V. All rights reserved.
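
    The CB spreadsheet model itself is not part of the record; the same normal-distribution MC step can be sketched in a few lines. The product names are from the study, but the means and standard deviations below are placeholders, not the 2005 MSP inventory figures.

        import random
        import statistics

        # Hypothetical LCI inputs: product -> (mean, standard deviation) [kt/year]
        inputs = {"steel":    (5000.0, 250.0),
                  "coke":     (1200.0,  90.0),
                  "pig iron": (4300.0, 200.0),
                  "sinter":   (6100.0, 300.0)}

        totals = []
        for _ in range(10000):   # 10,000 trials, as in the study
            totals.append(sum(random.gauss(mu, sd) for mu, sd in inputs.values()))

        totals.sort()
        print("mean total [kt/a]:", statistics.fmean(totals))
        print("2.5th/97.5th percentiles:", totals[250], totals[9750])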

  8. SU-D-BRC-01: An Automatic Beam Model Commissioning Method for Monte Carlo Simulations in Pencil-Beam Scanning Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Qin, N; Shen, C; Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States)

    2016-06-15

    Purpose: Monte Carlo (MC) simulation is typically regarded as the most accurate dose calculation method for proton therapy. Yet for real clinical cases, the overall accuracy also depends on that of the MC beam model. Commissioning a beam model to faithfully represent a real beam requires finely tuning a set of model parameters, which can be tedious given the large number of pencil beams to commission. This abstract reports an automatic beam-model commissioning method for pencil-beam scanning proton therapy via an optimization approach. Methods: We modeled a real pencil beam with energy and spatial spread following Gaussian distributions; the mean energy, energy spread and spatial spread are the model parameters. To commission against a real beam, we first performed MC simulations to calculate the dose distributions of a set of ideal (monoenergetic, zero-size) pencil beams. The dose distribution of a real pencil beam is hence a linear superposition of the doses of those ideal pencil beams, with weights of Gaussian form. We formulated the commissioning task as an optimization problem, such that the calculated central-axis depth dose and lateral profiles at several depths match the corresponding measurements. An iterative algorithm combining a conjugate gradient method and parameter fitting was employed to solve the optimization problem. We validated our method in simulation studies. Results: We calculated dose distributions for three real pencil beams with nominal energies of 83, 147 and 199 MeV using realistic beam parameters. These data were regarded as measurements and used for commissioning. After commissioning, the average differences in energy and beam spread between the determined values and the ground truth were 4.6% and 0.2%. With the commissioned model, we recomputed the dose; mean dose differences from measurements were 0.64%, 0.20% and 0.25%. Conclusion: The developed automatic MC beam-model commissioning method for pencil-beam scanning proton therapy can determine beam model parameters with
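
    The abstract's superposition-plus-fitting idea can be sketched compactly. Everything below is a toy stand-in: the "ideal" depth-dose curves are fabricated Gaussian Bragg peaks with a rough range-energy rule, and a generic Nelder-Mead fit replaces the paper's conjugate-gradient/parameter-fitting iteration; only the structure (a Gaussian-weighted superposition of precomputed ideal-beam doses, fitted to measurements) follows the description above.

        import numpy as np
        from scipy.optimize import minimize

        depths = np.linspace(0.0, 32.0, 320)             # depth grid [cm]
        energies = np.linspace(70.0, 220.0, 150)         # ideal beam energies [MeV]

        def ideal_dose(E):
            """Toy monoenergetic depth dose: Gaussian 'Bragg peak' at range ~0.0022*E^1.77."""
            r = 0.0022 * E**1.77                         # rough range-energy rule [cm]
            return np.exp(-0.5 * ((depths - r) / 0.4) ** 2)

        D = np.stack([ideal_dose(E) for E in energies])  # precomputed ideal-beam library

        def real_dose(mean_E, spread):
            w = np.exp(-0.5 * ((energies - mean_E) / spread) ** 2)
            return (w / w.sum()) @ D                     # Gaussian-weighted superposition

        rng = np.random.default_rng(1)
        measured = real_dose(147.0, 1.2) + rng.normal(0.0, 0.002, depths.size)

        loss = lambda p: np.sum((real_dose(p[0], p[1]) - measured) ** 2)
        fit = minimize(loss, x0=[150.0, 2.0], method="Nelder-Mead")
        print("commissioned (mean energy, energy spread):", fit.x)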

  9. Steady-state molecular dynamics simulation of vapor to liquid nucleation with Mc Donald's demon

    International Nuclear Information System (INIS)

    Horsch, M.; Miroshnichenko, S.; Vrabec, J.

    2009-01-01

    Grand canonical MD with McDonald's demon is discussed in the present contribution and applied to sampling both the nucleation kinetics and the steady-state properties of a supersaturated vapor. The idea behind the new approach is to simulate the production of clusters up to a given size for a specified supersaturation. Classical nucleation theory is found to overestimate the free energy of cluster formation and to deviate by two orders of magnitude from the simulated nucleation rate below the triple point at high supersaturations.

  10. Bayesian Monte Carlo method

    International Nuclear Information System (INIS)

    Rajabalinejad, M.

    2010-01-01

    To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering more priors; in other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with those of the MC and Dynamic Bounds methods.

  11. Molecular Simulation towards Efficient and Representative Subsurface Reservoirs Modeling

    KAUST Repository

    Kadoura, Ahmad Salim

    2016-01-01

    This dissertation focuses on the application of Monte Carlo (MC) molecular simulation and Molecular Dynamics (MD) in modeling thermodynamics and flow of subsurface reservoir fluids. At first, MC molecular simulation is proposed as a promising method

  12. Shaping ability of NT Engine and McXim rotary nickel-titanium instruments in simulated root canals. Part 1.

    Science.gov (United States)

    Thompson, S A; Dummer, P M

    1997-07-01

    The aim of this study was to determine the shaping ability of NT Engine and McXim nickel-titanium rotary instruments in simulated root canals. In all, 40 canals comprising four different shapes in terms of angle and position of curvature were prepared by a combination of NT Engine and McXim instruments using the technique recommended by the manufacturer. Part 1 of this two-part report describes the efficacy of the instruments in terms of preparation time, instrument failure, canal blockages, loss of canal length and three-dimensional canal form. Overall, the mean preparation time for all canals was 6.01 min, with canal shape having a significant effect on preparation time. NT Engine and McXim instruments prepared canals rapidly, with few deformations, no canal blockages and minimal change in working length. The three-dimensional form of the canals demonstrated good flow and taper characteristics.

  13. Hybrid method coupling molecular dynamics and Monte Carlo simulations to study the properties of gases in microchannels and nanochannels

    NARCIS (Netherlands)

    Nedea, S.V.; Frijns, A.J.H.; Steenhoven, van A.A.; Markvoort, Albert. J.; Hilbers, P.A.J.

    2005-01-01

    We combine molecular dynamics (MD) and Monte Carlo (MC) simulations to study the properties of gas molecules confined between two hard walls of a microchannel or nanochannel. The coupling between MD and MC simulations is introduced by performing MD near the boundaries for accuracy and MC in the bulk

  14. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) to the steel process chain: Case study

    Energy Technology Data Exchange (ETDEWEB)

    Bieda, Bogusław

    2014-05-01

    The purpose of the paper is to present the results of the application of a stochastic approach based on Monte Carlo (MC) simulation to the life cycle inventory (LCI) data of the Mittal Steel Poland (MSP) complex in Kraków, Poland. In order to assess the uncertainty, the software Crystal Ball® (CB), which works with Microsoft® Excel spreadsheet models, is used. The framework of the study was originally carried out for 2005. The total production of steel, coke, pig iron, sinter, slabs from continuous steel casting (CSC), sheets from the hot rolling mill (HRM) and blast furnace gas, collected in 2005 from MSP, was analyzed and used for MC simulation of the LCI model. In order to describe the random nature of all main products used in this study, the normal distribution has been applied. The results of the simulation (10,000 trials) performed with the use of CB consist of frequency charts and statistical reports. The results of this study can be used as the first step in performing a full LCA analysis in the steel industry. Further, it is concluded that the stochastic approach is a powerful method for quantifying parameter uncertainty in LCA/LCI studies and that it can be applied to any steel plant. The results obtained from this study can help practitioners and decision-makers in steel production management. - Highlights: • The benefits of Monte Carlo simulation are examined. • The normal probability distribution is studied. • LCI data on the Mittal Steel Poland (MSP) complex in Kraków, Poland date back to 2005. • This is the first assessment of the LCI uncertainties in the Polish steel industry.

  15. Molecular Simulation towards Efficient and Representative Subsurface Reservoirs Modeling

    KAUST Repository

    Kadoura, Ahmad

    2016-09-01

    This dissertation focuses on the application of Monte Carlo (MC) molecular simulation and Molecular Dynamics (MD) in modeling the thermodynamics and flow of subsurface reservoir fluids. At first, MC molecular simulation is proposed as a promising method to replace correlations and equations of state in subsurface flow simulators. In order to accelerate MC simulations, a set of early rejection schemes (conservative, hybrid, and non-conservative) was developed, in addition to extrapolation methods based on reweighting and reconstruction of pre-generated MC Markov chains. Furthermore, an extensive study was conducted to investigate sorption and transport processes of methane, carbon dioxide, water, and their mixtures in the inorganic part of shale using both MC and MD simulations. These simulations covered a wide range of thermodynamic conditions, pore sizes, and fluid compositions, shedding light on several interesting findings, for example the possibility of adsorbing more carbon dioxide at higher preadsorbed water concentrations in relatively large basal spacings. The dissertation is divided into four chapters. The first chapter is the introductory part, giving a brief background on molecular simulation and the motivation of the work. The second chapter is devoted to the theoretical aspects and methodology of the proposed MC speed-up techniques, in addition to the corresponding results, leading to the successful multi-scale simulation of a compressible single-phase flow scenario. In chapter 3, the results of our extensive study on shale gas at laboratory conditions are reported. In the fourth and last chapter, we end the dissertation with a few concluding remarks highlighting the key findings and summarizing future directions.

  16. Atomistic Monte Carlo Simulation of Lipid Membranes

    Directory of Open Access Journals (Sweden)

    Daniel Wüstner

    2014-01-01

    Full Text Available Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain the challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction to the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion-angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show the transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
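
    The CBC move set itself is intricate; the acceptance step shared by all such MC move sets is plain Metropolis, sketched here for a hypothetical torsion-angle perturbation with a toy threefold potential (not a lipid force field).

        import math
        import random

        def torsion_energy(angles):
            """Toy threefold torsional potential, a stand-in for a real force field."""
            return sum(1.0 + math.cos(3.0 * a) for a in angles)

        def metropolis_sweep(angles, kT=0.6, max_step=0.3):
            """One local-move sweep: perturb each torsion, accept with min(1, exp(-dE/kT))."""
            for i in range(len(angles)):
                old = angles[i]
                e_old = torsion_energy(angles)
                angles[i] = old + random.uniform(-max_step, max_step)
                d_e = torsion_energy(angles) - e_old
                if d_e > 0.0 and random.random() >= math.exp(-d_e / kT):
                    angles[i] = old              # reject the move: restore the old angle
            return angles

        angles = [random.uniform(-math.pi, math.pi) for _ in range(16)]
        for _ in range(2000):
            angles = metropolis_sweep(angles)
        print("equilibrated toy energy:", torsion_energy(angles))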

  17. Application of stochastic approach based on Monte Carlo (MC) simulation for life cycle inventory (LCI) of the rare earth elements (REEs) in beneficiation rare earth waste from the gold processing: case study

    Science.gov (United States)

    Bieda, Bogusław; Grzesik, Katarzyna

    2017-11-01

    The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle assessment (LCA) method, limited to a life cycle inventory (LCI) study, for rare earth element (REE) recovery from secondary materials, applied to the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered the most effective quantification approach for uncertainties. A stochastic approach helps characterize uncertainties better than a deterministic method does. The uncertainty of data can be expressed through a definition of the probability distribution of that data (e.g. through standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g. particulates, uranium-238, thorium-232), energy and REEs (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Gd) have been inventoried. The study is based on a reference case for the year 2016. The combination of MC analysis with sensitivity analysis is the best solution for quantifying the uncertainty in the LCI/LCA. The reliability of LCA results may be uncertain to a certain degree, but this uncertainty can be quantified with the help of the MC method.

  18. Technical note: Comparison of metal-on-metal hip simulator wear measured by gravimetric, CMM and optical profiling methods

    Science.gov (United States)

    Alberts, L. Russell; Martinez-Nogues, Vanesa; Baker Cook, Richard; Maul, Christian; Bills, Paul; Racasan, R.; Stolz, Martin; Wood, Robert J. K.

    2018-03-01

    Simulation of wear in artificial joint implants is critical for evaluating implant designs and materials. Traditional protocols employ the gravimetric method to determine the loss of material by measuring the weight of the implant components before and after various test intervals and after the completed test. However, the gravimetric method cannot identify the location, area coverage or maximum depth of the wear, and it has difficulties with proportionally small weight changes in relatively heavy implants. In this study, we compare the gravimetric method with two geometric surface methods: an optical light method (RedLux) and a coordinate measuring method (CMM). We tested ten Adept hips in a simulator for 2 million cycles (MC). Gravimetric and optical measurements were performed at 0.33, 0.66, 1.00, 1.33 and 2 MC; CMM measurements were done before and after the test. A high correlation was found between the gravimetric and optical methods for both heads (R² = 0.997) and cups (R² = 0.96). Both geometric methods (optical and CMM) measured more volume loss than the gravimetric method (for the heads, p = 0.004 (optical) and p = 0.08 (CMM); for the cups, p = 0.01 (optical) and p = 0.003 (CMM)). Two cups recorded negative wear at 2 MC by the gravimetric method, but none did by either the optical method or CMM. The geometric methods were prone to confounding factors such as surface deformation, while the gravimetric method could be confounded by protein absorption and backside wear. Both geometric methods were able to show the location, area covered and depth of the wear on the bearing surfaces, and to track their changes during the test run, providing significant advantages over using the gravimetric method alone.

  19. Anatomic and histological characteristics of vagina reconstructed by McIndoe method

    Directory of Open Access Journals (Sweden)

    Kozarski Jefta

    2009-01-01

    Full Text Available Background/Aim. Congenital absence of the vagina has been known since ancient Greek times. According to the literature, its incidence is 1/4,000 to 1/20,000. Treatment of this anomaly includes non-operative and operative procedures. The McIndoe procedure uses a split skin graft by Thiersch. The aim of this study was to establish the anatomic and histological characteristics of a vagina reconstructed by the McIndoe method in Mayer-Küster-Rokitansky-Hauser (MKRH) syndrome and compare them with the normal vagina. Methods. The study included 21 patients aged 18 years and over with the congenital anomaly known as aplasio vaginae within MKRH syndrome. The patients were operated on by a plastic surgeon using the McIndoe method. The study was a retrospective review of data from the disease histories, objective and gynecological examinations, and cytological analysis of native preparations of vaginal smears (Papanicolaou). For comparison, 21 females aged 18 and over with normal vaginas were also studied. All subjects were divided into groups R (reconstructed) and C (control) and into subgroups according to age: up to 30 years (1R, 1C), from 30 to 50 (2R, 2C), and over 50 (3R, 3C). Statistical data processing was performed using Student's t-test and the Mann-Whitney U-test. A value of p < 0.05 was considered statistically significant. Results. The results show that there are differences in the depth and width of the reconstructed vagina, but the obtained values are still in the normal range. Cytological differences between a reconstructed and the normal vagina were found. Conclusion. A reconstructed vagina is smaller than the normal one regarding depth and width, but within the range of normal values. A split skin graft used in the reconstruction keeps its own cytological, i.e. histological, and thus biological characteristics.

  20. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol....

  1. Assessing the convergence of LHS Monte Carlo simulations of wastewater treatment models.

    Science.gov (United States)

    Benedetti, Lorenzo; Claeys, Filip; Nopens, Ingmar; Vanrolleghem, Peter A

    2011-01-01

    Monte Carlo (MC) simulation appears to be the only currently adopted tool to estimate global sensitivities and uncertainties in wastewater treatment modelling. Such models are highly complex, dynamic and non-linear, requiring long computation times, especially in the scope of MC simulation, due to the large number of simulations usually required. However, no stopping rule to decide on the number of simulations required to achieve a given confidence in the MC simulation results has been adopted so far in the field. In this work, a pragmatic method is proposed to minimize the computation time by using a combination of several criteria. It makes no use of prior knowledge about the model, is very simple, intuitive and can be automated: all convenient features in engineering applications. A case study is used to show an application of the method, and the results indicate that the required number of simulations strongly depends on the model output(s) selected, and on the type and desired accuracy of the analysis conducted. Hence, no prior indication is available regarding the necessary number of MC simulations, but the proposed method is capable of dealing with these variations and stopping the calculations after convergence is reached.
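
    The combination of criteria used in the paper is not reproduced in the record; a bare-bones version of such a stopping rule (all thresholds hypothetical) keeps drawing MC samples until the half-width of the 95% confidence interval on the output mean drops below a relative tolerance.

        import random
        import statistics

        def run_model():
            """Stand-in for one (expensive) wastewater-model run with sampled inputs."""
            return 10.0 + random.gauss(0.0, 2.0)

        def mc_until_converged(rel_tol=0.01, batch=50, max_runs=100000):
            samples = []
            while len(samples) < max_runs:
                samples.extend(run_model() for _ in range(batch))
                mean = statistics.fmean(samples)
                sem = statistics.stdev(samples) / len(samples) ** 0.5
                if 1.96 * sem < rel_tol * abs(mean):   # 95% CI half-width below tolerance
                    break
            return mean, len(samples)

        mean, n = mc_until_converged()
        print(f"converged after {n} runs, mean = {mean:.3f}")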

  2. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches...

  3. The McGill simulator for endoscopic sinus surgery (MSESS): a validation study.

    Science.gov (United States)

    Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Saad, Elias; Funnell, W Robert J; Tewfik, Marc A

    2014-10-24

    Endoscopic sinus surgery (ESS) is a technically challenging procedure, associated with a significant risk of complications. Virtual reality simulation has demonstrated benefit in many disciplines as an important educational tool for surgical training. Within the field of rhinology, there is a lack of ESS simulators with appropriate validity evidence supporting their integration into residency education. The objectives of this study are to evaluate the acceptability, perceived realism and benefit of the McGill Simulator for Endoscopic Sinus Surgery (MSESS) among medical students, otolaryngology residents and faculty, and to present evidence supporting its ability to differentiate users based on their level of training through performance metrics. 10 medical students, 10 junior residents, 10 senior residents and 3 expert sinus surgeons performed anterior ethmoidectomies, posterior ethmoidectomies and wide sphenoidotomies on the MSESS. Performance metrics related to quality (e.g. percentage of tissue removed), efficiency (e.g. time, path length, bimanual dexterity, etc.) and safety (e.g. contact with no-go zones, maximum applied force, etc.) were calculated. All users completed a post-simulation questionnaire related to realism, usefulness and perceived benefits of training on the MSESS. The MSESS was found to be realistic and useful for training surgical skills, with scores of 7.97 ± 0.29 and 8.57 ± 0.69, respectively, on a 10-point rating scale. Most students and residents (29/30) believed that it should be incorporated into their curriculum. There were significant differences between novice surgeons (10 medical students and 10 junior residents) and senior surgeons (10 senior residents and 3 sinus surgeons) in performance metrics related to quality (p < 0.05), supporting the integration of the MSESS into residency education. This simulator may be a potential resource to help fill the void in endoscopic sinus surgery training.

  4. Crew-MC communication and characteristics of crewmembers' sleep under conditions of simulated prolonged space flight

    Science.gov (United States)

    Shved, Dmitry; Gushin, Vadim; Yusupova, Anna; Ehmann, Bea; Balazs, Laszlo; Zavalko, Irina

    Characteristics of crew-MC communication and the psychophysiological state of the crewmembers were studied in a simulation experiment with 520-day isolation. We used computerized quantitative content analysis to investigate psychologically relevant characteristics of the content of the crew's messages. Content analysis is a systematic, reproducible method of reducing a text array to a limited number of categories by means of preset, scientifically substantiated coding rules (Berelson, 1971; Krippendorff, 2004). All statements in the crew's messages to MC were coded with certain psychologically relevant content analysis categories (e.g. ‘Needs’, ‘Negativism’, ‘Time’). To the ‘Needs’ category we attributed statements (semantic units) containing words related to the subjects' needs and their satisfaction, e.g. ‘‘necessary, need, wish, want, demand’’. To the ‘Negativism’ category we attributed critical statements containing such words as ‘‘mistakes, faults, deficit, shortage’’. The ‘Time’ category embodies statements related to time perception, e.g. “hour, day, always, never, constantly”. A sleep study was conducted with EEG and actigraphy techniques to assess characteristics of the crewmembers' night sleep, reflecting the crew's adaptation to the experimental conditions. The overall amount of communication (quantity of messages and their length) correlated positively with sleep effectiveness (time of sleep relative to time in bed) and with delta sleep latency. Occurrences of semantic units in the categories ‘Time’ and ‘Negativism’ correlated negatively with sleep latency, and positively with delta sleep latency and sleep effectiveness. The frequency of time-related semantic units in the crew's messages increased significantly during or before the key events of the experiment (beginning of high autonomy, planetary landing simulation, etc.). It is known that subjective importance of time
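
    The coding step described above amounts to matching words in each message against category dictionaries. A deliberately simplified keyword-matching sketch (real content-analysis software also handles stemming, context and disambiguation; the keyword lists below are taken from the examples in the abstract):

```python
import re
from collections import Counter

CATEGORIES = {
    "Needs": {"necessary", "need", "wish", "want", "demand"},
    "Negativism": {"mistakes", "faults", "deficit", "shortage"},
    "Time": {"hour", "day", "always", "never", "constantly"},
}

def code_message(text):
    """Count category hits in one message by simple keyword matching."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for cat, keywords in CATEGORIES.items():
        counts[cat] = sum(w in keywords for w in words)
    return counts

print(code_message("We always need more day-to-day supplies; the shortage never ends."))
```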

  5. Evaluation of the McMahon Competence Assessment Instrument for Use with Midwifery Students During a Simulated Shoulder Dystocia.

    Science.gov (United States)

    McMahon, Erin; Jevitt, Cecilia; Aronson, Barbara

    2018-03-01

    Intrapartum emergencies occur infrequently but require a prompt and competent response from the midwife to prevent morbidity and mortality of the woman, fetus, and newborn. Simulation provides the opportunity for student midwives to develop competence in a safe environment. The purpose of this study was to determine the inter-rater reliability of the McMahon Competence Assessment Instrument (MCAI) for use with student midwives during a simulated shoulder dystocia scenario. A pilot study using a nonprobability convenience sample was conducted to evaluate the MCAI. Content validity indices were calculated for the individual items and the overall instrument using data from a panel of expert reviewers. Fourteen student midwives consented to be video recorded while participating in a simulated shoulder dystocia scenario. Three faculty raters used the MCAI to evaluate the student performances. These quantitative data were used to determine the inter-rater reliability of the MCAI. The intraclass correlation coefficient (ICC) was used to assess the inter-rater reliability of MCAI scores between two or more raters; the ICC was 0.86 (95% confidence interval, 0.60-0.96). Fleiss's kappa was calculated to determine the inter-rater reliability of individual items; 23 of the 42 items corresponded to excellent strength of agreement. This study demonstrates a method to determine the inter-rater reliability of a competence assessment instrument for use with student midwives. Data produced by this study were used to revise and improve the instrument. Additional research will further document the inter-rater reliability and can be used to determine changes in student competence. Valid and reliable methods of assessment will encourage the use of simulation to efficiently develop the competence of student midwives. © 2018 by the American College of Nurse-Midwives.
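
    Fleiss's kappa, used above for item-level agreement among the three raters, is computed from a subjects-by-categories count matrix. A self-contained sketch with fabricated pass/fail ratings:

```python
import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories) matrix, each row summing to the
    number of raters. Returns Fleiss's kappa."""
    counts = np.asarray(counts, dtype=float)
    n_sub, n_raters = counts.shape[0], counts.sum(axis=1)[0]
    p_cat = counts.sum(axis=0) / (n_sub * n_raters)            # category proportions
    P_i = (counts * (counts - 1)).sum(axis=1) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), (p_cat ** 2).sum()
    return (P_bar - P_e) / (1.0 - P_e)

# 5 checklist items rated pass/fail by 3 raters (illustrative data)
ratings = [[3, 0], [2, 1], [3, 0], [1, 2], [3, 0]]
print(f"Fleiss's kappa = {fleiss_kappa(ratings):.3f}")
```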

  6. Study of new scaling of direct photon production in pp collisions at high energies using MC simulation

    International Nuclear Information System (INIS)

    Tokarev, M.V.; Potrebenikova, E.V.

    1998-01-01

    The new scaling, z-scaling, of prompt photon production in pp collisions at high energies is studied. The scaling function H(z) is expressed via the inclusive cross section of photon production, E d³σ/dq³, and the multiplicity density of charged particles, ρ(s), at pseudorapidity η = 0. Monte Carlo (MC) simulation based on the PYTHIA code is used to calculate the cross section and to verify the scaling. The MC technique used to construct the scaling function is described. The dependence of H(z) on the scaling variable z and on the center-of-mass energy √s at a production angle of θ = 90° is investigated. Predictions of the E d³σ/dq³ dependence on transverse momentum q at colliding energies of √s = 0.5, 5.0 and 14.0 TeV are made. The results are compared with experimental data and can be of interest for future experiments at RHIC (BNL), LHC (CERN), HERA (DESY) and the Tevatron (Batavia).

  7. Treatment plan evaluation for interstitial photodynamic therapy in a mouse model by Monte Carlo simulation with FullMonte

    Directory of Open Access Journals (Sweden)

    Jeffrey eCassidy

    2015-02-01

    Full Text Available Monte Carlo (MC) simulation is recognized as the gold standard for biophotonic simulation, capturing all relevant physics and material properties at the perceived cost of high computing demands. Tetrahedral-mesh-based MC simulations are particularly attractive due to the ability to refine the mesh at will to conform to complicated geometries or user-defined resolution requirements. Since no approximations of material or light-source properties are required, MC methods are applicable to the broadest set of biophotonic simulation problems. MC methods also offer other implementation features, including inherent parallelism, and permit a continuously variable quality-runtime tradeoff. We demonstrate here a complete MC-based prospective fluence dose evaluation system for interstitial PDT that generates dose-volume histograms on a tetrahedral-mesh geometry description. To our knowledge, this is the first such system for general interstitial photodynamic therapy employing MC methods, and it is therefore applicable to a very broad cross-section of anatomy and material properties. We demonstrate that evaluation of dose-volume histograms is an effective variance-reduction scheme in its own right, greatly reducing the number of packets, and hence the runtime, required to achieve acceptable result confidence. We conclude that MC methods are feasible for general PDT treatment evaluation and planning, and considerably less costly than widely believed.
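
    A cumulative dose-volume histogram of the kind evaluated here reports, for each dose level, the fraction of the total volume receiving at least that dose. A minimal sketch over per-element doses and volumes (random stand-ins for a tetrahedral-mesh tally):

```python
import numpy as np

def cumulative_dvh(dose, volume, n_bins=100):
    """Return dose bin edges and the fraction of total volume receiving
    at least each dose level (a cumulative DVH)."""
    edges = np.linspace(0.0, dose.max(), n_bins)
    frac = [volume[dose >= d].sum() / volume.sum() for d in edges]
    return edges, np.asarray(frac)

rng = np.random.default_rng(1)
dose = rng.gamma(shape=2.0, scale=5.0, size=10_000)   # per-tetrahedron dose (a.u.)
volume = rng.uniform(0.5, 1.5, size=10_000)          # per-tetrahedron volume (a.u.)
d, v = cumulative_dvh(dose, volume)
print(f"V(dose >= 10) = {v[np.searchsorted(d, 10.0)]:.2%} of volume")
```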

  8. Dose rates from a C-14 source using extrapolation chamber and MC calculations

    International Nuclear Information System (INIS)

    Borg, J.

    1996-05-01

    The extrapolation chamber technique and the Monte Carlo (MC) calculation technique based on the EGS4 system have been studied for the determination of dose rates in a low-energy β radiation field, e.g. that from a 14C source. The extrapolation chamber measurement method is the basic method for determining dose rates in β radiation fields. Applying a number of correction factors and the stopping power ratio, tissue to air, the measured dose rate in an air volume surrounded by tissue-equivalent material is converted into dose to tissue. Various details of the extrapolation chamber measurement method and evaluation procedure have been studied and further developed, and a complete procedure for the experimental determination of dose rates from a 14C source is presented. A number of correction factors and other parameters used in the evaluation procedure for the measured data have been obtained by MC calculations. The whole extrapolation chamber measurement procedure was simulated using the MC method. The measured dose rates showed an increasing deviation from the MC-calculated dose rates as the absorber thickness increased. This indicates that the EGS4 code may have some limitations for the transport of very low-energy electrons, i.e. electrons with estimated energies less than 10-20 keV. MC calculations of dose to tissue were performed using two models: a cylindrical tissue phantom and a computer model of the extrapolation chamber. The dose to tissue in the extrapolation chamber model showed an additional build-up dose compared to the dose in the tissue model. (au) 10 tabs., 11 ills., 18 refs
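
    The core of the extrapolation chamber evaluation is extrapolating the slope of the ionization current versus electrode separation to zero separation and converting it to a dose rate via W/e and the stopping-power ratio. A heavily simplified numerical sketch (the readings, electrode area and correction handling are fabricated for illustration; the real procedure applies many more correction factors):

```python
import numpy as np

# Illustrative chamber readings: current (pA) at several plate separations (mm)
sep = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
current = np.array([0.42, 0.81, 1.18, 1.52, 1.85])

# Fit I(d) with a low-order polynomial and take the slope dI/dd at d -> 0
coeffs = np.polyfit(sep, current, deg=2)   # [a, b, c] for a*d^2 + b*d + c
slope_at_zero = coeffs[1]                  # pA/mm, dI/dd at d = 0

area = 100.0        # collecting electrode area, mm^2 (assumed)
w_over_e = 33.97    # mean energy per ion pair in air, J/C
rho_air = 1.205e-6  # air density, g/mm^3 at reference conditions
s_tissue_air = 1.1  # illustrative stopping-power ratio, tissue to air

# Dose rate to tissue ~ (W/e) * (dI/dd) / (rho * A) * stopping-power ratio
dose_rate = w_over_e * slope_at_zero * 1e-12 / (rho_air * 1e-3 * area) * s_tissue_air
print(f"dose rate ~ {dose_rate:.3e} Gy/s")
```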

  9. MC 93 - Proceedings of the International Conference on Monte Carlo Simulation in High Energy and Nuclear Physics

    Science.gov (United States)

    Dragovitsch, Peter; Linn, Stephan L.; Burbank, Mimi

    1994-01-01

    Calorimeter Geometry * Simulations with EGS4/PRESTA for Thin Si Sampling Calorimeter * SIBERIA -- Monte Carlo Code for Simulation of Hadron-Nuclei Interactions * CALOR89 Predictions for the Hanging File Test Configurations * Estimation of the Multiple Coulomb Scattering Error for Various Numbers of Radiation Lengths * Monte Carlo Generator for Nuclear Fragmentation Induced by Pion Capture * Calculation and Randomization of Hadron-Nucleus Reaction Cross Section * Developments in GEANT Physics * Status of the MC++ Event Generator Toolkit * Theoretical Overview of QCD Event Generators * Random Numbers? * Simulation of the GEM LKr Barrel Calorimeter Using CALOR89 * Recent Improvement of the EGS4 Code, Implementation of Linearly Polarized Photon Scattering * Interior-Flux Simulation in Enclosures with Electron-Emitting Walls * Some Recent Developments in Global Determinations of Parton Distributions * Summary of the Workshop on Simulating Accelerator Radiation Environments * Simulating the SDC Radiation Background and Activation * Applications of Cluster Monte Carlo Method to Lattice Spin Models * PDFLIB: A Library of All Available Parton Density Functions of the Nucleon, the Pion and the Photon and the Corresponding αs Calculations * DTUJET92: Sampling Hadron Production at Supercolliders * A New Model for Hadronic Interactions at Intermediate Energies for the FLUKA Code * Matrix Generator of Pseudo-Random Numbers * The OPAL Monte Carlo Production System * Monte Carlo Simulation of the Microstrip Gas Counter * Inner Detector Simulations in ATLAS * Simulation and Reconstruction in H1 Liquid Argon Calorimetry * Polarization Decomposition of Fluxes and Kinematics in ep Reactions * Towards Object-Oriented GEANT -- ProdiG Project * Parallel Processing of AMY Detector Simulation on Fujitsu AP1000 * Enigma: An Event Generator for Electron-Photon- or Pion-Induced Events in the ~1 GeV Region * SSCSIM: Development and Use by the Fermilab SDC Group * The GEANT-CALOR Interface

  10. Shaping ability of NT Engine and McXim rotary nickel-titanium instruments in simulated root canals. Part 2.

    Science.gov (United States)

    Thompson, S A; Dummer, P M

    1997-07-01

    The aim of this laboratory-based study was to determine the shaping ability of NT Engine and McXim nickel-titanium rotary instruments in simulated root canals. A total of 40 canals with four different shapes, in terms of angle and position of curve, were prepared with NT Engine and McXim instruments using the technique recommended by the manufacturer. Part 2 of this report describes the efficacy of the instruments in terms of the prevalence of canal aberrations, the amount and direction of canal transportation, and overall postoperative shape. Pre- and postoperative images of the canals were taken using a video camera attached to a computer with image analysis software. The pre- and postoperative views were superimposed to highlight the amount and position of material removed during preparation. No zips, elbows, perforations or danger zones were created during preparation. Forty-two per cent of canals had ledges on the outer aspect of the curve, the majority of which (16 out of 17) occurred in canals with short acute curves. There were significant differences (P < 0.05). NT Engine and McXim rotary nickel-titanium instruments created no aberrations other than ledges and produced only minimal transportation. The overall shape of canals was good.

  11. Validation of Shielding Analysis Capability of SuperMC with SINBAD

    Directory of Open Access Journals (Sweden)

    Chen Chaobin

    2017-01-01

    Full Text Available Abstract: The shielding analysis capability of SuperMC was validated with the Shielding Integral Benchmark Archive Database (SINBAD). SINBAD, compiled by RSICC and NEA, includes numerous benchmark experiments performed with the D-T fusion neutron source facilities of OKTAVIAN, FNS, IPPE, etc. The results of the SuperMC simulations were compared with experimental data and MCNP results. Very good agreement, with deviations lower than 1%, was achieved, suggesting that SuperMC is reliable for shielding calculations.

  12. Application of Discriminant Analysis in Determining McCafe Product Purchase Decisions (Case Study: McDonald's Jimbaran, Bali)

    Directory of Open Access Journals (Sweden)

    TRISNA RAMADHAN

    2018-02-01

    Full Text Available McDonald's is a rapidly growing fast-food company. McDonald's continues to innovate to satisfy customers, and introduced the cafe concept under the name McCafe. Because of competition with other fast-food restaurants, McDonald's needs to improve the quality of McCafe as valued by customers. This research was therefore conducted to identify the indicators that best describe customer characteristics. It used discriminant analysis to classify customers into groups of loyal and non-loyal customers. The indicators that distinguished customers' decisions to buy McCafe Jimbaran products were affordable prices and locations easily accessible to customers. The resulting discriminant function had an accuracy of 91.67 percent in classifying the customers.
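
    As a sketch of the kind of two-group discriminant analysis used in the study (with fabricated survey scores, not the study's data), scikit-learn's LDA can build the discriminant function and report its classification accuracy:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 120
# Illustrative predictors: price-affordability and location-accessibility scores
X_loyal = rng.normal([4.0, 4.2], 0.6, size=(n // 2, 2))
X_not = rng.normal([3.0, 3.1], 0.6, size=(n // 2, 2))
X = np.vstack([X_loyal, X_not])
y = np.array([1] * (n // 2) + [0] * (n // 2))   # 1 = loyal, 0 = non-loyal

lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"classification accuracy: {lda.score(X, y):.2%}")
print("discriminant coefficients:", lda.coef_.ravel())
```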

  13. Rapid resolution of chronic shoulder pain classified as derangement using the McKenzie method: a case series

    Science.gov (United States)

    Aytona, Maria Corazon; Dudley, Karlene

    2013-01-01

    The McKenzie method, also known as Mechanical Diagnosis and Therapy (MDT), is primarily recognized as an evaluation and treatment method for the spine. However, McKenzie suggested that this method could also be applied to the extremities. Derangement is an MDT classification defined as an anatomical disturbance in the normal resting position of the joint, and McKenzie proposed that repeated movements could be applied to reduce internal joint displacement and rapidly reduce derangement symptoms. However, the current literature on MDT application to shoulder disorders is limited. Here, we present a case series involving four patients with chronic shoulder pain of 2–18 months' duration, classified as derangement and treated using MDT principles. Each patient underwent mechanical assessment and was treated with repeated movements based on their directional preference. All patients demonstrated rapid and clinically significant improvement in baseline measures and in Disabilities of the Arm, Shoulder, and Hand (QuickDASH) scores, from an average of 38% at initial evaluation to 5% at discharge within 3–5 visits. Our findings suggest that MDT may be an effective treatment approach for shoulder pain. PMID:24421633

  14. Speeding up Monte Carlo molecular simulation by a non-conservative early rejection scheme

    KAUST Repository

    Kadoura, Ahmad Salim

    2015-04-23

    Monte Carlo (MC) molecular simulation describes fluid systems with rich information, and it is capable of predicting many fluid properties of engineering interest. In general, it is more accurate and representative than equations of state. On the other hand, it requires much more computational effort and simulation time. For that purpose, several techniques have been developed in order to speed up MC molecular simulations while preserving their precision. In particular, early rejection schemes are capable of reducing computational cost by reaching the rejection decision for undesired MC trials at an earlier stage than the conventional scheme. In recent work, we introduced a ‘conservative’ early rejection scheme as a method to accelerate MC simulations while producing exactly the same results as the conventional algorithm. In this paper, we introduce a ‘non-conservative’ early rejection scheme, which is much faster than the conservative scheme, yet preserves the precision of the method. The proposed scheme is tested for systems of structureless Lennard-Jones particles in both canonical and NVT-Gibbs ensembles. Numerical experiments were conducted at several thermodynamic conditions for different numbers of particles. Results show that at certain thermodynamic conditions, the non-conservative method is capable of doubling the speed of the MC molecular simulations in both canonical and NVT-Gibbs ensembles. © 2015 Taylor & Francis
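
    The early rejection idea can be illustrated with a pairwise Lennard-Jones energy loop: the random number is drawn before the loop and converted into an energy threshold, and the loop aborts as soon as the accumulated energy change exceeds it. The sketch below is a simplified illustration in the non-conservative spirit (remaining pairs might still have lowered the sum), not the authors' algorithm:

```python
import math, random

def lj(r2, eps=1.0, sig=1.0):
    """Lennard-Jones pair energy from the squared distance."""
    s6 = (sig * sig / r2) ** 3
    return 4.0 * eps * (s6 * s6 - s6)

def trial_move_early_rejection(coords, i, new_pos, beta=1.0):
    """Metropolis trial with early rejection: pre-draw the random number,
    convert it to an energy threshold (reject iff dU > threshold), and
    abort the pair loop once the partial sum already exceeds it."""
    threshold = -math.log(1.0 - random.random()) / beta
    dU = 0.0
    for j, pos in enumerate(coords):
        if j == i:
            continue
        r2_new = sum((a - b) ** 2 for a, b in zip(new_pos, pos))
        r2_old = sum((a - b) ** 2 for a, b in zip(coords[i], pos))
        dU += lj(r2_new) - lj(r2_old)
        if dU > threshold:          # early rejection: stop summing pairs
            return False
    return True                     # accepted

random.seed(3)
coords = [[random.uniform(0, 5) for _ in range(3)] for _ in range(50)]
new = [c + random.uniform(-0.2, 0.2) for c in coords[0]]
print("move accepted:", trial_move_early_rejection(coords, 0, new))
```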

  15. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Y., E-mail: yican.wu@fds.org.cn [Inst. of Nuclear Energy Safety Technology, Hefei, Anhui (China)

    2015-07-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear systems, making use of hybrid MC-deterministic methods and advanced computer technologies. Its main usability features are automatic modeling of geometry and physics, visualization and virtual simulation, and cloud computing services. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculations. SuperMC has been verified against more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of the International Thermonuclear Experimental Reactor (ITER) and the China Lead-based Reactor (CLEAR). The development and applications of SuperMC are introduced in this presentation. (author)

  16. Development and applications of Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems

    International Nuclear Information System (INIS)

    Wu, Y.

    2015-01-01

    'Full text:' Super Monte Carlo Simulation Program for Advanced Nuclear Energy Systems (SuperMC) is a CAD-based Monte Carlo (MC) program for integrated simulation of nuclear systems, making use of hybrid MC-deterministic methods and advanced computer technologies. Its main usability features are automatic modeling of geometry and physics, visualization and virtual simulation, and cloud computing services. SuperMC 2.3, the latest version, can perform coupled neutron and photon transport calculations. SuperMC has been verified against more than 2000 benchmark models and experiments, and has been applied in tens of major nuclear projects, such as the nuclear design and analysis of the International Thermonuclear Experimental Reactor (ITER) and the China Lead-based Reactor (CLEAR). The development and applications of SuperMC are introduced in this presentation. (author)

  17. MC-DS-CDMA System based on DWT and STBC in ITU Multipath Fading Channels Model

    Directory of Open Access Journals (Sweden)

    Nader Abdullah Khadam

    2018-03-01

    Full Text Available In this paper, the performance of multicarrier direct-sequence code division multiple access (MC-DS-CDMA) in fixed and mobile applications is improved by combining space-time block coding (STBC) with either the fast Fourier transform (FFT) or the discrete wavelet transform (DWT). These MC-DS-CDMA systems were simulated using MATLAB 2015a, allowing various parameters to be changed and tested. The bit error rates (BERs) of these systems are obtained over a wide range of signal-to-noise ratios. All simulation results were compared with each other for different FFT or DWT subcarrier sizes, with STBC for 1, 2, 3 and 4 transmit antennas, under different ITU multipath fading channel models and different Doppler frequencies (fd). The proposed DWT-based STBC-MC-DS-CDMA structures perform better than the FFT-based ones across all Doppler frequencies and subcarrier sizes. Likewise, the proposed system with STBC and 4 transmit antennas outperforms the systems with 1, 2 or 3 transmit antennas in all simulation results.

  18. Monte Carlo Simulation of Complete X-Ray Spectra for Use in Scanning Electron Microscopy Analysis

    International Nuclear Information System (INIS)

    Roet, David; Van Espen, Piet

    2003-01-01

    Full Text: The interactions of keV electrons and photons with matter can be simulated accurately with the aid of the Monte Carlo (MC) technique. In scanning electron microscopy x-ray analysis (SEM-EDX), such simulations can be used to perform quantitative analysis using a reverse Monte Carlo method, even if the samples have irregular geometry. Alternatively, the MC technique can generate spectra of standards for use in quantification with partial least squares regression. The feasibility of these alternatives to the more classical ZAF or phi-rho-z quantification methods has already been demonstrated. To be applicable for these purposes, the MC code needs to generate accurately not only the characteristic K and L x-ray lines but also the bremsstrahlung continuum, i.e. the complete x-ray spectrum needs to be simulated. Currently, two types of MC simulation codes are available. Programs like Electron Flight Simulator and CASINO simulate characteristic x-rays due to electron interactions in a fast and efficient way but lack provision for the simulation of the continuum. On the other hand, programs like EGS4, MCNP4 and PENELOPE, originally developed for high-energy (MeV-GeV) applications, are more complete but difficult to use and still slow, even on today's fastest computers. We therefore started the development of a dedicated MC simulation code for use in quantitative SEM-EDX work. The selection of the most appropriate cross sections for the different interactions will be discussed, and the results obtained will be compared with those from existing MC programs. Examples of the application of MC simulations for the quantitative analysis of samples of various compositions will be given.

  19. Justification of a Monte Carlo Algorithm for the Diffusion-Growth Simulation of Helium Clusters in Materials

    International Nuclear Information System (INIS)

    Yu-Lu, Zhou; Ai-Hong, Deng; Qing, Hou; Jun, Wang

    2009-01-01

    A theoretical analysis of a Monte Carlo (MC) method for the simulation of the diffusion-growth of helium clusters in materials is presented. The analysis is based on the assumption that the diffusion-growth process consists of a first stage, during which the clusters diffuse freely, and a second stage, in which coalescence occurs with a certain probability. Since the accuracy of MC simulation results is sensitive to the coalescence probability, the MC calculations in the second stage are studied in detail. Firstly, the coalescence probability is analytically formulated for the one-dimensional diffusion-growth case. Thereafter, the one-dimensional results are employed to justify the MC simulation. The choice of time step and the random number generator used in the MC simulation are discussed.

  20. A hybrid method for flood simulation in small catchments combining hydrodynamic and hydrological techniques

    Science.gov (United States)

    Bellos, Vasilis; Tsakiris, George

    2016-09-01

    The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically-based two-dimensional hydrodynamic model with the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones, and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that, for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation in substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulation which lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
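
    The hydrological half of such a hybrid scheme reduces to convolving the effective rainfall with the (here hydrodynamically derived) unit hydrograph. A minimal sketch with made-up 15-min ordinates:

```python
import numpy as np

# Illustrative 15-min unit hydrograph ordinates (m^3/s per mm of effective rain)
uh = np.array([0.0, 0.12, 0.35, 0.55, 0.40, 0.22, 0.10, 0.04, 0.0])

# Effective rainfall hyetograph (mm per 15-min interval), e.g. after
# subtracting Green-Ampt or Kostiakov infiltration losses
rain_eff = np.array([0.0, 2.0, 5.5, 8.0, 3.0, 1.0, 0.0])

# Outlet hydrograph = discrete convolution of effective rain with the UH
q = np.convolve(rain_eff, uh)
t = np.arange(q.size) * 15            # minutes
peak = q.argmax()
print(f"peak discharge {q[peak]:.2f} m^3/s at t = {t[peak]} min")
```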

  1. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    International Nuclear Information System (INIS)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X

    2015-01-01

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical application. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia's CUDA environment. This limits their code portability to other platforms, hindering the introduction of GPU-based MC dose engines into clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under the OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with a standard analogue simulation scheme for photon transport and a Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on the NVidia GPU platform, for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested the code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreement was observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences in regions within the 10% isodose line for the electron beam cases and ∼0.15%–0.17% for the photon beam cases. It took oclMC 3–4 sec to perform transport simulation for the electron beam on an NVidia Titan GPU and 35–51 sec for the photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to differences in both the physics model and the development environment, which is not considered significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully ran on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform portability could facilitate the introduction of GPU-based MC dose engines into clinical practice.

  2. SU-E-T-112: An OpenCL-Based Cross-Platform Monte Carlo Dose Engine (oclMC) for Coupled Photon-Electron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Folkerts, M; Qin, N; Jiang, S; Jia, X [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2015-06-15

    Purpose: Low computational efficiency of Monte Carlo (MC) dose calculation impedes its clinical application. Although a number of MC dose packages have been developed over the past few years, enabling fast MC dose calculations, most of these packages were developed under NVidia's CUDA environment. This limits their code portability to other platforms, hindering the introduction of GPU-based MC dose engines into clinical practice. To solve this problem, we developed a cross-platform fast MC dose engine named oclMC under the OpenCL environment for external photon and electron radiotherapy. Methods: Coupled photon-electron simulation was implemented with a standard analogue simulation scheme for photon transport and a Class II condensed history scheme for electron transport. We tested the accuracy and efficiency of oclMC by comparing the doses calculated using oclMC and gDPM, a previously developed GPU-based MC code on the NVidia GPU platform, for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. We also tested the code portability of oclMC on different devices, including an NVidia GPU, two AMD GPUs and an Intel CPU. Results: Satisfactory agreement was observed in all photon and electron cases, with ∼0.48%–0.53% average dose differences in regions within the 10% isodose line for the electron beam cases and ∼0.15%–0.17% for the photon beam cases. It took oclMC 3–4 sec to perform transport simulation for the electron beam on an NVidia Titan GPU and 35–51 sec for the photon beam, both with ∼0.5% statistical uncertainty. The computation was 6%–17% slower than gDPM due to differences in both the physics model and the development environment, which is not considered significant for clinical applications. In terms of code portability, gDPM only runs on NVidia GPUs, while oclMC successfully ran on all the tested devices. Conclusion: oclMC is an accurate and fast MC dose engine. Its high cross-platform portability could facilitate the introduction of GPU-based MC dose engines into clinical practice.

  3. MC++ and a transport physics framework

    International Nuclear Information System (INIS)

    Lee, S.R.; Cummings, J.C.; Nolen, S.D.; Keen, N.D.

    1997-01-01

    The Department of Energy has launched the Accelerated Strategic Computing Initiative (ASCI) to address a pressing need for more comprehensive computer simulation capabilities in the area of nuclear weapons safety and reliability. In light of the decision by the US Government to abandon underground nuclear testing, the Science-Based Stockpile Stewardship (SBSS) program is focused on using computer modeling to assure the continued safety and effectiveness of the nuclear stockpile. The authors believe that the utilization of object-oriented design and programming techniques can help in this regard. Object-oriented programming (OOP) has become a popular model in the general software community for several reasons. MC++ is a specific ASCI-relevant application project which demonstrates the effectiveness of the object-oriented approach. It is a Monte Carlo neutron transport code written in C++. It is designed to be simple yet flexible, with the ability to quickly introduce new numerical algorithms or representations of the physics into the code. MC++ is easily ported to various types of Unix workstations and parallel computers such as the three new ASCI platforms, largely because it makes extensive use of classes from the Parallel Object-Oriented Methods and Applications (POOMA) C++ class library. The MC++ code has been successfully benchmarked using some simple physics test problems, has been shown to provide comparable serial performance and a parallel efficiency superior to that of a well-known Monte Carlo neutronics package written in Fortran, and was the first ASCI-relevant application to run in parallel on all three ASCI computing platforms

  4. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    International Nuclear Information System (INIS)

    Xu, Y; Tian, Z; Jiang, S; Jia, X; Zhou, L

    2015-01-01

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion. The lack of control over particle trajectories is a main cause of low efficiency in some applications. Taking cone beam CT (CBCT) projection simulation as an example, a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path starting at the x-ray source. After going through a set of interactions, it ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept/reject a sampled path based on a calculated acceptance probability, in order to maintain correct relative probabilities among different paths, which are governed by photon transport physics. We developed a package gMMC on GPU with this new scheme implemented. The performance of gMMC was tested in a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR with a relative difference of 3%. It took 3.1 hr for gMCDRR to simulate 7.8e11 photons and 246.5 sec for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by the new path-by-path simulation scheme, in which all computations are spent on photons contributing to the detector signal. Conclusion: We proposed a novel path-by-path simulation scheme that enables a significant efficiency enhancement for MC particle transport simulation.

  5. SU-E-T-58: A Novel Monte Carlo Photon Transport Simulation Scheme and Its Application in Cone Beam CT Projection Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Y [UT Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou (China); Tian, Z; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou (China)

    2015-06-15

    Purpose: Monte Carlo (MC) simulation is an important tool for solving radiotherapy and medical imaging problems, but low computational efficiency hinders its wide application. Conventionally, MC is performed in a particle-by-particle fashion. The lack of control over particle trajectories is a main cause of low efficiency in some applications. Taking cone beam CT (CBCT) projection simulation as an example, a significant amount of computation is wasted on transporting photons that never reach the detector. To solve this problem, we propose an innovative MC simulation scheme with a path-by-path sampling method. Methods: Consider a photon path starting at the x-ray source. After going through a set of interactions, it ends at the detector. In the proposed scheme, we sampled an entire photon path each time. The Metropolis-Hastings algorithm was employed to accept/reject a sampled path based on a calculated acceptance probability, in order to maintain correct relative probabilities among different paths, which are governed by photon transport physics. We developed a package gMMC on GPU with this new scheme implemented. The performance of gMMC was tested in a sample problem of CBCT projection simulation for a homogeneous object. The results were compared to those obtained using gMCDRR, a GPU-based MC tool with the conventional particle-by-particle simulation scheme. Results: Calculated scattered photon signals in gMMC agreed with those from gMCDRR with a relative difference of 3%. It took 3.1 hr for gMCDRR to simulate 7.8e11 photons and 246.5 sec for gMMC to simulate 1.4e10 paths. Under this setting, both results attained the same ∼2% statistical uncertainty. Hence, a speed-up factor of ∼45.3 was achieved by the new path-by-path simulation scheme, in which all computations are spent on photons contributing to the detector signal. Conclusion: We proposed a novel path-by-path simulation scheme that enables a significant efficiency enhancement for MC particle transport simulation.

  6. Multi-Scale Coupling Between Monte Carlo Molecular Simulation and Darcy-Scale Flow in Porous Media

    KAUST Repository

    Saad, Ahmed Mohamed

    2016-06-01

    In this work, an efficient coupling between Monte Carlo (MC) molecular simulation and Darcy-scale flow in porous media is presented. The cell-centered finite difference method with a non-uniform rectangular mesh was used to discretize the simulation domain and solve the governing equations. To speed up the MC simulations, we implemented a recently developed scheme that quickly generates MC Markov chains from pre-computed ones, based on a reweighting and reconstruction algorithm. This method dramatically reduces the computational time required by MC simulations, from hours to seconds. To demonstrate the strength of the proposed coupling in terms of computational efficiency and numerical accuracy of fluid properties, various numerical experiments covering different compressible single-phase flow scenarios were conducted. The novelty of the introduced scheme lies in allowing an efficient coupling of the molecular scale and the Darcy scale in reservoir simulators. This leads to an accurate description of the thermodynamic behavior of the simulated reservoir fluids, consequently enhancing confidence in the flow predictions in porous media.

  7. Experience in Collaboration: McDenver at McDonald's.

    Science.gov (United States)

    Combs, Clarice Sue

    2002-01-01

    The McDenver at McDonald's project provided a nontraditional, community-based teaching and learning environment for faculty and students in a health, physical education, and recreation (HPER) department and a school of nursing. Children and parents came to McDonald's, where children received developmental screenings and parents completed conferences…

  8. On the development of a comprehensive MC simulation model for the Gamma Knife Perfexion radiosurgery unit

    Science.gov (United States)

    Pappas, E. P.; Moutsatsos, A.; Pantelis, E.; Zoros, E.; Georgiou, E.; Torrens, M.; Karaiskos, P.

    2016-02-01

    This work presents a comprehensive Monte Carlo (MC) simulation model for the Gamma Knife Perfexion (PFX) radiosurgery unit. Model-based dosimetry calculations were benchmarked against corresponding EBT2 measurements in terms of relative dose profiles (RDPs) and output factors (OFs). To reduce the rather prolonged computational times associated with the comprehensive PFX model, two approximations were explored and evaluated on the grounds of dosimetric accuracy. The first consists in directional biasing of the 60Co photon emission, while the second refers to the implementation of simplified source geometry models. The effect of the dose scoring volume dimensions on OF calculation accuracy was also explored. RDP calculations for the comprehensive PFX model were found to be in agreement with corresponding EBT2 measurements. Output factors of 0.819 ± 0.004 and 0.8941 ± 0.0013 were calculated for the 4 mm and 8 mm collimators, respectively, which agree, within uncertainties, with corresponding EBT2 measurements and published experimental data. Volume averaging was found to affect OF results by more than 0.3% for scoring volume radii greater than 0.5 mm and 1.4 mm for the 4 mm and 8 mm collimators, respectively. Directional biasing of photon emission resulted in a time efficiency gain factor of up to 210 with respect to isotropic photon emission. Although no considerable effect on relative dose profiles was detected, directional biasing led to OF overestimations, which were more pronounced for the 4 mm collimator and increased with decreasing emission cone half-angle, reaching up to 6% for a 5° angle. Implementation of simplified source models revealed that omitting the sources' stainless steel capsule significantly affects both OF results and relative dose profiles, while the aluminum-based bushing did not exhibit a considerable dosimetric effect. In conclusion, the results of this work suggest that any PFX
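
    Directional biasing of the source emission, as examined above, restricts sampled photon directions to a cone around the beam axis and compensates with a statistical weight equal to the covered solid-angle fraction. A generic sketch of such cone sampling (not the authors' PFX source model):

```python
import math, random

def biased_direction(half_angle_deg):
    """Sample an emission direction uniformly within a cone about +z and
    return (direction, weight); the weight is the fraction of 4*pi covered,
    so weighted tallies remain unbiased."""
    cos_min = math.cos(math.radians(half_angle_deg))
    mu = cos_min + random.random() * (1.0 - cos_min)   # uniform in [cos_min, 1]
    phi = 2.0 * math.pi * random.random()
    sin_t = math.sqrt(1.0 - mu * mu)
    direction = (sin_t * math.cos(phi), sin_t * math.sin(phi), mu)
    weight = (1.0 - cos_min) / 2.0                     # cone solid angle / 4*pi
    return direction, weight

d, w = biased_direction(5.0)
print(f"direction = {d}, statistical weight = {w:.5f}")
```

    Note that the study nevertheless observed OF overestimations with narrow cones, so the chosen bias angle should always be validated against the unbiased source.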

  9. MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics

    Energy Technology Data Exchange (ETDEWEB)

    Pater, P; Vallieres, M; Seuntjens, J [McGill University, Montreal, Quebec (Canada)

    2014-06-15

    Purpose: To present a hands-on project on Monte Carlo (MC) methods recently added to the curriculum and to discuss the students' appreciation of it. Methods: Since 2012, a 1.5-hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of an MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that simulates a dose distribution for 50 keV photons. A kerma approximation to dose deposition is assumed. A survey was conducted, to which 10 of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of teaching radiation physics through MC, and surveyed possible project improvements. Results: According to the survey, 76% of students had no or only basic knowledge of MC methods before the class and 65% estimated they had a good to very good understanding of MC methods after attending it. 80% of students felt that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and open questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours has been part of the graduate curriculum since 2012. MC methods produce “gold standard” dose distributions and are slowly entering routine clinical work, so a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose deposition and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from the governments of Canada and Quebec. PP acknowledges

  10. MO-E-18C-02: Hands-On Monte Carlo Project Assignment as a Method to Teach Radiation Physics

    International Nuclear Information System (INIS)

    Pater, P; Vallieres, M; Seuntjens, J

    2014-01-01

    Purpose: To present a hands-on project on Monte Carlo (MC) methods recently added to the curriculum and to discuss the students' appreciation of it. Methods: Since 2012, a 1.5-hour lecture dedicated to MC fundamentals follows the detailed presentation of photon and electron interactions. Students also program all sampling steps (interaction length and type, scattering angle, energy deposit) of an MC photon transport code. A handout structured in a step-by-step fashion guides students in conducting consistency checks. For extra points, students can code a fully working MC simulation that simulates a dose distribution for 50 keV photons. A kerma approximation to dose deposition is assumed. A survey was conducted, to which 10 of the 14 attending students responded. It compared MC knowledge prior to and after the project, questioned the usefulness of teaching radiation physics through MC, and surveyed possible project improvements. Results: According to the survey, 76% of students had no or only basic knowledge of MC methods before the class and 65% estimated they had a good to very good understanding of MC methods after attending it. 80% of students felt that the MC project helped them significantly to understand simulations of dose distributions. On average, students dedicated 12.5 hours to the project and appreciated the balance between hand-holding and open questions/implications. Conclusion: A lecture on MC methods with a hands-on MC programming project requiring about 14 hours has been part of the graduate curriculum since 2012. MC methods produce “gold standard” dose distributions and are slowly entering routine clinical work, so a fundamental understanding of MC methods should be a requirement for future students. Overall, the lecture and project helped students relate cross-sections to dose deposition and presented the numerical sampling methods behind the simulation of these dose distributions. Research funding from the governments of Canada and Quebec. PP acknowledges
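
    The first sampling step the students implement, the free path to the next interaction, follows from inverting the exponential attenuation CDF; the interaction type is then chosen in proportion to the partial cross sections. A minimal sketch (the attenuation coefficients are rough illustrative values, not course data):

```python
import math, random

def sample_free_path(mu_total_cm):
    """Distance to next interaction: invert the CDF 1 - exp(-mu*x)."""
    return -math.log(1.0 - random.random()) / mu_total_cm

def sample_interaction(mu_photo, mu_compton):
    """Choose the interaction type in proportion to partial cross sections."""
    if random.random() < mu_photo / (mu_photo + mu_compton):
        return "photoelectric"
    return "compton"

random.seed(7)
# Illustrative attenuation coefficients for ~50 keV photons in water (cm^-1)
mu_ph, mu_co = 0.02, 0.20
x = sample_free_path(mu_ph + mu_co)
print(f"step {x:.2f} cm -> {sample_interaction(mu_ph, mu_co)}")
```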

  11. Fast CPU-based Monte Carlo simulation for radiotherapy dose calculation

    Science.gov (United States)

    Ziegenhein, Peter; Pirner, Sven; Kamerling, Cornelis Ph; Oelfke, Uwe

    2015-08-01

    Monte Carlo (MC) simulations are considered the most accurate method for calculating dose distributions in radiotherapy. Their clinical application, however, is still limited by the long runtimes conventional implementations of MC algorithms require to deliver sufficiently accurate results on high-resolution imaging data. In order to overcome this obstacle, we developed the software package PhiMC, which is capable of computing precise dose distributions in a sub-minute time frame by leveraging the potential of modern many- and multi-core CPU-based computers. PhiMC is based on the well-verified dose planning method (DPM). We demonstrate that PhiMC delivers dose distributions in excellent agreement with DPM. The multi-core implementation of PhiMC scales well across different computer architectures and achieves a speed-up of up to 37× compared to the original DPM code executed on a modern system. Furthermore, our CPU-based implementation on a modern workstation is between 1.25× and 1.95× faster than a well-known GPU implementation of the same simulation method on an NVIDIA Tesla C2050. Since CPUs can work with several hundred GB of RAM, the typical GPU memory limitation does not apply to our implementation, and high-resolution clinical plans can be calculated.
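
    The many-core speed-ups reported here exploit the embarrassingly parallel nature of MC dose calculation: independent particle histories are tallied on separate cores and merged afterwards. A generic multiprocessing sketch of that pattern (a toy tally with independent RNG streams, not the PhiMC/DPM physics):

```python
import numpy as np
from multiprocessing import Pool

def simulate_batch(args):
    seed, n = args
    rng = np.random.default_rng(seed)   # independent stream per worker
    # Toy 'tally': mean depth of exponentially attenuated photons (a.u.)
    return rng.exponential(scale=5.0, size=n).mean()

if __name__ == "__main__":
    n_workers, n_hist = 8, 1_000_000
    tasks = [(seed, n_hist // n_workers) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        results = pool.map(simulate_batch, tasks)
    print(f"combined tally: {np.mean(results):.4f}")
```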

  12. Dynamic bounds coupled with Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)

    2011-02-15

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce the simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account the monotonicity that is widely present in such models. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which are updated dynamically in a coupled Monte Carlo simulation, resulting in a failure probability estimate as well as strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than with an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, with a relative error smaller than 5%. At higher accuracy levels this factor increases, though the effect is expected to be smaller with increasing dimension. To show the application of the DB method to real-world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
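
    For a monotonically increasing limit-state function, every evaluated sample bounds whole regions of the input space: any point componentwise above a known failure point must also fail, and any point componentwise below a known safe point must be safe. A toy sketch of such dynamically updated bounds (a simple additive g-function, not the flood-wall model):

```python
import numpy as np

def g(x):                  # illustrative monotonically increasing limit state
    return x.sum() - 2.6   # failure when g(x) >= 0

rng = np.random.default_rng(5)
failed_pts, safe_pts = [], []
n_eval = fails = 0
N = 5_000
for _ in range(N):
    x = rng.uniform(0.0, 1.0, size=3)
    if any((x >= p).all() for p in failed_pts):   # above a failure point: fails
        fails += 1
    elif any((x <= p).all() for p in safe_pts):   # below a safe point: safe
        pass
    else:
        n_eval += 1                               # model evaluation needed
        if g(x) >= 0:
            fails += 1
            failed_pts.append(x)
        else:
            safe_pts.append(x)

print(f"P_f ~ {fails / N:.4f}, model evaluated for {n_eval} of {N} samples")
```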

  13. Parallel Monte Carlo simulation of aerosol dynamics

    KAUST Repository

    Zhou, K.

    2014-01-01

    A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (the Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts of the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Nearly 60% parallel efficiency is achieved for the largest test case, with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified by simulating various test cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and other low-order moments of the particle size distribution (PSD) function. Accurately predicting the high-order moments of the PSD requires a dramatic increase in the number of MC particles. © 2014 Kun Zhou et al.

  14. Monte Carlo evaluation of scattering correction methods in 131I studies using pinhole collimator

    International Nuclear Information System (INIS)

    López Díaz, Adlin; San Pedro, Aley Palau; Martín Escuela, Juan Miguel; Rodríguez Pérez, Sunay; Díaz García, Angelina

    2017-01-01

    Scattering is quite important for image activity quantification. In order to study the scattering factors and the efficacy of three multiple-energy-window scatter correction methods during 131I thyroid studies with a pinhole collimator (5 mm hole), a Monte Carlo (MC) simulation was developed. The GAMOS MC code was used to model the gamma camera and the thyroid source geometry. First, to validate the MC gamma camera pinhole-source model, the sensitivities in air and water of the simulated and measured thyroid phantom geometries were compared. Next, simulations were performed to investigate scattering and the results of the triple energy window (TEW), double energy window (DW) and reduced double energy window (RDW) correction methods for different thyroid sizes and depth thicknesses. The discrepancies relative to the true events known from the MC simulation were evaluated. Results: The accuracy of the GAMOS MC model was verified and validated. The scattering contribution to the image was significant, between 27-40%. The discrepancies between the results of the three multiple-energy-window correction methods were significant (between 9-86%). The Reduced Double Window method (15%) provided discrepancies of 9-16%. Conclusions: For the simulated thyroid geometry with pinhole, the RDW (15%) was the most effective. (author)
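
    The triple-energy-window estimate referenced above subtracts, pixel by pixel, a trapezoidal scatter estimate built from two narrow windows flanking the photopeak. A sketch of the standard TEW formula (counts and window widths are fabricated; the paper's reduced-window variants adjust the window definitions):

```python
import numpy as np

def tew_correction(peak, lower, upper, w_peak, w_lower, w_upper):
    """Triple-energy-window scatter correction:
    scatter ~ (C_low/w_low + C_up/w_up) * w_peak / 2, clipped at zero."""
    scatter = (lower / w_lower + upper / w_upper) * w_peak / 2.0
    return np.clip(peak - scatter, 0.0, None)

rng = np.random.default_rng(11)
peak = rng.poisson(200.0, size=(64, 64)).astype(float)   # photopeak window counts
low = rng.poisson(30.0, size=(64, 64)).astype(float)     # lower sub-window counts
up = rng.poisson(10.0, size=(64, 64)).astype(float)      # upper sub-window counts
primary = tew_correction(peak, low, up, w_peak=60.0, w_lower=6.0, w_upper=6.0)
print(f"mean scatter fraction removed: {1 - primary.mean() / peak.mean():.2%}")
```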

  15. A non-linear and stochastic response surface method for Bayesian estimation of uncertainty in soil moisture simulation from a land surface model

    Directory of Open Access Journals (Sweden)

    F. Hossain

    2004-01-01

    Full Text Available This study presents a simple and efficient scheme for Bayesian estimation of uncertainty in soil moisture simulation by a Land Surface Model (LSM). The scheme is assessed within a Monte Carlo (MC) simulation framework based on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. A primary limitation of the GLUE method is the prohibitive computational burden imposed by uniform random sampling of the model's parameter distributions. Sampling is improved in the proposed scheme by stochastic modeling of the parameters' response surface, which recognizes the non-linear deterministic behavior between soil moisture and land surface parameters. Uncertainty in soil moisture simulation (model output) is approximated through a Hermite polynomial chaos expansion of normal random variables that represent the model's parameter (model input) uncertainty. The unknown coefficients of the polynomial are calculated using a limited number of model simulation runs. The calibrated polynomial is then used as a fast-running proxy for the slower-running LSM to predict the degree of representativeness of a randomly sampled model parameter set. The scheme's sampling efficiency is evaluated through comparison with fully random MC sampling (the norm for GLUE) and with the nearest-neighborhood sampling technique. The scheme reduced the computational burden of random MC sampling for GLUE by 10%-70%. It was also found to be about 10% more efficient than the nearest-neighborhood sampling method in predicting a sampled parameter set's degree of representativeness. GLUE based on the proposed sampling scheme did not alter the essential features of the uncertainty structure in soil moisture simulation. The scheme can potentially make GLUE uncertainty estimation more efficient for any LSM, as it does not impose any additional structural or distributional assumptions.
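
    The proxy described above expands the model output in Hermite polynomials of standard-normal parameter variables and fits the coefficients by least squares on a limited number of true model runs. A compact sketch with a toy one-parameter 'model' standing in for the LSM:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def slow_model(theta):
    # Stand-in for a slow land-surface model: soil moisture vs. one parameter
    return 0.3 + 0.1 * np.tanh(theta) + 0.02 * theta ** 2

rng = np.random.default_rng(2)
xi_train = rng.standard_normal(50)            # standard-normal parameter samples
y_train = slow_model(xi_train)

# Fit a degree-4 Hermite (probabilists') polynomial chaos expansion
V = hermevander(xi_train, 4)                  # design matrix He_0..He_4
coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

# Use the cheap PCE proxy instead of the slow model for new samples
xi_new = rng.standard_normal(100_000)
y_proxy = hermevander(xi_new, 4) @ coef
print(f"proxy mean = {y_proxy.mean():.4f} vs model mean = {slow_model(xi_new).mean():.4f}")
```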

  16. Qualification of McCARD/MASTER Code System for Yonggwang Unit 4

    International Nuclear Information System (INIS)

    Park, Ho Jin; Shim, Hyung Jin; Joo, Han Gyu; Kim, Chang Hyo

    2011-01-01

    Recently, we have developed a new two-step procedure based on Monte Carlo (MC) methods. In this procedure, one can generate the few-group constants, including the few-group diffusion constants, by the MC method augmented by the critical spectrum, which is provided by the solution to the homogeneous zero-dimensional B1 equation. In order to examine the qualification of the few-group constants generated by the MC method, we combined MASTER with McCARD to form the McCARD/MASTER code system for two-step core neutronics calculations. In fictitious PWR system problems, the core design parameters calculated by the two-step McCARD/MASTER analysis agree well with those from direct MC calculations. In this paper, a neutronics design analysis for the initial core of Yonggwang Nuclear Unit 4 (YGN4) is conducted using the McCARD/MASTER two-step procedure to examine the qualification of the two-group constants from McCARD for a real PWR core problem. For comparison, the nuclear design report and measured data are chosen as the reference solutions.

  17. Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær

    2013-01-01

    Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides. The generation and moderation of neutrons is simulated using a full scale MCNPX model of the ESS target monolith. Upon entering the beam extraction region, the individual neutron states are handed to McStas via the MCNPX-McStas interface. McStas transports the neutrons through the beam guide and, by using a newly developed event logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX, where it serves as neutron source input for a second MCNPX simulation. This second simulation enables the calculation of dose rates along the guides.

  18. Comparison of McMaster and FECPAKG2 methods for counting nematode eggs in the faeces of alpacas.

    Science.gov (United States)

    Rashid, Mohammed H; Stevenson, Mark A; Waenga, Shea; Mirams, Greg; Campbell, Angus J D; Vaughan, Jane L; Jabbar, Abdul

    2018-05-02

    This study aimed to compare the FECPAKG2 and McMaster techniques for counting gastrointestinal nematode eggs in the faeces of alpacas using two flotation solutions (saturated sodium chloride and saturated sucrose). Faecal egg counts from both techniques were compared using Lin's concordance correlation coefficient and Bland-Altman statistics. Results showed moderate to good agreement between the two methods, with better agreement achieved when saturated sucrose is used as the flotation fluid, particularly when faecal egg counts are less than 1000 eggs per gram of faeces. To the best of our knowledge, this is the first study to assess agreement of measurements between the McMaster and FECPAKG2 methods for estimating faecal eggs in South American camelids.

  19. Using McStas for modelling complex optics, using simple building bricks

    International Nuclear Information System (INIS)

    Willendrup, Peter K.; Udby, Linda; Knudsen, Erik; Farhi, Emmanuel; Lefmann, Kim

    2011-01-01

    The McStas neutron ray-tracing simulation package is a versatile tool for producing accurate neutron simulations, extensively used for design and optimization of instruments, virtual experiments, data analysis and user training. In McStas, component organization and simulation flow are intrinsically linear: the neutron interacts with the beamline components in sequential order, one by one. Historically, a beamline component with several parts had to be implemented with a complete, internal description of all these parts, e.g. a guide component including all four mirror plates and the logic required to allow scattering between the mirrors. For quite a while, users have requested the ability to allow 'components inside components', or meta-components, so that the functionality of several simple components can be combined to achieve more complex behaviour, e.g. four single mirror plates together defining a guide. We show here that it is now possible to define meta-components in McStas, and we present a set of detailed, validated examples, including a guide with an embedded, wedged, polarizing mirror system of the Helmholtz-Zentrum Berlin type.

  20. The effect of glycerin solution density and viscosity on vibration amplitude of oblique different piezoelectric MC near the surface in 3D modeling

    Science.gov (United States)

    Korayem, A. H.; Abdi, M.; Korayem, M. H.

    2018-06-01

    Nanoscale surface topography is one of the most important applications of AFM, and analysis of the vibration behavior of piezoelectric microcantilevers (MCs) is essential for improving AFM performance. To this end, an appropriate method for simulating the dynamic behavior of an MC is numerical solution with FEM in 3D using COMSOL software. The present study simulates different geometries of four-layered AFM piezoelectric MCs in 2D and 3D in a liquid medium using COMSOL. The 3D simulation was done in a spherical container using the FSI domain in COMSOL. In the 2D model, the governing equation of motion was derived by applying Hamilton's principle based on Euler-Bernoulli beam theory and discretized with FEM. In this mode, the hydrodynamic force was modeled with a string of spheres, and the effect of this force, along with the squeezed-film force, was included in the MC equations. The effects of fluid density and viscosity on the vibrations of MCs immersed in different glycerin solutions were investigated in 2D and 3D, and the results were compared with experimental results. The frequencies and time responses of the MC close to the surface were obtained considering tip-sample forces. The surface topographies produced by different MC geometries were compared in the liquid medium, in both tapping and non-contact modes. Various types of surface roughness were considered in the topography for the different MC geometries, and the effect of geometric dimensions on the surface topography was investigated. In a liquid medium, the MC is installed at an oblique angle to avoid damage to the MC from the squeezed-film force near the surface. Finally, the effect of the MC's angle on the surface topography and the time response of the system was investigated.

  1. One-dimensional simulation of stratification and dissolved oxygen in McCook Reservoir, Illinois

    Science.gov (United States)

    Robertson, Dale M.

    2000-01-01

    As part of the Chicagoland Underflow Plan/Tunnel and Reservoir Plan, the U.S. Army Corps of Engineers, Chicago District, plans to build McCook Reservoir, a flood-control reservoir to store combined stormwater and raw sewage (combined sewage). To prevent the combined sewage in the reservoir from becoming anoxic and producing hydrogen sulfide gas, a coarse-bubble aeration system will be designed and installed on the basis of results from CUP 0-D, a zero-dimensional model, and MAC3D, a three-dimensional model. Two inherent assumptions in the application of MAC3D are that density stratification in the simulated water body is minimal or not present and that surface heat transfers are unimportant and, therefore, may be neglected. To test these assumptions, the previously tested, one-dimensional Dynamic Lake Model (DLM) was used to simulate changes in temperature and dissolved oxygen in the reservoir after a 1-in-100-year event. Results from model simulations indicate that the assumptions made in MAC3D application are valid as long as the aeration system, with an air-flow rate of 1.2 cubic meters per second or more, is operated while the combined sewage is stored in the reservoir. Results also indicate that the high biochemical oxygen demand of the combined sewage will quickly consume the dissolved oxygen stored in the reservoir and the dissolved oxygen transferred through the surface of the reservoir; therefore, oxygen must be supplied by either the rising bubbles of the aeration system (a process not incorporated in DLM) or some other technique to prevent anoxia.

  2. Deviation from equilibrium conditions in molecular dynamic simulations of homogeneous nucleation.

    Science.gov (United States)

    Halonen, Roope; Zapadinsky, Evgeni; Vehkamäki, Hanna

    2018-04-28

    We present a comparison between Monte Carlo (MC) results for homogeneous vapour-liquid nucleation of Lennard-Jones clusters and previously published values from molecular dynamics (MD) simulations. Both the MC and MD methods sample real cluster configuration distributions. In MD simulations, the extent of the temperature fluctuation is usually controlled with an artificial thermostat rather than with a more realistic carrier gas. In this study, not only the primary velocity-scaling thermostat is considered; the Nosé-Hoover, Berendsen, and stochastic Langevin thermostats are also covered. The nucleation rates based on a kinetic scheme and the canonical MC calculation serve as a point of reference, since they by definition describe an equilibrated system. The studied temperature range is from T = 0.3 to 0.65 ϵ/k. The kinetic scheme reproduces well the isothermal nucleation rates obtained by Wedekind et al. [J. Chem. Phys. 127, 064501 (2007)] using MD simulations with carrier gas. The nucleation rates obtained by artificially thermostatted MD simulations are consistently lower than the reference nucleation rates based on MC calculations, and the discrepancy increases up to several orders of magnitude when the density of the nucleating vapour decreases. At low temperatures, the difference from the MC-based reference nucleation rates in some cases exceeds the maximal nonisothermal effect predicted by the classical theory of Feder et al. [Adv. Phys. 15, 111 (1966)].

  3. EVALUATING CHAMBERLAIN'S, McGREGOR'S, AND McRAE'S ...

    African Journals Online (AJOL)

    2012-08-08

    Aug 8, 2012 ... spine and base of skull radiographs, which however have diagnostic challenges due to the complexity of the ... McGregor's and McRae's using CT bone windows ... metastatic lesions were excluded from the study. RESULTS.

  4. The effect of linear spring number at side load of McPherson suspension in electric city car

    Science.gov (United States)

    Budi, Sigit Setijo; Suprihadi, Agus; Makhrojan, Agus; Ismail, Rifky; Jamari, J.

    2017-01-01

    The function of the spring in a McPherson-type suspension is to control vehicle stability and increase ride comfort, although this design tends to produce side loads. The purpose of this study is to obtain simulation results for the McPherson suspension spring of an electric city car using the finite element method and to determine the side load that appears on the spring. The research is conducted in several stages: designing linear spring models with various numbers of coils, and modeling the suspension spring using FEM software. The suspension spring is compressed in the vertical direction (z-axis), and the forces arising at the upper part of the spring along the x, y, and z axes are evaluated to simulate the side load. The FEM simulation results show that the most stable spring is the one whose side-load components along the x and y axes are closest to zero.

  5. Development of a polarized neutron beam line at Algerian research reactors using McStas software

    Science.gov (United States)

    Makhloufi, M.; Salah, H.

    2017-02-01

    Unpolarized instrumentation has long been studied and designed using the McStas simulation tool, but only recently have new models been developed for McStas to simulate polarized neutron scattering instruments. In the present contribution, we used McStas to design a polarized neutron beam line, taking advantage of the reflectometer and diffractometer available in Algeria. Both thermal and cold neutrons were considered. The polarization was produced by two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrades of the existing spectrometers, are underway.

  6. SCIENTIFIC PROGRESS OF THE MC-PAD NETWORK

    CERN Document Server

    Aguilar, J; Ambalathankandy, P; Apostolakis, J; Arora, R; Balog, T; Behnke, T; Beltrame, P; Bencivenni, G; Caiazza, S; Dong, J; Heller, M; Heuser, J; Idzik, M; Joram, C; Klanner, R; Koffeman, E; Korpar, S; Kramberger, G; Lohmann, W; Milovanović, M; Miscetti, S; Moll, M; Novgorodova, O; Pacifico, N; Pirvutoiu, C; Radu, R; Rahman, S; Rohe, T; Ropelewski, L; Roukoutakis, F; Schmidt, C; Schön, R; Sibille, J; Tsagri, M; Turala, M; Van Beuzekom, M; Verheyden, R; Villa, M; Zappon, F; Zawiejski, L; Zhang, J

    2013-01-01

    MC-PAD is a multi-site Initial Training Network on particle detectors in physics experiments. It comprises nine academic participants, three industrial partners and two associated academic partners. 17 recruited Early Stage and 5 Experienced Researchers have performed their scientific work in the network. The research and development work of MC-PAD is organized in 12 work packages, which focus on a large variety of aspects of particle detector development, electronics as well as simulation and modelling. The network was established in November 2008 and lasted until October 2012 (48 months). This report describes the R&D activities and highlights the main results achieved during this period.

  7. New features in McStas, version 1.5

    DEFF Research Database (Denmark)

    Åstrand, P.O.; Lefmann, K.; Farhi, E.

    2002-01-01

    The neutron ray-tracing simulation package McStas has attracted numerous users, and the development of the package continues with version 1.5 released at the ICNS 2001 conference. New features include: support for neutron polarisation, labelling of neutrons, realistic source and sample components, and an interface to the Riso instrument-control software TASCOM. We give a general introduction to McStas and present the latest developments. In particular, we give an example of how the neutron-label option has been used to locate the origin of a spurious side-peak observed in an experiment with RITA-1 at Riso.

  8. New features in McStas, version 1.5

    International Nuclear Information System (INIS)

    Aastrand, P.O.; Lefmann, K.; Nielsen, K.; Skaarup, P.; Farhi, E.

    2002-01-01

    The neutron ray-tracing simulation package McStas has attracted numerous users, and the development of the package continues with version 1.5 released at the ICNS 2001 conference. New features include: support for neutron polarisation, labelling of neutrons, realistic source and sample components, and interface to the Riso instrument-control software TASCOM. We give a general introduction to McStas and present the latest developments. In particular, we give an example of how the neutron-label option has been used to locate the origin of a spurious side-peak, observed in an experiment with RITA-1 at Riso. (orig.)

  9. Novel applications of the x-ray tracing software package McXtrace

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer

    2014-01-01

    We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments. In particular we will be focusing on time-resolved experiments. Simulations of full scale experiments are particularly useful for this kind, especially when ... some of the issues encountered. Generally more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvoluting them from the signal. We will present a study of this kind along with the newest developments of the McXtrace software package.

  10. Safe bunker designing for the 18 MV Varian 2100 Clinac: a comparison between Monte Carlo simulation based upon data and new protocol recommendations.

    Science.gov (United States)

    Beigi, Manije; Afarande, Fatemeh; Ghiasi, Hosein

    2016-01-01

    The aim of this study was to compare two bunkers, one designed using only protocol recommendations and one using Monte Carlo (MC) derived data, for an 18 MV Varian 2100 Clinac accelerator. High-energy radiation therapy is associated with fast and thermal photoneutrons, and adequate shielding against these contaminant neutrons is recommended by the new IAEA and NCRP protocols. The latest protocols released by the IAEA (Safety Report No. 47) and NCRP Report No. 151 were used for the bunker design calculations, and MC-based data were also derived. Two bunkers, one from the protocols and one from the MC data, were designed and discussed. For the door, the thicknesses obtained by MC simulation and by the Wu-McGinley analytical method were close in both BPE and lead. In the case of the primary and secondary barriers, MC simulation resulted in a TVL of 440.11 mm for ordinary concrete, and a total concrete thickness of 1709 mm was required. Calculating the same parameters with the recommended analytical methods resulted in a required thickness of 1762 mm, using the recommended TVL of 445 mm for concrete. Additionally, for the secondary barrier a thickness of 752.05 mm was obtained. Our results showed that MC simulation and the protocol recommendations are in good agreement for the radiation contamination dose calculation. The differences between the analytical and MC simulation methods revealed that relying on only one method for bunker design may lead to underestimation or overestimation in dose and shielding calculations.

  11. Study on Photon Transport Problem Based on the Platform of Molecular Optical Simulation Environment

    Directory of Open Access Journals (Sweden)

    Kuan Peng

    2010-01-01

    Full Text Available As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have obtained extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our method in terms of both accuracy and efficiency.

  12. Study on photon transport problem based on the platform of molecular optical simulation environment.

    Science.gov (United States)

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have obtained extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our method in terms of both accuracy and efficiency.
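
    A minimal sketch (not MOSE itself) of the core MC photon-transport loop in a homogeneous tissue slab: exponential free paths, absorption decided by the single-scattering albedo, and isotropic redirection. The optical coefficients are hypothetical, and real platforms add boundary handling, anisotropic phase functions, detector scoring, and variance reduction.

        import numpy as np

        rng = np.random.default_rng(1)
        mu_a, mu_s = 0.1, 10.0          # absorption/scattering coefficients [1/mm] (hypothetical)
        mu_t = mu_a + mu_s
        albedo = mu_s / mu_t

        def simulate_photon(max_steps=1000):
            """Random-walk one photon; returns total path length before absorption."""
            pos = np.zeros(3)
            direction = np.array([0.0, 0.0, 1.0])
            path = 0.0
            for _ in range(max_steps):
                step = -np.log(rng.random()) / mu_t   # free path sampled from Beer-Lambert
                pos = pos + step * direction
                path += step
                if rng.random() > albedo:             # photon absorbed
                    break
                # isotropic scattering (real codes sample an anisotropic phase function)
                cos_t = 2.0 * rng.random() - 1.0
                phi = 2.0 * np.pi * rng.random()
                sin_t = np.sqrt(1.0 - cos_t**2)
                direction = np.array([sin_t*np.cos(phi), sin_t*np.sin(phi), cos_t])
            return path

        paths = [simulate_photon() for _ in range(10_000)]
        print("mean path before absorption [mm]:", np.mean(paths))

    The parallelization strategy mentioned in the record amounts to running many such independent photon histories concurrently, since histories share no state.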

  13. A Proposal of New Spherical Particle Modeling Method Based on Stochastic Sampling of Particle Locations in Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Do Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jea Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Owing to its high computational efficiency and user convenience, the implicit method has received attention; however, the implicit methods of previous studies have low accuracy at high packing fractions. In this study, a new implicit method, which can be used at any packing fraction with high accuracy, is proposed: an implicit modeling method for MC simulation of media containing distributed spherical particles. A new concept for spherical particle sampling was developed to solve the problems of the previous implicit methods. The sampling method was verified by simulation in infinite and finite media. The results show that implicit particle modeling with the proposed method is accurate over the full range of packing fractions. It is expected that the proposed method can be efficiently utilized for media with distributed spherical particles, such as fusion reactor blankets, VHTR reactors, and shielding analyses.
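
    A minimal sketch (not the authors' algorithm) of the explicit baseline that implicit methods aim to replace: rejection sampling of non-overlapping sphere centers in a box. Box size, radius, and target packing fraction are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)
        L, r = 10.0, 0.5                      # box edge and sphere radius (hypothetical)
        target_pf = 0.15                      # desired packing fraction
        n_spheres = int(target_pf * L**3 / (4/3 * np.pi * r**3))

        centers = []
        while len(centers) < n_spheres:
            c = rng.uniform(r, L - r, size=3)           # keep spheres inside the box
            if all(np.linalg.norm(c - p) >= 2*r for p in centers):
                centers.append(c)                       # accept only non-overlapping centers

        print(f"placed {len(centers)} spheres, packing fraction "
              f"{len(centers)*(4/3)*np.pi*r**3/L**3:.3f}")

    Plain rejection placement stalls well below dense packings, which is one reason implicit treatments that sample particle locations on the fly during tracking are attractive at high packing fractions.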

  14. A method to generate equivalent energy spectra and filtration models based on measurement for multidetector CT Monte Carlo dosimetry simulations

    International Nuclear Information System (INIS)

    Turner, Adam C.; Zhang Di; Kim, Hyun J.; DeMarco, John J.; Cagnon, Chris H.; Angel, Erin; Cody, Dianna D.; Stevens, Donna M.; Primak, Andrew N.; McCollough, Cynthia H.; McNitt-Gray, Michael F.

    2009-01-01

    The purpose of this study was to present a method for generating x-ray source models for performing Monte Carlo (MC) radiation dosimetry simulations of multidetector row CT (MDCT) scanners. These so-called "equivalent" source models consist of an energy spectrum and filtration description that are generated based wholly on the measured values and can be used in place of proprietary manufacturer's data for scanner-specific MDCT MC simulations. Required measurements include the half value layers (HVL1 and HVL2) and the bowtie profile (exposure values across the fan beam) for the MDCT scanner of interest. Using these measured values, a method was described (a) to numerically construct a spectrum with the calculated HVLs approximately equal to those measured (equivalent spectrum) and then (b) to determine a filtration scheme (equivalent filter) that attenuates the equivalent spectrum in a similar fashion as the actual filtration attenuates the actual x-ray beam, as measured by the bowtie profile measurements. Using this method, two types of equivalent source models were generated: one using a spectrum based on both HVL1 and HVL2 measurements and its corresponding filtration scheme, and the second consisting of a spectrum based only on the measured HVL1 and its corresponding filtration scheme. Finally, a third type of source model was built based on the spectrum and filtration data provided by the scanner's manufacturer. MC simulations using each of these three source model types were evaluated by comparing the accuracy of multiple CT dose index (CTDI) simulations to measured CTDI values for 64-slice scanners from the four major MDCT manufacturers. Comprehensive evaluations were carried out for each scanner using each kVp and bowtie filter combination available. CTDI experiments were performed for both head (16 cm in diameter) and body (32 cm in diameter) CTDI phantoms using both central and peripheral measurement positions. Both equivalent source model types
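
    A minimal sketch (not the authors' code) of the numerical step the method relies on: computing the first half-value layer of a candidate spectrum so it can be matched against the measured HVL1. The discrete spectrum and aluminium attenuation values are hypothetical placeholders, and the energy-fluence-to-exposure weighting is omitted for brevity.

        import numpy as np

        # Hypothetical discrete spectrum: photon energies [keV] and relative fluence.
        E = np.array([30., 50., 70., 90., 110.])
        phi = np.array([0.1, 0.3, 0.35, 0.2, 0.05])
        mu_al = np.array([3.04, 1.03, 0.57, 0.46, 0.41])   # placeholder Al attenuation [1/cm]

        def signal(t):
            """Relative detector signal behind t cm of aluminium (exposure weighting omitted)."""
            return np.sum(phi * np.exp(-mu_al * t))

        def hvl(lo=0.0, hi=5.0, tol=1e-6):
            """Bisection for the thickness that halves the unattenuated signal."""
            target = 0.5 * signal(0.0)
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if signal(mid) > target else (lo, mid)
            return 0.5 * (lo + hi)

        print("HVL1 of candidate spectrum [cm Al]:", round(hvl(), 4))

    In the equivalent-spectrum search, candidate spectra would be adjusted until this computed HVL agrees with the measured one.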

  15. A simulation model of IT risk on program trading

    Science.gov (United States)

    Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan

    2015-12-01

    The biggest difficulty in measuring the IT risk of program trading lies in the lack of loss data. In view of this situation, the current approach of scholars is to collect reports of IT incidents at home and abroad from courts, networks and other public media, and to base quantitative analysis of IT risk loss on this database. However, an IT risk loss database established by this method can only fuzzily reflect the real situation and cannot truly explain its root causes. In this paper, based on a study of the concept and steps of MC simulation, we use a computer simulation method: the MC simulation method within the "Program trading simulation system" developed by our team is used to simulate real program trading, and IT risk loss data are obtained through its IT failure experiments. At the end of the article, the effectiveness of the experimental data is verified. In this way, the deficiency of the traditional research method is overcome and the problem of lacking IT risk data in quantitative research is solved. It also empirically provides researchers with a template for the ideas and process of using simulation methods in such studies.
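
    A minimal sketch (not the authors' system) of the frequency-severity MC loss estimation the record describes in outline: failure counts and per-failure losses are drawn from assumed distributions and aggregated per trial. All distributions and parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        n_trials = 100_000

        # Hypothetical annual model: Poisson failure count, lognormal loss per failure.
        counts = rng.poisson(lam=2.0, size=n_trials)
        losses = np.array([rng.lognormal(mean=10.0, sigma=1.2, size=k).sum()
                           for k in counts])

        print("expected annual IT loss:", losses.mean())
        print("99% value-at-risk:", np.quantile(losses, 0.99))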

  16. Multilevel Monte Carlo methods using ensemble level mixed MsFEM for two-phase flow and transport simulations

    KAUST Repository

    Efendiev, Yalchin R.

    2013-08-21

    In multilevel Monte Carlo (MLMC) methods, more accurate (and expensive) forward simulations are run with fewer samples, while less accurate (and inexpensive) forward simulations are run with a larger number of samples. Selecting the number of expensive and inexpensive simulations based on the number of coarse degrees of freedom, one can show that MLMC methods can provide better accuracy at the same cost as Monte Carlo (MC) methods. The main objective of the paper is twofold. First, we would like to compare NLSO and LSO mixed MsFEMs. Further, we use both approaches in the context of MLMC to speed up MC calculations. © 2013 Springer Science+Business Media Dordrecht.
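
    A minimal sketch (not the paper's solver) of the generic two-level MLMC telescoping estimator, E[P_fine] = E[P_coarse] + E[P_fine - P_coarse], with many cheap coarse samples and few expensive correction samples. The toy "solver" with a resolution-dependent bias is hypothetical.

        import numpy as np

        rng = np.random.default_rng(4)

        def solve(omega, n_cells):
            """Hypothetical forward solver: accuracy improves with n_cells."""
            # Toy quantity of interest with a resolution-dependent bias term.
            return np.sin(omega) + 1.0 / n_cells * np.cos(3 * omega)

        def mlmc_two_level(n_coarse=100_000, n_fine=1_000):
            w_c = rng.standard_normal(n_coarse)
            w_f = rng.standard_normal(n_fine)
            coarse_term = np.mean([solve(w, 16) for w in w_c])    # cheap, many samples
            # Correction term: fine and coarse solves share the SAME random input,
            # so the difference has small variance and needs few samples.
            corr_term = np.mean([solve(w, 256) - solve(w, 16) for w in w_f])
            return coarse_term + corr_term

        print("MLMC estimate of E[QoI]:", mlmc_two_level())

    Coupling the two levels through a shared sample is what makes the correction variance, and hence the number of expensive solves, small.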

  17. Probability-neighbor method of accelerating geometry treatment in reactor Monte Carlo code RMC

    International Nuclear Information System (INIS)

    She, Ding; Li, Zeguang; Xu, Qi; Wang, Kan; Yu, Ganglin

    2011-01-01

    The probability-neighbor method (PNM) is proposed in this paper to accelerate the geometry treatment of Monte Carlo (MC) simulation, and it is validated in the self-developed reactor Monte Carlo code RMC. During MC simulation by either the ray-tracking or the delta-tracking method, a large amount of time is spent finding out which cell a particle is located in. The traditional way is to search the cells one by one in a fixed, predefined sequence; however, this procedure becomes very time-consuming when the system contains a large number of cells. Considering that particles have different probabilities of entering different cells, the PNM optimizes the searching sequence, i.e., the cells with larger probability are searched preferentially. The PNM is implemented in the RMC code, and the numerical results show that considerable geometry-treatment time is saved in MC calculations for complicated systems; the method is especially effective in delta-tracking simulation. (author)
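
    A minimal sketch (not the RMC implementation) of the idea: keep per-cell hit counters from earlier histories and test candidate cells in descending empirical-probability order. The geometry predicate is a hypothetical callback.

        from collections import Counter

        class ProbabilityNeighborSearch:
            """Order candidate cells by how often particles were previously found in them."""
            def __init__(self, cell_ids):
                self.hits = Counter({c: 0 for c in cell_ids})

            def locate(self, point, contains):
                # 'contains(cell_id, point)' is a hypothetical geometry predicate.
                for cell in sorted(self.hits, key=self.hits.get, reverse=True):
                    if contains(cell, point):
                        self.hits[cell] += 1   # reinforce the search order
                        return cell
                raise LookupError("point not in any cell")

        # Usage with a toy 1D geometry: cell i spans [i, i+1).
        search = ProbabilityNeighborSearch(range(10))
        contains = lambda cell, x: cell <= x < cell + 1
        for x in [3.2, 3.7, 3.9, 8.1]:
            print(x, "->", search.locate(x, contains))

    After a few histories, frequently visited cells bubble to the front of the search order, which is why the gain is largest in delta-tracking, where location queries dominate.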

  18. MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes

    Science.gov (United States)

    Fonseca, T. C. F.; Mendes, B. M.; Lacerda, M. A. S.; Silva, L. A. C.; Paixão, L.; Bastos, F. M.; Ramirez, J. V.; Junior, J. P. R.

    2017-11-01

    The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport and in modelling and simulation applied to radiation protection and dosimetry research. For its first inter-comparison task, the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories and to validate the simulated results by comparing them with experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10 × 10 cm². This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations were performed reproducing the experimental TPR20,10 quality index, which provides a satisfactory description of both the PDD curve and the transverse profiles at the two measured depths. This paper reports in detail the modelling process using the MCNPx, MCNP6, EGSnrc and Penelope Monte Carlo codes, the source and tally descriptions, the validation processes and the results.
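
    A minimal sketch (not the MCMEG exercise) of how the two quality indices reduce depth-dose data: PDD20,10 is the ratio of doses at 20 cm and 10 cm depth on a depth-dose curve, and TPR20,10 can be estimated from it with the empirical TRS-398 relation TPR20,10 = 1.2661 * PDD20,10 - 0.0595. The sample curve below is hypothetical.

        import numpy as np

        # Hypothetical measured PDD curve: depth [cm] vs dose [%] for a 6 MV beam.
        depth = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
        dose = np.array([86.0, 67.0, 52.0, 39.0, 30.0])

        d10 = np.interp(10.0, depth, dose)
        d20 = np.interp(20.0, depth, dose)
        pdd_20_10 = d20 / d10
        tpr_20_10 = 1.2661 * pdd_20_10 - 0.0595   # TRS-398 empirical conversion

        print(f"PDD20,10 = {pdd_20_10:.3f}, TPR20,10 ~ {tpr_20_10:.3f}")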

  19. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies

    Energy Technology Data Exchange (ETDEWEB)

    Häggström, Ida, E-mail: haeggsti@mskcc.org [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 and Department of Radiation Sciences, Umeå University, Umeå 90187 (Sweden); Beattie, Bradley J.; Schmidtlein, C. Ross [Department of Medical Physics, Memorial Sloan Kettering Cancer Center, New York, New York 10065 (United States)

    2016-06-15

    Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby the effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region-of-interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models it may not be suitable for

  20. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies

    International Nuclear Information System (INIS)

    Häggström, Ida; Beattie, Bradley J.; Schmidtlein, C. Ross

    2016-01-01

    Purpose: To develop and evaluate a fast and simple tool called dPETSTEP (Dynamic PET Simulator of Tracers via Emission Projection) for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. Methods: The tool was developed in MATLAB using both new and previously reported modules of PETSTEP (PET Simulator of Tracers via Emission Projection). Time activity curves are generated for each voxel of the input parametric image, whereby the effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time activity curves (GAUSS). Results: dPETSTEP was 8000 times faster than MC. Dynamic images from dPETSTEP had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dPETSTEP and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dPETSTEP images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region-of-interest histogram comparisons that showed no significant differences (p < 0.01). Compared to GAUSS, dPETSTEP images and noise properties agreed better with MC. Conclusions: The authors have developed a fast and easy one-stop solution for simulations of dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with very similar noise properties to those of MC images, in a fraction of the time. They believe dPETSTEP to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models it may not be suitable for
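
    A minimal sketch (not dPETSTEP) contrasting the noise routes the records compare: a "GAUSS"-style simulation adds Gaussian noise directly to a time activity curve, while projection-based simulators inject counting noise in sinogram space before reconstruction. Only the simple TAC route is shown, with a hypothetical mono-exponential tracer model and frame timing.

        import numpy as np

        rng = np.random.default_rng(5)

        frame_mid = np.array([0.5, 1.5, 3.0, 5.0, 10.0, 20.0, 40.0])  # minutes (hypothetical)
        frame_len = np.array([1.0, 1.0, 2.0, 2.0, 8.0, 12.0, 28.0])

        tac = 50.0 * np.exp(-0.1 * frame_mid)   # hypothetical tracer kinetics [kBq/ml]

        # Gaussian noise whose variance scales inversely with frame duration,
        # mimicking fewer counts in short frames.
        sigma = 5.0 / np.sqrt(frame_len)
        noisy_tac = tac + rng.normal(0.0, sigma)

        for t, a in zip(frame_mid, noisy_tac):
            print(f"t={t:5.1f} min  activity={a:6.2f}")

    The records' point is that this shortcut misses reconstruction-induced noise correlations, which projection-and-reconstruct simulation reproduces much more faithfully.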

  1. McClean Lake. Site Guide

    International Nuclear Information System (INIS)

    2016-09-01

    Located over 700 kilometers northeast of Saskatoon, Areva's McClean Lake site comprises several uranium mines and one of the most technologically advanced uranium mills in the world - the only mill designed to process high-grade uranium ore without dilution. Areva has operated several open-pit uranium mines at the McClean Lake site, and is evaluating future mines at and near the site. The McClean Lake mill has recently undergone a multimillion-dollar upgrade and expansion, which has doubled its annual production capacity of uranium concentrate to 24 million pounds. The mill processes the ore from the Cigar Lake mine, the world's second largest and highest-grade uranium mine. The McClean Lake site operates 365 days a year on a week-in/week-out rotation schedule for workers, over 50% of whom reside in northern Saskatchewan communities. Tailings are waste products resulting from milling uranium ore. This waste is made up of leach residue solids, waste solutions and chemical precipitates that are carefully engineered for long-term disposal. The tailings management facility (TMF) serves as the repository for all resulting tailings. This facility allows proper waste management, which minimizes potential adverse environmental effects. Mining projections indicate that the McClean Lake mill will produce tailings in excess of the existing capacity of the TMF; after evaluating a number of options, Areva has decided to pursue an expansion of this facility. Areva is developing the Surface Access Borehole Resource Extraction (SABRE) mining method, which uses a high-pressure water jet placed at the bottom of a drill hole to extract ore. Areva has conducted a series of tests with this method and is evaluating its potential for future mining operations. McClean Lake maintains its certification in ISO 14001 standards for environmental management and OHSAS 18001 standards for occupational health and safety.

  2. Three dimensional electrochemical simulation of solid oxide fuel cell cathode based on microstructure reconstructed by marching cubes method

    Science.gov (United States)

    He, An; Gong, Jiaming; Shikazono, Naoki

    2018-05-01

    In the present study, a model is introduced to correlate the electrochemical performance of a solid oxide fuel cell (SOFC) with the 3D microstructure reconstructed by focused ion beam scanning electron microscopy (FIB-SEM), in which the solid surface is modeled by the marching cubes (MC) method. The lattice Boltzmann method (LBM) is used to solve the governing equations. In order to preserve the geometries reconstructed by the MC method, local effective diffusivities and conductivities computed from the MC geometries are applied in each grid cell, and a partial bounce-back scheme is applied according to the boundary predicted by the MC method. From the tortuosity factor and overpotential calculation results, it is concluded that the MC geometry drastically improves the computational accuracy by giving more precise topology information.

  3. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01

    Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

  4. Study on Photon Transport Problem Based on the Platform of Molecular Optical Simulation Environment

    Science.gov (United States)

    Peng, Kuan; Gao, Xinbo; Liang, Jimin; Qu, Xiaochao; Ren, Nunu; Chen, Xueli; Ma, Bin; Tian, Jie

    2010-01-01

    As an important molecular imaging modality, optical imaging has attracted increasing attention in recent years. Since physical experiments are usually complicated and expensive, research methods based on simulation platforms have obtained extensive attention. We developed a simulation platform named Molecular Optical Simulation Environment (MOSE) to simulate photon transport in both biological tissues and free space for optical imaging based on noncontact measurement. In this platform, the Monte Carlo (MC) method and the hybrid radiosity-radiance theorem are used to simulate photon transport in biological tissues and free space, respectively, so both contact and noncontact measurement modes of optical imaging can be simulated properly. In addition, a parallelization strategy for the MC method is employed to improve the computational efficiency. In this paper, we study photon transport problems in both biological tissues and free space using MOSE. The results are compared with Tracepro, the simplified spherical harmonics method (SPn), and physical measurement to verify the performance of our method in terms of both accuracy and efficiency. PMID:20445737

  5. Simulation for developing new pulse neutron spectrometers I. Creation of new McStas components of moderators of JSNS

    CERN Document Server

    Tamura, I; Arai, M; Harada, M; Maekawa, F; Shibata, K; Soyama, K

    2003-01-01

    Moderator components for the McStas code have been created for the design of JSNS instruments. Three cryogenic moderators are adopted in JSNS: one is a coupled H2 moderator for high-intensity experiments, and the other two are decoupled H2 moderators, poisoned or unpoisoned, for high resolution. Since the characteristics of the neutron beams generated by the moderators influence the performance of pulse neutron spectrometers, it is important to perform the Monte Carlo simulation with precisely written neutron source components. The neutron spectrum and time structure were calculated using the NMTC/JAERI97 and MCNP4a codes. The simulation parameters, which describe the pulse shape over the entire spectrum as a function of time, were optimized. In this paper, the creation of neutron source components for port No. 16, which views the coupled H2 moderator, and for port No. 11, which views the decoupled H2 moderator of JSNS, is reported.

  6. Ultrafast cone-beam CT scatter correction with GPU-based Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-03-01

    Full Text Available Purpose: Scatter artifacts severely degrade the image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: (1) FDK reconstruction using the raw projection data; (2) rigid registration of the planning CT to the FDK results; (3) MC scatter calculation at sparse view angles using the planning CT; (4) interpolation of the calculated scatter signals to the other angles; (5) removal of scatter from the raw projections; (6) FDK reconstruction using the scatter-corrected projections. In addition to using a GPU to accelerate the MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce the computation time. A novel denoising algorithm is used to eliminate MC noise from the simulated scatter images caused by the low photon numbers. The method is validated on a simulated head-and-neck case with 364 projection angles. Results: We have examined the variation of the scatter signal among projection angles using Fourier analysis. It is found that scatter images at 31 angles are sufficient to restore those at all angles with < 0.1% error. For the simulated patient case with a resolution of 512 × 512 × 100, we simulated 5 × 10^6 photons per angle. The total computation time is 20.52 seconds on an Nvidia GTX Titan GPU, and the time at each step is 2.53, 0.64, 14.78, 0.13, 0.19, and 2.25 seconds, respectively. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed that accomplishes the whole procedure of scatter correction and reconstruction within 30 seconds.
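
    A minimal sketch (not the paper's GPU pipeline) of step (4): interpolating scatter estimates computed at sparse view angles to all views, here with simple periodic linear interpolation per detector pixel. Array shapes and the random stand-in for the MC scatter images are hypothetical.

        import numpy as np

        n_views, n_u, n_v = 364, 64, 48           # full scan views and detector grid (hypothetical)
        sparse_idx = np.linspace(0, n_views, 31, endpoint=False).astype(int)

        # Pretend the MC step produced scatter images at the 31 sparse angles.
        rng = np.random.default_rng(6)
        scatter_sparse = rng.random((31, n_u, n_v))

        # Periodic linear interpolation to every view angle, pixel by pixel.
        angles = np.arange(n_views)
        wrap_idx = np.append(sparse_idx, sparse_idx[0] + n_views)   # close the circle
        wrap_val = np.concatenate([scatter_sparse, scatter_sparse[:1]], axis=0)
        scatter_full = np.empty((n_views, n_u, n_v))
        for u in range(n_u):
            for v in range(n_v):
                scatter_full[:, u, v] = np.interp(angles, wrap_idx, wrap_val[:, u, v])

        # Step (5) would then subtract scatter_full from the raw projections.
        print(scatter_full.shape)

    The Fourier analysis mentioned in the record is what justifies 31 angles: the scatter signal varies slowly with view angle, so its angular spectrum is band-limited.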

  7. A Randomized Controlled Trial Comparing the McKenzie Method to Motor Control Exercises in People With Chronic Low Back Pain and a Directional Preference.

    Science.gov (United States)

    Halliday, Mark H; Pappas, Evangelos; Hancock, Mark J; Clare, Helen A; Pinto, Rafael Z; Robertson, Gavin; Ferreira, Paulo H

    2016-07-01

    Study Design Randomized clinical trial. Background Motor control exercises are believed to improve coordination of the trunk muscles. It is unclear whether increases in trunk muscle thickness can be facilitated by approaches such as the McKenzie method. Furthermore, it is unclear which approach may have superior clinical outcomes. Objectives The primary aim was to compare the effects of the McKenzie method and motor control exercises on trunk muscle recruitment in people with chronic low back pain classified with a directional preference. The secondary aim was to conduct a between-group comparison of outcomes for pain, function, and global perceived effect. Methods Seventy people with chronic low back pain who demonstrated a directional preference using the McKenzie assessment were randomized to receive 12 treatments over 8 weeks with the McKenzie method or with motor control approaches. All outcomes were collected at baseline and at 8-week follow-up by blinded assessors. Results No significant between-group difference was found for trunk muscle thickness of the transversus abdominis (-5.8%; 95% confidence interval [CI]: -15.2%, 3.7%), obliquus internus (-0.7%; 95% CI: -6.6%, 5.2%), and obliquus externus (1.2%; 95% CI: -4.3%, 6.8%). Perceived recovery was slightly superior in the McKenzie group (-0.8; 95% CI: -1.5, -0.1) on a -5 to +5 scale. No significant between-group differences were found for pain or function (P = .99 and P = .26, respectively). Conclusion We found no significant effect of treatment group for trunk muscle thickness. Participants reported a slightly greater sense of perceived recovery with the McKenzie method than with the motor control approach. Level of Evidence Therapy, level 1b-. Registered September 7, 2011 at www.anzctr.org.au (ACTRN12611000971932). J Orthop Sports Phys Ther 2016;46(7):514-522. Epub 12 May 2016. doi:10.2519/jospt.2016.6379.

  8. Multi-scale Modeling of Compressible Single-phase Flow in Porous Media using Molecular Simulation

    KAUST Repository

    Saad, Ahmed Mohamed

    2016-05-01

    In this study, an efficient coupling between Monte Carlo (MC) molecular simulation and Darcy-scale flow in porous media is presented. The cell-centered finite difference method with a non-uniform rectangular mesh was used to discretize the simulation domain and solve the governing equations. To speed up the MC simulations, we implemented a recently developed scheme that quickly generates MC Markov chains from pre-computed ones, based on the reweighting and reconstruction algorithm. This method dramatically reduces the computational time required by the MC simulations, from hours to seconds. In addition, the reweighting and reconstruction scheme, which was originally designed to work with the LJ potential model, is extended to work with a potential model that accounts for the molecular quadrupole moment of fluids with non-spherical molecules such as CO2. The potential model was used to simulate the thermodynamic equilibrium properties of single-phase and two-phase systems using the canonical ensemble and the Gibbs ensemble, respectively. Comparing the simulation results with experimental data showed that the implemented model provides an excellent fit, outperforming the standard LJ model. To demonstrate the strength of the proposed coupling in terms of computational time efficiency and numerical accuracy of fluid properties, various numerical experiments covering different compressible single-phase flow scenarios were conducted. The novelty of the introduced scheme lies in allowing an efficient coupling of the molecular scale and the Darcy scale in reservoir simulators. This leads to an accurate description of the thermodynamic behavior of the simulated reservoir fluids, consequently enhancing confidence in the flow predictions in porous media.
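
    A minimal sketch (not the thesis code) of the reweighting idea behind reusing pre-computed MC chains: configurations sampled at one temperature are reweighted by Boltzmann factors to estimate averages at a nearby state point. The toy energy samples are hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)
        kB = 1.0

        # Pretend these are potential energies from a pre-computed chain at T0.
        T0, T1 = 1.0, 0.9
        U = rng.normal(loc=-50.0, scale=3.0, size=100_000)

        # Reweight each configuration from beta0 to beta1.
        beta0, beta1 = 1.0 / (kB * T0), 1.0 / (kB * T1)
        logw = -(beta1 - beta0) * U
        w = np.exp(logw - logw.max())          # stabilize before normalizing

        U_at_T1 = np.sum(w * U) / np.sum(w)
        print("reweighted <U> at T1:", U_at_T1)

    The reconstruction step in the cited scheme goes further and rebuilds chains at the new state point, but the weight ratio above is the common core of such approaches.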

  9. A comparison of modifications of the McMaster method for the enumeration of Ascaris suum eggs in pig faecal samples.

    Science.gov (United States)

    Pereckiene, A; Kaziūnaite, V; Vysniauskas, A; Petkevicius, S; Malakauskas, A; Sarkūnas, M; Taylor, M A

    2007-10-21

    The comparative efficacies of seven published McMaster method modifications for faecal egg counting were evaluated on pig faecal samples containing Ascaris suum eggs. Comparisons were made as to the number of samples found to be positive by each of the methods, the total egg counts per gram (EPG) of faeces, the variation in EPG obtained in the samples examined, and the ease of use of each method. Each method was evaluated by examining 30 faecal samples. Positive samples were identified by counting A. suum eggs in one, two and three sections of a newly designed McMaster chamber. The methods compared in the present study were reported by: I - Henriksen and Aagaard [Henriksen, S.A., Aagaard, K.A., 1976. A simple flotation and McMaster method. Nord. Vet. Med. 28, 392-397]; II - Kassai [Kassai, T., 1999. Veterinary Helminthology. Butterworth-Heinemann, Oxford, 260 pp.]; III and IV - Urquhart et al. [Urquhart, G.M., Armour, J., Duncan, J.L., Dunn, A.M., Jennings, F.W., 1996. Veterinary Parasitology, 2nd ed. Blackwell Science Ltd., Oxford, UK, 307 pp.] (centrifugation and non-centrifugation methods); V and VI - Grønvold [Grønvold, J., 1991. Laboratory diagnoses of helminths - common routine methods used in Denmark. In: Nansen, P., Grønvold, J., Bjørn, H. (Eds.), Seminars on Parasitic Problems in Farm Animals Related to Fodder Production and Management. The Estonian Academy of Sciences, Tartu, Estonia, pp. 47-48] (salt solution, and salt and glucose solution); VII - Thienpont et al. [Thienpont, D., Rochette, F., Vanparijs, O.F.J., 1986. Diagnosing Helminthiasis by Coprological Examination, 2nd ed. Janssen Research Foundation, Beerse, Belgium, 205 pp.]. The proportion of samples found positive by examining a single section ranged from 98.9% (method I) to 51.1% (method VII). Only with methods I and II was 100% positivity obtained in two out of three of the chambers examined, and the FECs obtained using these methods were significantly higher.
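
    A minimal sketch (not from the paper) of the standard McMaster EPG arithmetic that all such modifications share: eggs counted under the grid are scaled by the suspension volume, the chamber volume examined, and the sample mass. The numbers below are hypothetical, and the resulting multiplier is exactly what differs between the modifications compared above.

        def mcmaster_epg(eggs_counted, faeces_g=4.0, flotation_ml=60.0,
                         chamber_ml=0.15, n_chambers=2):
            """EPG = eggs counted / volume examined * total volume / sample mass."""
            volume_examined = chamber_ml * n_chambers
            return eggs_counted / volume_examined * flotation_ml / faeces_g

        # 12 eggs seen across two 0.15 ml chambers with a 4 g / 60 ml suspension:
        print(mcmaster_epg(12))   # -> 600.0 EPG, i.e. a multiplier of 50 per egg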

  10. A virtual source model for Monte Carlo simulation of helical tomotherapy.

    Science.gov (United States)

    Yuan, Jiankui; Rong, Yi; Chen, Quan

    2015-01-08

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate dose in the patient for helical tomotherapy without the need to calculate phase-space files (PSFs). Current studies on tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of the radiation source, the primary and secondary jaws, and the multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with the leaf filter, jaw penumbra, and leaf latency obtained from the sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of < 1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified by comparing the measured and simulated results for a Picket Fence pattern. An agreement of < 2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with film measurements, with 98% of planar dose pixels passing the 2%/2 mm gamma criterion. For patient treatment plans, the results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). Deviations observed in this study were consistent
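
    A minimal sketch (not the published VSM) of turning a sinogram of per-leaf open times into a sampling fluence map by weighting each leaf channel with its LOT and a leaf-shape filter; the array shapes and the triangular filter are hypothetical, and jaw penumbra and leaf latency are omitted.

        import numpy as np

        rng = np.random.default_rng(8)
        n_proj, n_leaves = 51, 64
        lot = rng.random((n_proj, n_leaves))      # hypothetical fractional leaf open times

        # Hypothetical leaf-shape filter smearing intensity into neighboring channels.
        leaf_filter = np.array([0.1, 0.8, 0.1])

        fluence = np.apply_along_axis(
            lambda row: np.convolve(row, leaf_filter, mode="same"), 1, lot)

        # Sample a (projection, leaf) pair proportional to fluence for MC source sampling.
        p = fluence.ravel() / fluence.sum()
        idx = rng.choice(fluence.size, p=p)
        proj, leaf = divmod(idx, n_leaves)
        print("sampled projection", proj, "leaf", leaf)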

  11. M.C. simulation of GEM neutron beam monitor with 10B

    International Nuclear Information System (INIS)

    Wang Yanfeng; Sun Zhijia; Liu Ben; Zhou Jianrong; Yang Guian; Dong Jing; Xu Hong; Zhou Liang; Huang Guangming; Yang Lei; Li Yi

    2010-01-01

    The neutron beam monitor based on a GEM detector has been carefully studied with the Monte Carlo method in this article. The simulation framework includes ANSYS and Garfield, which were used to compute the electric field of the GEM foils and to simulate the movement of electrons in the gas mixture, respectively. The focusing and extraction coefficients of the GEM foils have been obtained. Based on these preliminary results, the performance of the monitor has been improved. (authors)

  12. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    Energy Technology Data Exchange (ETDEWEB)

    Souris, Kevin, E-mail: kevin.souris@uclouvain.be; Lee, John Aldo [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and ICTEAM Institute, Université catholique de Louvain, Louvain-la-Neuve 1348 (Belgium); Sterpin, Edmond [Center for Molecular Imaging and Experimental Radiotherapy, Institut de Recherche Expérimentale et Clinique, Université catholique de Louvain, Avenue Hippocrate 54, 1200 Brussels, Belgium and Department of Oncology, Katholieke Universiteit Leuven, O& N I Herestraat 49, 3000 Leuven (Belgium)

    2016-04-15

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, the simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
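
    A minimal sketch (not MCsquare) of the class-II condensed-history idea the record describes: energy losses below a threshold are applied continuously along each step via a restricted stopping power, while harder ionizations are sampled as discrete events. All cross-section numbers are hypothetical placeholders and the 1/T^2-like delta-ray spectrum is a toy choice.

        import numpy as np

        rng = np.random.default_rng(9)

        threshold = 0.1        # MeV; below this, losses are grouped (hypothetical)
        restricted_sp = 0.5    # MeV/cm continuous (restricted) stopping power, hypothetical
        hard_rate = 0.2        # hard ionizations per cm, hypothetical

        def transport(energy=200.0, step=0.5):
            """March a proton until it stops, mixing continuous and discrete losses."""
            depth = 0.0
            while energy > 1.0:
                energy -= restricted_sp * step                      # soft losses, grouped
                if rng.random() < 1.0 - np.exp(-hard_rate * step):  # a hard ionization
                    # sample a delta-ray energy above threshold (toy 1/T^2-like spectrum)
                    t = threshold / (1.0 - rng.random() * 0.99)
                    energy -= min(t, energy)
                depth += step
            return depth

        print("toy range [cm]:", np.mean([transport() for _ in range(200)]))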

  13. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures

    International Nuclear Information System (INIS)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-01-01

    Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the latest generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable for MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, the simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. The optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.

  14. Monte Carlo simulation of a medical accelerator: application on a heterogeneous phantom

    International Nuclear Information System (INIS)

    Serrano, B.; Franchisseur, E.; Hachem, A.; Herault, J.; Marcie, S.; Bensadoun, R.J.

    2005-01-01

    The objective of this study is to find an accurate and efficient method to calculate the dose distribution for small fields in highly heterogeneous regions with steep dose gradients, typical of the Intensity Modulated Radiation Therapy (IMRT) technique in head and neck regions. This motivates a Monte Carlo (MC) simulation of the photon beam for the two nominal potential energies of 25 and 6 MV delivered by a medical linear electron accelerator (Linac) used at the Centre Antoine Lacassagne. These calculations were checked against measurements with an ionization chamber (IC). Initial adjustments of the parameters given by the manufacturer for the 25 and 6 MV data were applied to optimize the agreement between the IC measurements and the MC simulation for the depth-dose and dose profile distributions. Good agreement between the MC-calculated and the measured data is only obtained when the mean energies of the electron beams are 15 MeV and 5.2 MeV, respectively, with corresponding spot size diameters of 2 and 3 mm. Once the MC simulation of the Linac is validated, these results allow us, in a second part, to check the dose calculations of a treatment planning system (TPS) on a heterogeneous phantom. The results show discrepancies of up to 7% between the TPS and the MC simulation. These differences stem from a poor approximation of the material density by the TPS. These encouraging results will subsequently allow us to check the dose distributions computed by the TPS for IMRT treatments. (authors)

  15. Application of quality assurance to MC and A systems

    International Nuclear Information System (INIS)

    Skinner, A.J.; Delvin, W.L.

    1986-01-01

    The principles of quality assurance have been applied to MC and A at DOE's Savannah River Operations Office. The principles were applied to the functions within the MC and A Branch, including both the functions used to operate the Branch and those used to review the MC and A activities of DOE/SR's contractor. The purpose of this paper is to discuss that application of quality assurance and to show how the principles of quality assurance relate to the functions of an MC and A system, for both a DOE field office and a contractor. The principles (presented as requirements from the NQA-1 standard) are briefly discussed, a method for applying quality assurance is outlined, application at DOE/SR is shown, and application to a contractor's MC and A system is discussed

  16. Evaluation of accuracy and precision of a smartphone based automated parasite egg counting system in comparison to the McMaster and Mini-FLOTAC methods.

    Science.gov (United States)

    Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K

    2017-11-30

    Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove the operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine the precision of the first smartphone prototype, the modified McMaster, and ImageJ; and (2) to determine the precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg count negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using the second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. In regards to the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ, while the smartphone prototype and ImageJ performed with equal variance. In regards to the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster; accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster and the smartphone system, while smartphone and McMaster counts did not have statistically different accuracies

  17. Taylor-expansion Monte Carlo simulations of classical fluids in the canonical and grand canonical ensemble

    International Nuclear Information System (INIS)

    Schoen, M.

    1995-01-01

    In this article the Taylor-expansion method is introduced, by which Monte Carlo (MC) simulations in the canonical ensemble can be sped up significantly. Substantial gains in computational speed of 20-40% over conventional implementations of the MC technique are obtained over a wide range of densities in homogeneous bulk phases. The basic philosophy behind the Taylor-expansion method is a division of the neighborhood of each atom (or molecule) into three different spatial zones. Interactions between atoms belonging to each zone are treated at different levels of computational sophistication. For example, only interactions between atoms belonging to the primary zone immediately surrounding an atom are treated explicitly before and after displacement. The change in the configurational energy contribution from secondary-zone interactions is obtained from the first-order term of a Taylor expansion of the configurational energy in terms of the displacement vector d. Interactions with atoms in the tertiary zone adjacent to the secondary zone are neglected throughout. The Taylor-expansion method is not restricted to the canonical ensemble but may be employed to enhance the computational efficiency of MC simulations in other ensembles as well. This is demonstrated for grand canonical ensemble MC simulations of an inhomogeneous fluid, which can be performed essentially on a modern personal computer.
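
    A minimal sketch of the three-zone bookkeeping just described: for a trial displacement d of atom i, primary-zone pair energies are recomputed exactly, the secondary-zone contribution is approximated by the first-order Taylor term grad(U)·d, and tertiary-zone neighbours are ignored. The Lennard-Jones potential and zone radii are assumptions for illustration, not the article's system.

        import numpy as np

        def lj(r2):
            """Lennard-Jones pair energy vs squared separation (reduced units)."""
            inv6 = r2 ** -3
            return 4.0 * (inv6 * inv6 - inv6)

        def lj_grad_i(rij, r2):
            """Gradient of the pair energy with respect to the position of atom i."""
            dudr2 = 4.0 * (-6.0 * r2 ** -7 + 3.0 * r2 ** -4)
            return -2.0 * dudr2 * rij

        def delta_energy(pos, i, d, r1=2.5, r2max=4.0):
            """Energy change for displacing atom i by d: primary zone exact,
            secondary zone first-order Taylor, tertiary zone neglected."""
            dU, grad = 0.0, np.zeros(3)
            for j in range(len(pos)):
                if j == i:
                    continue
                rij = pos[j] - pos[i]
                s2 = rij @ rij
                if s2 < r1 * r1:                     # primary: recompute exactly
                    rij_new = pos[j] - (pos[i] + d)
                    dU += lj(rij_new @ rij_new) - lj(s2)
                elif s2 < r2max * r2max:             # secondary: accumulate gradient
                    grad += lj_grad_i(rij, s2)
            return dU + grad @ d                     # add the first-order Taylor term

        pos = np.random.default_rng(0).uniform(0.0, 8.0, size=(50, 3))
        print(delta_energy(pos, i=0, d=np.array([0.05, 0.0, 0.0])))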

  18. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becchetti, M; Tian, X; Segars, P; Samei, E [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed-up of the process was attained by using a low number of photon histories together with kernel denoising of the scattered-photon projections. These FBP reconstructed images were validated against those acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images obtained using projection kernel smoothing required 3 orders of magnitude less computation time than a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time compared with standard simulation approaches.

  19. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    International Nuclear Information System (INIS)

    Becchetti, M; Tian, X; Segars, P; Samei, E

    2015-01-01

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed-up of the process was attained by using a low number of photon histories together with kernel denoising of the scattered-photon projections. These FBP reconstructed images were validated against those acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images obtained using projection kernel smoothing required 3 orders of magnitude less computation time than a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time compared with standard simulation approaches
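
    The kernel denoising step above exploits the fact that the scatter signal is spatially low-frequency, so a low-statistics MC scatter projection can be heavily smoothed without losing the underlying signal. A minimal sketch; the Gaussian kernel width is an assumed value, not the one used by the authors.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def denoise_scatter(scatter_proj, sigma_px=8.0):
            """Suppress MC noise in a scatter projection estimated from few histories.
            sigma_px is a hypothetical kernel width in detector pixels."""
            return gaussian_filter(scatter_proj, sigma=sigma_px)

        # Toy usage: smooth 10 noisy scatter frames before combining with primaries.
        noisy = np.random.default_rng(0).poisson(5.0, size=(10, 64, 64)).astype(float)
        clean = np.stack([denoise_scatter(p) for p in noisy])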

  20. Comparison between McMaster and Mini-FLOTAC methods for the enumeration of Eimeria maxima oocysts in poultry excreta.

    Science.gov (United States)

    Bortoluzzi, C; Paras, K L; Applegate, T J; Verocai, G G

    2018-04-30

    Monitoring Eimeria shedding has become more important due to the recent restrictions on the use of antibiotics within the poultry industry. Therefore, there is a need for the implementation of more precise and accurate quantitative diagnostic techniques. The objective of this study was to compare the precision and accuracy of the Mini-FLOTAC and McMaster techniques for quantitative diagnosis of Eimeria maxima oocysts in poultry. Twelve pools of excreta samples of broiler chickens experimentally infected with E. maxima were analyzed for the comparison between the Mini-FLOTAC and McMaster techniques, using the detection limits (dl) of 23 and 25, respectively. Additionally, six excreta samples were used to compare the precision of different dl (5, 10, 23, and 46) using the Mini-FLOTAC technique. For precision comparisons, five technical replicates of each sample (five replicate slides on one excreta slurry) were read for calculating the mean oocysts per gram of excreta (OPG) count, standard deviation (SD), coefficient of variation (CV), and precision of both aforementioned comparisons. To compare accuracy between the methods (McMaster, and Mini-FLOTAC dl 5 and 23), excreta from uninfected chickens was spiked with 100, 500, 1,000, 5,000, or 10,000 OPG; additional samples remained unspiked (negative control). For each spiking level, three samples were read in triplicate, totaling nine reads per spiking level per technique. Data were transformed using log10 to obtain normality and homogeneity of variances. A significant correlation (R = 0.74; p = 0.006) was observed between the mean OPG of the McMaster dl 25 and the Mini-FLOTAC dl 23. Mean OPG, CV, SD, and precision were not statistically different between the McMaster dl 25 and Mini-FLOTAC dl 23. Despite the absence of statistical difference (p > 0.05), Mini-FLOTAC dl 5 showed a numerically lower SD and CV than Mini-FLOTAC dl 23. The Pearson correlation coefficient revealed significant and positive
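
    As a worked example of how the detection limit (dl) acts as a multiplication factor in these chamber techniques, eggs (or oocysts) per gram follow directly from the chamber count. The count below is invented for illustration.

        eggs_counted = 18                     # hypothetical chamber count
        dl_mcmaster, dl_miniflotac = 25, 23   # detection limits compared in the study
        print(eggs_counted * dl_mcmaster)     # 450 OPG via McMaster (dl 25)
        print(eggs_counted * dl_miniflotac)   # 414 OPG via Mini-FLOTAC (dl 23)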

  1. OpenMC: A state-of-the-art Monte Carlo code for research and development

    International Nuclear Information System (INIS)

    Romano, Paul K.; Horelik, Nicholas E.; Herman, Bryan R.; Nelson, Adam G.; Forget, Benoit; Smith, Kord

    2015-01-01

    Highlights: • OpenMC is an open source Monte Carlo particle transport code. • Solid geometry and continuous-energy physics allow high-fidelity simulations. • Development has focused on high performance and modern I/O techniques. • OpenMC is capable of scaling up to hundreds of thousands of processors. • Other features include plotting, CMFD acceleration, and variance reduction. - Abstract: This paper gives an overview of OpenMC, an open source Monte Carlo particle transport code recently developed at the Massachusetts Institute of Technology. OpenMC uses continuous-energy cross sections and a constructive solid geometry representation, enabling high-fidelity modeling of nuclear reactors and other systems. Modern, portable input/output file formats are used in OpenMC: XML for input, and HDF5 for output. High performance parallel algorithms in OpenMC have demonstrated near-linear scaling to over 100,000 processors on modern supercomputers. Other topics discussed in this paper include plotting, CMFD acceleration, variance reduction, eigenvalue calculations, and software development processes
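
    As a concrete illustration of the XML-driven workflow described above, OpenMC models are typically assembled through its Python API, which writes materials.xml, geometry.xml and settings.xml before launching the run. The bare U-235 sphere below is an illustrative toy model (and assumes a continuous-energy cross-section library is configured), not a benchmark from the paper.

        import openmc

        # Illustrative bare U-235 sphere (not a benchmarked model).
        fuel = openmc.Material(name="fuel")
        fuel.add_nuclide("U235", 1.0)
        fuel.set_density("g/cm3", 18.0)

        sphere = openmc.Sphere(r=9.0, boundary_type="vacuum")
        cell = openmc.Cell(fill=fuel, region=-sphere)

        settings = openmc.Settings()
        settings.run_mode = "eigenvalue"
        settings.batches, settings.inactive, settings.particles = 120, 20, 10_000

        openmc.Materials([fuel]).export_to_xml()   # materials.xml
        openmc.Geometry([cell]).export_to_xml()    # geometry.xml
        settings.export_to_xml()                   # settings.xml
        openmc.run()                               # k-effective printed to stdout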

  2. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z [Reading Hospital, West Reading, PA (United States); Gao, M [ProCure Treatment Centers, Warrenville, IL (United States)

    2014-06-01

    Purpose: Monte Carlo simulation plays an important role in the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm{sup 2}, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers.

  3. SU-E-T-314: The Application of Cloud Computing in Pencil Beam Scanning Proton Therapy Monte Carlo Simulation

    International Nuclear Information System (INIS)

    Wang, Z; Gao, M

    2014-01-01

    Purpose: Monte Carlo simulation plays an important role in the proton Pencil Beam Scanning (PBS) technique. However, MC simulation demands high computing power and is limited to a few large proton centers that can afford a computer cluster. We study the feasibility of utilizing cloud computing in the MC simulation of PBS beams. Methods: A GATE/GEANT4 based MC simulation software was installed on a commercial cloud computing virtual machine (Linux 64-bit, Amazon EC2). Single spot Integral Depth Dose (IDD) curves and in-air transverse profiles were used to tune the source parameters to simulate an IBA machine. With the use of the StarCluster software developed at MIT, a Linux cluster with 2–100 nodes can be conveniently launched in the cloud. A proton PBS plan was then exported to the cloud where the MC simulation was run. Results: The simulated PBS plan has a field size of 10×10 cm2, 20 cm range, 10 cm modulation, and contains over 10,000 beam spots. EC2 instance type m1.medium was selected considering the CPU/memory requirements, and 40 instances were used to form a Linux cluster. To minimize cost, the master node was created as an on-demand instance and the worker nodes as spot instances. The hourly cost for the 40-node cluster was $0.63 and the projected cost for a 100-node cluster was $1.41. Ten million events were simulated to plot the PDD and profile, with each job containing 500k events. The simulation completed within 1 hour and an overall statistical uncertainty of < 2% was achieved. Good agreement between MC simulation and measurement was observed. Conclusion: Cloud computing is a cost-effective and easy-to-maintain platform to run proton PBS MC simulation. When proton MC packages such as GATE and TOPAS are combined with cloud computing, it will greatly facilitate the pursuit of PBS MC studies, especially for newly established proton centers or individual researchers
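
    The job-splitting and cost figures quoted above reduce to simple arithmetic, repeated here with the paper's own numbers:

        total_events, events_per_job = 10_000_000, 500_000
        jobs = total_events // events_per_job       # 20 jobs spread over 40 nodes
        cost_40_nodes_per_h = 0.63                  # USD/h, mostly spot instances
        runtime_h = 1.0                             # simulation completed within 1 h
        print(jobs, f"${cost_40_nodes_per_h * runtime_h:.2f}")   # 20 jobs, ~$0.63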

  4. McGET: A rapid image-based method to determine the morphological characteristics of gravels on the Gobi desert surface

    Science.gov (United States)

    Mu, Yue; Wang, Feng; Zheng, Bangyou; Guo, Wei; Feng, Yiming

    2018-03-01

    The relationship between morphological characteristics (e.g. gravel size, coverage, angularity and orientation) and local geomorphic features (e.g. slope gradient and aspect) of a desert has been used to explore the evolution process of the Gobi desert. Conventional quantification methods are time-consuming and inefficient, and can even prove unable to determine the characteristics of large numbers of gravels. We propose a rapid image-based method to obtain the morphological characteristics of gravels on the Gobi desert surface, which is called the "morphological characteristics gained effectively technique" (McGET). First, the image of the Gobi desert surface was classified into gravel clusters and background by a machine-learning "classification and regression tree" (CART) algorithm. Then gravel clusters were segmented into individual gravel clasts by separating objects in images using a "watershed segmentation" algorithm. Thirdly, gravel coverage, diameter, aspect ratio and orientation were calculated based on the basic principles of 2D computer graphics. We validated this method with two independent datasets in which the gravel morphological characteristics were obtained from 2728 gravels measured in the field and 7422 gravels measured by manual digitization. Finally, we applied McGET to derive the spatial variation of gravel morphology on the Gobi desert along an alluvial-proluvial fan located in Hami, Xinjiang, China. The validation results show that the mean gravel diameter measured in the field agreed well with that calculated by McGET for large gravels (R2 = 0.89, P < 0.001). Compared to manual digitization, the McGET accuracies for gravel coverage, gravel diameter and aspect ratio were 97%, 83% and 96%, respectively. The orientation distributions calculated were consistent across the two methods. More importantly, McGET significantly shortens the time needed to obtain gravel morphological characteristics in the field and laboratory. The spatial variation results
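
    A minimal sketch of a McGET-like pipeline using scikit-image: a global threshold stands in for the CART classification step, watershed segmentation on the distance transform separates individual clasts, and region properties yield coverage, diameter, aspect ratio and orientation. All parameter values are assumptions, not those of McGET.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed
        from skimage.measure import regionprops

        def gravel_morphology(gray):
            """gray: 2-D grayscale image of the desert surface (float array)."""
            mask = gray > threshold_otsu(gray)             # stand-in for the CART step
            dist = ndi.distance_transform_edt(mask)
            peaks = peak_local_max(dist, labels=mask, footprint=np.ones((7, 7)))
            markers = np.zeros(gray.shape, dtype=int)
            markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
            labels = watershed(-dist, markers, mask=mask)  # one label per gravel clast
            rows = [(r.equivalent_diameter,                                   # diameter (px)
                     r.major_axis_length / max(r.minor_axis_length, 1e-6),    # aspect ratio
                     np.degrees(r.orientation))                               # orientation (deg)
                    for r in regionprops(labels)]
            coverage = mask.mean()                         # gravel coverage fraction
            return coverage, rows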

  5. TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma

    Energy Technology Data Exchange (ETDEWEB)

    Sisniega, A; Zbijewski, W; Stayman, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Yorkston, J [Carestream Health (United States); Aygun, N [Department of Radiology, Johns Hopkins University (United States); Koliatsos, V [Department of Neurology, Johns Hopkins University (United States); Siewerdsen, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Department of Radiology, Johns Hopkins University (United States)

    2014-06-15

    Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 min run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain

  6. TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma

    International Nuclear Information System (INIS)

    Sisniega, A; Zbijewski, W; Stayman, J; Yorkston, J; Aygun, N; Koliatsos, V; Siewerdsen, J

    2014-01-01

    Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 min run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
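
    The combination of in-plane kernel smoothing and 4-fold angular downsampling described above can be sketched as follows: smooth each simulated scatter projection, then interpolate across gantry angle to recover the skipped projections. The smoothing width and linear interpolation are assumed choices, not the authors' optimized KS.

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from scipy.interpolate import interp1d

        def scatter_for_all_angles(scatter_sub, angles_sub, angles_all, sigma_px=8.0):
            """scatter_sub: MC scatter estimates at every 4th gantry angle.
            Returns smoothed scatter estimates at all projection angles."""
            smoothed = np.stack([gaussian_filter(p, sigma_px) for p in scatter_sub])
            interp = interp1d(angles_sub, smoothed, axis=0, kind="linear",
                              fill_value="extrapolate")
            return interp(angles_all)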

  7. Molecular dynamics simulation based on the multi-component molecular orbital method: Application to H5O2+, D5O2+, and T5O2+

    International Nuclear Information System (INIS)

    Ishimoto, Takayoshi; Koyama, Michihisa

    2012-01-01

    Graphical abstract: A molecular dynamics method based on the multi-component molecular orbital method was applied to basic hydrogen-bonding systems: H5O2+ and its isotopomers (D5O2+ and T5O2+). Highlights: ► A molecular dynamics method including the nuclear quantum effect was developed. ► The multi-component molecular orbital method was used for the ab initio MO calculation. ► The developed method was applied to a basic hydrogen-bonding system, H5O2+, and its isotopomers. ► O⋯O vibrational stretching is reflected in the distribution of the protonic wavefunctions. ► The H/D/T isotope effect was also analyzed. - Abstract: We propose a molecular dynamics (MD) method based on the multi-component molecular orbital (MC_MO) method, which directly takes into account the quantum effect of the proton, for detailed analyses of proton transfer in hydrogen-bonding systems. The MC_MO-based MD (MC_MO-MD) method is applied to the basic structures H5O2+ (the “Zundel ion”) and its isotopomers (D5O2+ and T5O2+). We clearly demonstrate the geometrical difference in the hydrogen-bonded O⋯O distance induced by the H/D/T isotope effect: the O⋯O distance in the H-compound is longer than that in the D- or T-compound. We also find a strong relation between the O⋯O stretching vibration and the distribution of the hydrogen-bonded protonic wavefunction, because the protonic wavefunction tends to delocalize when the O⋯O distance becomes short during the dynamics. Our proposed MC_MO-MD simulation is expected to be a powerful tool for analyzing proton dynamics in hydrogen-bonding systems.

  8. Development of a polarized neutron beam line at Algerian research reactors using McStas software

    Energy Technology Data Exchange (ETDEWEB)

    Makhloufi, M., E-mail: makhloufi_8m@yahoo.fr [Centre de Recherche Nucléaire de Birine (Algeria); Salah, H. [Centre de Recherche Nucléaire d' Alger (Algeria)

    2017-02-01

    Unpolarized instrumentation has long been studied and designed using the McStas simulation tool, but only recently have new models been developed for McStas to simulate polarized neutron scattering instruments. In the present contribution, we used the McStas software to design a polarized neutron beam line, taking advantage of the spectrometers (a reflectometer and a diffractometer) available in Algeria. Both thermal and cold neutrons were considered. The polarization was achieved with two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance and comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrades of the existing spectrometers, are underway. - Highlights: • Permits evaluating the feasibility of a polarized neutron scattering instrument prior to its implementation. • Helps to understand the origin of instrumental imperfections and offers an optimized setup configuration. • Provides the possibility of using the FeSi and CoCu supermirrors, designed to polarize spin-up cold neutrons, to polarize thermal neutrons.

  9. Development of a polarized neutron beam line at Algerian research reactors using McStas software

    International Nuclear Information System (INIS)

    Makhloufi, M.; Salah, H.

    2017-01-01

    Unpolarized instrumentation has long been studied and designed using the McStas simulation tool, but only recently have new models been developed for McStas to simulate polarized neutron scattering instruments. In the present contribution, we used the McStas software to design a polarized neutron beam line, taking advantage of the spectrometers (a reflectometer and a diffractometer) available in Algeria. Both thermal and cold neutrons were considered. The polarization was achieved with two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance and comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrades of the existing spectrometers, are underway. - Highlights: • Permits evaluating the feasibility of a polarized neutron scattering instrument prior to its implementation. • Helps to understand the origin of instrumental imperfections and offers an optimized setup configuration. • Provides the possibility of using the FeSi and CoCu supermirrors, designed to polarize spin-up cold neutrons, to polarize thermal neutrons.

  10. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.

    Science.gov (United States)

    Souris, Kevin; Lee, John Aldo; Sterpin, Edmond

    2016-04-01

    Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suitable to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/GEANT4 Monte Carlo application for homogeneous and heterogeneous geometries. Comparisons with GATE/GEANT4 for various geometries show deviations within 2%-1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus be used for in vivo range verification as well.

  11. α-Skew π-McCoy Rings

    Directory of Open Access Journals (Sweden)

    Areej M. Abduldaim

    2013-01-01

    Full Text Available As a generalization of α-skew McCoy rings, we introduce the concept of α-skew π-McCoy rings, and we study the relationships with another two new generalizations, α-skew π1-McCoy rings and α-skew π2-McCoy rings, observing the relations with α-skew McCoy rings, π-McCoy rings, α-skew Armendariz rings, π-regular rings, and other kinds of rings. Also, we investigate conditions under which α-skew π1-McCoy rings imply α-skew π-McCoy rings and α-skew π2-McCoy rings. We show that in the case where R is a nonreduced ring, if R is 2-primal, then R is an α-skew π-McCoy ring. Finally, let R be a weak (α,δ)-compatible ring; if R is an α-skew π1-McCoy ring, then R is α-skew π2-McCoy.

  12. Multimedia transmission in MC-CDMA using adaptive subcarrier power allocation and CFO compensation

    Science.gov (United States)

    Chitra, S.; Kumaratharan, N.

    2018-02-01

    The multicarrier code division multiple access (MC-CDMA) system is one of the most effective techniques in fourth-generation (4G) wireless technology, owing to its high data rate, high spectral efficiency and resistance to multipath fading. However, MC-CDMA systems are severely degraded by carrier frequency offset (CFO), which arises from Doppler shift and oscillator instabilities. CFO destroys the orthogonality among the subcarriers and causes intercarrier interference (ICI). The water-filling algorithm (WFA) is an efficient resource allocation algorithm for distributing power among subcarriers in time-dispersive channels, but the conventional WFA fails to consider the effect of CFO. To perform subcarrier power allocation with reduced CFO and to improve the capacity of the MC-CDMA system, a residual-CFO-compensated adaptive subcarrier power allocation algorithm is proposed in this paper. The proposed technique allocates power only to subcarriers with a high channel-to-noise power ratio. The performance of the proposed method is evaluated using random binary data and an image as source inputs. Simulation results show that the bit error rate performance and ICI reduction capability of the proposed modified WFA are superior in both power allocation and image compression for high-quality multimedia transmission in the presence of CFO and under imperfect channel state information conditions.
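
    The classical water-filling rule underlying the adaptive allocation above assigns p_i = max(0, mu - 1/g_i), where g_i is the channel-to-noise power ratio of subcarrier i and the water level mu is set by the total power budget, so weak subcarriers receive nothing. A minimal sketch of the standard algorithm, without the paper's residual CFO compensation:

        import numpy as np

        def water_filling(g, p_total):
            """g: per-subcarrier channel-to-noise power ratios; returns powers."""
            inv = np.sort(1.0 / np.asarray(g, dtype=float))
            for k in range(len(inv), 0, -1):          # try the k strongest subcarriers
                mu = (p_total + inv[:k].sum()) / k    # candidate water level
                if mu > inv[k - 1]:                   # level must cover all k channels
                    break
            return np.maximum(0.0, mu - 1.0 / np.asarray(g, dtype=float))

        p = water_filling([4.0, 1.0, 0.1], p_total=2.0)   # weakest subcarrier gets 0
        print(p, p.sum())                                  # powers sum to the budget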

  13. ATLAS tunes of PYTHIA 6 and Pythia 8 for MC11

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

    We present the latest developments of the ATLAS MC generator tuning project for the Pythia family of event generators, including the C++ Pythia 8 code for the first time. The PYTHIA 6 tunes presented here, titled AMBT2B and AUET2B and constructed for a variety of PDFs, constitute alternatives to the AMBT2/AUET2 tunes previously presented as a candidate for MC11 event simulation. They systematically differ from the AMBT2/AUET2 PYTHIA 6 tunes in the treatment of alpha_S, to address concerns with those tunes. Systematic tune variations are also presented. The Pythia 8 tunes have been constructed for two different PDFs, and are aimed at an optimal description of minimum bias, for use in pile-up simulation. PDF-sensitive effects are observed and discussed in the MPI tunings of both generators.

  14. An Interview with Joe McMann: His Life Lessons

    Science.gov (United States)

    McMann, Joe

    2011-01-01

    Pica Kahn conducted "An Interview with Joe McMann: His Life Lessons" on May 23, 2011. With over 40 years of experience in the aerospace industry, McMann has gained a wealth of knowledge. Many have been interested in his biography, progression of work at NASA, impact on the U.S. spacesuit, and career accomplishments. This interview highlighted the influences and decision-making methods that impacted his technical and management contributions to the space program. McMann shared information about the accomplishments and technical advances that committed individuals can make.

  15. MC 68020 μp architecture

    International Nuclear Information System (INIS)

    Casals, O.; Dejuan, E.; Labarta, J.

    1988-01-01

    The MC68020 is a 32-bit microprocessor, object-code compatible with the earlier MC68000 and MC68010. In this paper we describe its architecture and two coprocessors: the MC68851 paged memory management unit and the MC68882 floating point coprocessor. Among its most important characteristics are: addressing mode extensions for enhanced support of high-level languages, an on-chip instruction cache, and full support of virtual memory. (Author)

  16. PRELIMINARY COUPLING OF THE MONTE CARLO CODE OPENMC AND THE MULTIPHYSICS OBJECT-ORIENTED SIMULATION ENVIRONMENT (MOOSE) FOR ANALYZING DOPPLER FEEDBACK IN MONTE CARLO SIMULATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Ellis; Derek Gaston; Benoit Forget; Kord Smith

    2011-07-01

    In recent years the use of Monte Carlo methods for modeling reactors has become feasible due to the increasing availability of massively parallel computer systems. One of the primary challenges yet to be fully resolved, however, is the efficient and accurate inclusion of multiphysics feedback in Monte Carlo simulations. The research in this paper presents a preliminary coupling of the open source Monte Carlo code OpenMC with the open source Multiphysics Object-Oriented Simulation Environment (MOOSE). The coupling of OpenMC and MOOSE will be used to investigate efficient and accurate numerical methods needed to include multiphysics feedback in Monte Carlo codes. An investigation into the sensitivity of Doppler feedback to fuel temperature approximations using a two-dimensional 17×17 PWR fuel assembly is presented in this paper. The results show a functioning multiphysics coupling between OpenMC and MOOSE. The coupling utilizes Functional Expansion Tallies to accurately and efficiently transfer pin power distributions tallied in OpenMC to the unstructured finite element meshes used in MOOSE. The two-dimensional PWR fuel assembly case also demonstrates that for a simplified model the pin-by-pin Doppler feedback can be adequately replicated by scaling a representative pin based on pin relative powers.
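
    Functional Expansion Tallies score the coefficients a_n of an orthogonal (here Legendre) expansion of a distribution such as the axial pin power, and the continuous shape handed to the finite element mesh is rebuilt as p(z) = sum_n a_n (2n+1)/2 P_n(z) on z in [-1, 1]. A minimal reconstruction sketch with made-up coefficients:

        import numpy as np
        from numpy.polynomial import legendre

        def fet_reconstruct(a, z):
            """a[n]: tallied Legendre coefficients; z: points in [-1, 1]."""
            scaled = [an * (2 * n + 1) / 2.0 for n, an in enumerate(a)]
            return legendre.legval(z, scaled)

        z = np.linspace(-1.0, 1.0, 5)
        print(fet_reconstruct([2.0, 0.0, -0.4], z))   # made-up flat-ish axial shape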

  17. A simple method to predict regional fish abundance: an example in the McKenzie River Basin, Oregon

    Science.gov (United States)

    D.J. McGarvey; J.M. Johnston

    2011-01-01

    Regional assessments of fisheries resources are increasingly called for, but tools with which to perform them are limited. We present a simple method that can be used to estimate regional carrying capacity and apply it to the McKenzie River Basin, Oregon. First, we use a macroecological model to predict trout densities within small, medium, and large streams in the...

  18. Is McMurray's osteotomy obsolete?

    Directory of Open Access Journals (Sweden)

    Phaltankar P

    1995-10-01

    Full Text Available A review of the method of performing McMurray's displacement osteotomy, and of its advantages and disadvantages in the treatment of nonunion of transcervical fracture of the neck of the femur with a viable femoral head, was carried out in this study of ten cases, in view of the abandonment of the procedure in favour of the angulation osteotomy. The good results obtained in the series attest to the usefulness of McMurray's osteotomy in the difficult problem of nonunion of transcervical fracture of the neck of the femur in well-selected cases, with certain advantages over the angulation osteotomy due to the 'Armchair effect'.

  19. Methods for simulating turbulent phase screen

    International Nuclear Information System (INIS)

    Zhang Jianzhu; Zhang Feizhou; Wu Yi

    2012-01-01

    Some methods for simulating turbulent phase screens are summarized, and their characteristics are analyzed by calculating the phase structure function, decomposing phase screens into Zernike polynomials, and simulating laser propagation in the atmosphere. The analysis shows that phase screens simulated by the FFT method contain the turbulent high-frequency components well but contain little of the low-frequency components. The low-frequency components are well contained by screens simulated by the Zernike method, but the high-frequency components are not sufficiently represented. The high-frequency content can be improved by increasing the order of the Zernike polynomial, but it mainly lies in the edge area. Compared with these two methods, the fractal method is a better way to simulate turbulent phase screens. Judged by the radius of the focal spot and the variance of the focal spot jitter, all the methods except the fractal method show limitations. Combining the FFT and Zernike methods, or combining the FFT method with self-similar theory, is an effective and appropriate way to simulate turbulent phase screens. In general, the fractal method is probably the best choice. (authors)
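
    The FFT method discussed above filters white Gaussian noise with the square root of the Kolmogorov phase power spectrum, Phi(f) = 0.023 r0^(-5/3) f^(-11/3); as the abstract notes, spatial frequencies below the grid resolution 1/(N*dx) are missing, which is the method's known low-frequency deficit (subharmonic or Zernike compensation is the usual remedy). A minimal sketch with assumed grid parameters:

        import numpy as np

        def fft_phase_screen(n=256, dx=0.01, r0=0.1, seed=0):
            """Kolmogorov phase screen (rad) on an n x n grid of pitch dx (m)."""
            rng = np.random.default_rng(seed)
            fx = np.fft.fftfreq(n, d=dx)
            f = np.hypot(*np.meshgrid(fx, fx))
            f[0, 0] = np.inf                       # suppress the unconstrained piston
            psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
            noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
            df = 1.0 / (n * dx)
            # Content below df is absent -- the FFT method's low-frequency flaw.
            return np.real(np.fft.ifft2(noise * np.sqrt(psd)) * n * n * df)

        screen = fft_phase_screen()
        print(screen.std())                        # phase standard deviation (rad)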

  20. Analysis of the accuracy and precision of the McMaster method in detection of the eggs of Toxocara and Trichuris species (Nematoda) in dog faeces.

    Science.gov (United States)

    Kochanowski, Maciej; Dabrowska, Joanna; Karamon, Jacek; Cencek, Tomasz; Osiński, Zbigniew

    2013-07-01

    The aim of this study was to determine the accuracy and precision of the McMaster method with Raynaud's modification in the detection of the eggs of the nematodes Toxocara canis (Werner, 1782) and Trichuris ovis (Abildgaard, 1795) in faeces of dogs. Four variants of the McMaster method were used for counting: in one grid, two grids, the whole McMaster chamber and flotation in the tube. One hundred sixty samples were prepared from dog faeces (20 repetitions for each egg quantity) containing 15, 25, 50, 100, 150, 200, 250 and 300 eggs of T. canis and T. ovis in 1 g of faeces. To compare the influence of the kind of faeces on the results, samples of dog faeces were enriched at the same levels with the eggs of another nematode, Ascaris suum Goeze, 1782. In addition, 160 samples of pig faeces were prepared and enriched only with A. suum eggs in the same way. The highest limit of detection (the lowest level of eggs that were detected in at least 50% of repetitions) in all McMaster chamber variants was obtained for T. canis eggs (25-250 eggs/g faeces). In the variant with flotation in the tube, the highest limit of detection was obtained for T. ovis eggs (100 eggs/g). The best results for the limit of detection and sensitivity, and the lowest coefficients of variation, were obtained with the use of the whole McMaster chamber variant. There was no significant impact of the properties of the faeces on the obtained results. Multiplication factors for the whole chamber were calculated on the basis of the transformed equation of the regression line illustrating the relationship between the number of detected eggs and the number of eggs added to the sample. The multiplication factors calculated for T. canis and T. ovis eggs were higher than those expected using the McMaster method with Raynaud's modification.
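
    The multiplication-factor calculation described above can be reproduced schematically: regress the detected counts on the known numbers of added eggs and invert the slope, so the factor corrects for the sub-100% recovery of the chamber. The detected counts below are invented for illustration.

        import numpy as np

        added    = np.array([15, 25, 50, 100, 150, 200, 250, 300], dtype=float)
        detected = np.array([ 9, 16, 33,  64, 100, 131, 166, 197], dtype=float)  # made up

        slope, intercept = np.polyfit(added, detected, 1)
        factor = 1.0 / slope                     # empirical multiplication factor
        print(f"recovery {slope:.2f}, multiplication factor {factor:.2f}")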

  1. Inhibition of serotonin transport by (+)McN5652 is noncompetitive

    Energy Technology Data Exchange (ETDEWEB)

    Hummerich, Rene [Biochemical Laboratory, Central Institute of Mental Health, 68159 Mannheim (Germany); Schulze, Oliver [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Raedler, Thomas [Department of Psychiatry and Psychotherapy, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Mikecz, Pal [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Reimold, Matthias [Department of Nuclear Medicine, University Hospital Tuebingen, D-72076 Tuebingen (Germany); Brenner, Winfried [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Clausen, Malte [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany); Schloss, Patrick [Biochemical Laboratory, Central Institute of Mental Health, 68159 Mannheim (Germany); Buchert, Ralph [Department of Nuclear Medicine, University Medical Center Hamburg-Eppendorf, D-20246 Hamburg (Germany)]. E-mail: buchert@uke.uni-hamburg.de

    2006-04-15

    Introduction: Imaging of the serotonergic innervation of the brain using positron emission tomography (PET) with the serotonin transporter (SERT) ligand [{sup 11}C] (+)McN5652 might be affected by serotonin in the synaptic cleft if there is relevant interaction between [{sup 11}C] (+)McN5652 and serotonin at the SERT. The aim of the present study therefore was to pharmacologically characterize the interaction of [{sup 11}C] (+)McN5652 and serotonin at the SERT. Methods: In vitro saturation analyses of [{sup 3}H]serotonin uptake into HEK293 cells stably expressing the human SERT were performed in the absence and presence of unlabelled (+)McN5652. Data were evaluated assuming Michaelis-Menten kinetics. Results: Unlabelled (+)McN5652 significantly reduced the maximal rate of serotonin transport V {sub max} of SERT without affecting the Michaelis-Menten constant K {sub M}. Conclusions: This finding indicates that (+)McN5652 inhibits serotonin transport through the SERT in a noncompetitive manner. This might suggest that [{sup 11}C] (+)McN5652 PET is not significantly affected by endogenous serotonin.

  2. Inhibition of serotonin transport by (+)McN5652 is noncompetitive

    International Nuclear Information System (INIS)

    Hummerich, Rene; Schulze, Oliver; Raedler, Thomas; Mikecz, Pal; Reimold, Matthias; Brenner, Winfried; Clausen, Malte; Schloss, Patrick; Buchert, Ralph

    2006-01-01

    Introduction: Imaging of the serotonergic innervation of the brain using positron emission tomography (PET) with the serotonin transporter (SERT) ligand [11C] (+)McN5652 might be affected by serotonin in the synaptic cleft if there is relevant interaction between [11C] (+)McN5652 and serotonin at the SERT. The aim of the present study therefore was to pharmacologically characterize the interaction of [11C] (+)McN5652 and serotonin at the SERT. Methods: In vitro saturation analyses of [3H]serotonin uptake into HEK293 cells stably expressing the human SERT were performed in the absence and presence of unlabelled (+)McN5652. Data were evaluated assuming Michaelis-Menten kinetics. Results: Unlabelled (+)McN5652 significantly reduced the maximal rate of serotonin transport Vmax of SERT without affecting the Michaelis-Menten constant KM. Conclusions: This finding indicates that (+)McN5652 inhibits serotonin transport through the SERT in a noncompetitive manner. This might suggest that [11C] (+)McN5652 PET is not significantly affected by endogenous serotonin
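
    The kinetic signature reported above, a lower Vmax at unchanged KM, is the textbook form of pure noncompetitive inhibition, v = [Vmax / (1 + [I]/Ki)] * [S] / (KM + [S]). A small numerical sketch; all constants are assumed values, not fitted (+)McN5652 parameters.

        def velocity(S, I, Vmax=1.0, Km=0.5, Ki=0.2):
            """Michaelis-Menten rate with a pure noncompetitive inhibitor
            (assumed constants, for illustration only)."""
            return (Vmax / (1.0 + I / Ki)) * S / (Km + S)

        for inhibitor in (0.0, 0.2, 0.4):
            # At saturating substrate, the apparent Vmax falls with inhibitor level:
            print(inhibitor, round(velocity(S=50.0, I=inhibitor), 3))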

  3. Hybrid simulation of scatter intensity in industrial cone-beam computed tomography

    International Nuclear Information System (INIS)

    Thierry, R.; Miceli, A.; Hofmann, J.; Flisch, A.; Sennhauser, U.

    2009-01-01

    A cone-beam computed tomography (CT) system using a 450 kV X-ray tube has been developed to address the three-dimensional imaging of automotive industry parts in short acquisition times. Because the probability of detecting scattered photons is high for this energy range and detection area, scatter correction becomes mandatory for generating reliable images with enhanced contrast detectability. In this paper, we present a hybrid simulator for the fast and accurate calculation of the scattering intensity distribution. The full acquisition chain, from the generation of a polyenergetic photon beam, through its interaction with the scanned object, to the energy deposit in the detector, is simulated. Object phantoms can be spatially described in the form of voxels, mathematical primitives or CAD models. Uncollided radiation is treated with a ray-tracing method and scattered radiation is split into single and multiple scattering. The single scattering is calculated with a deterministic approach accelerated with a forced detection method. The residual noisy signal is subsequently deconvolved with the iterative Richardson-Lucy method. Finally, the multiple scattering is addressed with a coarse Monte Carlo (MC) simulation. The proposed hybrid method has been validated on aluminium phantoms of varying size and object-to-detector distance, and found to be in good agreement with the MC code Geant4. The acceleration achieved by the hybrid method over the standard MC on a single projection is approximately three orders of magnitude.
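
    The Richardson-Lucy step used above to deconvolve the noisy single-scatter estimate iterates est <- est * (mirror(K) conv (d / (K conv est))), where K is the blur kernel. A minimal self-contained sketch; the kernel and iteration count are assumptions, not the paper's settings.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(data, psf, n_iter=30):
            """Iterative deconvolution of `data` blurred by `psf` (2-D arrays)."""
            est = np.full_like(data, data.mean())
            psf_mirror = psf[::-1, ::-1]
            for _ in range(n_iter):
                blurred = fftconvolve(est, psf, mode="same")
                ratio = data / np.maximum(blurred, 1e-12)   # avoid divide-by-zero
                est = est * fftconvolve(ratio, psf_mirror, mode="same")
            return est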

  4. MCMEG: Simulations of both PDD and TPR for 6 MV LINAC photon beam using different MC codes

    International Nuclear Information System (INIS)

    Fonseca, T.C.F.; Mendes, B.M.; Lacerda, M.A.S.; Silva, L.A.C.; Paixão, L.

    2017-01-01

    The Monte Carlo Modelling Expert Group (MCMEG) is an expert network specializing in Monte Carlo radiation transport and in modelling and simulation applied to radiation protection and dosimetry research. For the first inter-comparison task, the group launched an exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes available within their laboratories and to validate the simulated results by comparing them with experimental measurements carried out at the National Cancer Institute (INCA) in Rio de Janeiro, Brazil. The experimental measurements were performed using an ionization chamber with calibration traceable to a Secondary Standard Dosimetry Laboratory (SSDL). The detector was immersed in a water phantom at different depths and was irradiated with a radiation field size of 10×10 cm2. This exposure setup was used to determine the dosimetric parameters Percentage Depth Dose (PDD) and Tissue Phantom Ratio (TPR). The validation process compares the MC-calculated results to the experimentally measured PDD20,10 and TPR20,10. Simulations were performed reproducing the experimental TPR20,10 quality index, which provides a satisfactory description of both the PDD curve and the transverse profiles at the two depths measured. This paper reports in detail the modelling process using the MCNPX, MCNP6, EGSnrc and Penelope Monte Carlo codes, the source and tally descriptions, the validation processes and the results. - Highlights: • MCMEG is an expert network specializing in Monte Carlo radiation transport. • The MCNPX, MCNP6, EGSnrc and Penelope Monte Carlo codes are used. • Exercise to model and simulate a 6 MV LINAC photon beam using the Monte Carlo codes. • The PDD20,10 and TPR20,10 dosimetric parameters were compared with measured data. • The paper reports the modelling process using different Monte Carlo codes.
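
    For reference, the two beam-quality indices compared above are linked by the empirical relation given in IAEA TRS-398, TPR20,10 ~= 1.2661 * PDD20,10 - 0.0595, where PDD20,10 is the ratio of percentage depth doses at 20 cm and 10 cm depth. A one-line check with an assumed 6 MV value:

        pdd_20_10 = 0.58                          # assumed measured ratio, 6 MV beam
        tpr_20_10 = 1.2661 * pdd_20_10 - 0.0595   # IAEA TRS-398 empirical relation
        print(round(tpr_20_10, 3))                # ~0.675, a typical 6 MV quality index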

  5. Simulation of a complete inelastic neutron scattering experiment

    DEFF Research Database (Denmark)

    Edwards, H.; Lefmann, K.; Lake, B.

    2002-01-01

    A simulation of an inelastic neutron scattering experiment on the high-temperature superconductor La2-xSrxCuO4 is presented. The complete experiment, including sample, is simulated using an interface between the experiment control program and the simulation software package (McStas) and is compared with the experimental data. Simulating the entire experiment is an attractive alternative to the usual method of convoluting the model cross section with the resolution function, especially if the resolution function is nontrivial.

  6. Simulation of dose deposition in stereotactic synchrotron radiation therapy: a fast approach combining Monte Carlo and deterministic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Smekens, F; Freud, N; Letang, J M; Babot, D [CNDRI (Nondestructive Testing using Ionizing Radiations) Laboratory, INSA-Lyon, 69621 Villeurbanne Cedex (France); Adam, J-F; Elleaume, H; Esteve, F [INSERM U-836, Equipe 6 'Rayonnement Synchrotron et Recherche Medicale', Institut des Neurosciences de Grenoble (France); Ferrero, C; Bravin, A [European Synchrotron Radiation Facility, Grenoble (France)], E-mail: francois.smekens@insa-lyon.fr

    2009-08-07

    A hybrid approach, combining deterministic and Monte Carlo (MC) calculations, is proposed to compute the distribution of dose deposited during stereotactic synchrotron radiation therapy treatment. The proposed approach divides the computation into two parts: (i) the dose deposited by primary radiation (coming directly from the incident x-ray beam) is calculated in a deterministic way using ray casting techniques and energy-absorption coefficient tables and (ii) the dose deposited by secondary radiation (Rayleigh and Compton scattering, fluorescence) is computed using a hybrid algorithm combining MC and deterministic calculations. In the MC part, a small number of particle histories are simulated. Every time a scattering or fluorescence event takes place, a splitting mechanism is applied, so that multiple secondary photons are generated with a reduced weight. The secondary events are further processed in a deterministic way, using ray casting techniques. The whole simulation, carried out within the framework of the Monte Carlo code Geant4, is shown to converge towards the same results as the full MC simulation. The speed of convergence is found to depend notably on the splitting multiplicity, which can easily be optimized. To assess the performance of the proposed algorithm, we compare it to state-of-the-art MC simulations, accelerated by the track length estimator technique (TLE), considering a clinically realistic test case. It is found that the hybrid approach is significantly faster than the MC/TLE method. The gain in speed in a test case was about a factor of 25 at constant precision. Therefore, this method appears to be suitable for treatment planning applications.
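
    The splitting mechanism described above can be sketched as follows: at each scatter or fluorescence vertex, one MC history spawns several secondaries whose statistical weights sum to the parent's, and each secondary is then processed deterministically by ray casting. The isotropic direction sampler below is a crude stand-in for the physical scattering kernels.

        import numpy as np

        rng = np.random.default_rng(2)

        def isotropic_direction():
            """Crude stand-in for Rayleigh/Compton/fluorescence angular sampling."""
            v = rng.normal(size=3)
            return v / np.linalg.norm(v)

        def split(photon, multiplicity=8):
            """Spawn `multiplicity` secondaries with reduced weight at a vertex."""
            w = photon["weight"] / multiplicity
            return [{"pos": photon["pos"].copy(),
                     "dir": isotropic_direction(),
                     "weight": w} for _ in range(multiplicity)]

        parent = {"pos": np.zeros(3), "dir": np.array([0.0, 0.0, 1.0]), "weight": 1.0}
        secondaries = split(parent)
        print(sum(p["weight"] for p in secondaries))   # weights still sum to 1.0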

  7. Personal Background Interview of Jim McBarron

    Science.gov (United States)

    McBarron, Jim; Wright, Rebecca

    2012-01-01

    Jim McBarron exhibits a wealth of knowledge gathered from more than 40 years of experience with NASA, EVA, and spacesuits. His biography, progression of work at NASA, impact on EVA and the U.S. spacesuit, and career accomplishments are of interest to many. Wright, from the JSC History Office, conducted a personal background interview with McBarron. This interview highlighted the influences and decision-making methods that impacted McBarron's technical and management contributions to the space program. Attendees gained insight on the external and internal NASA influences on career progression within the EVA and spacesuit, and the type of accomplishments and technical advances that committed individuals can make. He concluded the presentation with a question and answer period that included a brief discussion about close calls and Russian spacesuits.

  8. Vector Monte Carlo simulations on atmospheric scattering of polarization qubits.

    Science.gov (United States)

    Li, Ming; Lu, Pengfei; Yu, Zhongyuan; Yan, Lei; Chen, Zhihui; Yang, Chuanghua; Luo, Xiao

    2013-03-01

    In this paper, a vector Monte Carlo (MC) method is proposed to study the influence of atmospheric scattering on polarization qubits for satellite-based quantum communication. The vector MC method utilizes a transmittance method to solve for the photon free path in an inhomogeneous atmosphere, and random number sampling to determine whether a scattering event is aerosol or molecular scattering. Simulations are performed for downlink and uplink. The degrees and rotations of polarization are obtained qualitatively and quantitatively, and they agree well with results measured in previous experiments. The results show that polarization qubits are well preserved in the downlink and uplink, while the number of received single photons is less than half of the total transmitted single photons for both links. Moreover, our vector MC method can be applied to the scattering of polarized light in other inhomogeneous random media.
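
    The transmittance method referred to above samples the photon free path in an inhomogeneous atmosphere by drawing an optical depth tau = -ln(xi) and marching along the ray until the accumulated extinction reaches tau. A minimal sketch with an assumed exponential extinction profile (not the paper's atmosphere model):

        import numpy as np

        rng = np.random.default_rng(3)

        def beta_ext(z):
            """Assumed extinction coefficient (1/m): exponential atmosphere."""
            return 1.2e-5 * np.exp(-z / 8000.0)

        def sample_free_path(z0, mu, ds=50.0, s_max=2.0e5):
            """Path length (m) from altitude z0 along direction cosine mu."""
            tau = -np.log(rng.random())          # target optical depth
            s, acc = 0.0, 0.0
            while acc < tau and s < s_max:       # s_max: photon escapes the medium
                acc += beta_ext(z0 + mu * s) * ds
                s += ds
            return s

        print(sample_free_path(z0=0.0, mu=1.0))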

  9. Optimal Spatial Subdivision method for improving geometry navigation performance in Monte Carlo particle transport simulation

    International Nuclear Information System (INIS)

    Chen, Zhenping; Song, Jing; Zheng, Huaqing; Wu, Bin; Hu, Liqin

    2015-01-01

    Highlights: • The subdivision combines the advantages of uniform and non-uniform schemes. • The grid models were shown to be more efficient than traditional CSG models. • Monte Carlo simulation performance was enhanced by Optimal Spatial Subdivision. • Efficiency gains were obtained for realistic whole reactor core models. - Abstract: Geometry navigation is one of the key factors dominating Monte Carlo particle transport simulation performance for large-scale whole-reactor models. In such cases, spatial subdivision is an easily implemented, high-potential method to improve run-time performance. In this study, a dedicated method, named Optimal Spatial Subdivision, is proposed for generating numerically optimal spatial grid models, which are demonstrated to be more efficient for geometry navigation than traditional Constructive Solid Geometry (CSG) models. The method uses a recursive subdivision algorithm to subdivide a CSG model into non-overlapping grids, which are labeled as totally or partially occupied, or not occupied at all, by CSG objects. The key point is that, at each stage of subdivision, a quality factor based on a cost estimation function is derived to evaluate candidate subdivision schemes. Only the scheme with the optimal quality factor is chosen as the final subdivision strategy for generating the grid model. Eventually, the model built with the optimal quality factor is efficient for Monte Carlo particle transport simulation. The method has been implemented and integrated into the Super Monte Carlo program SuperMC developed by FDS Team. Testing cases were used to highlight the performance gains that could be achieved. Results showed that Monte Carlo simulation runtime could be reduced significantly when using the new method, even as cases reached whole reactor core model sizes
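
    A 1-D toy version of the cost-driven recursion described above: candidate split planes are scored with a quality factor derived from an assumed cost estimate (traversal work grows with the number of CSG objects overlapping a cell), and only the best-scoring split is kept at each stage. This is a sketch of the general idea, not SuperMC's implementation.

        def cell_cost(objs):
            """Hypothetical cost estimate for navigating one grid cell."""
            return 1 + len(objs) ** 2

        def subdivide(objs, lo, hi, depth=0, max_depth=6):
            """objs: (min, max) spans of CSG objects; returns a nested grid tree."""
            inside = [o for o in objs if o[1] > lo and o[0] < hi]
            if depth == max_depth or len(inside) <= 1:
                return (lo, hi, inside)
            planes = sorted({x for o in inside for x in o if lo < x < hi})
            def cost(m):   # quality factor of splitting at m (lower is better)
                left = [o for o in inside if o[0] < m]
                right = [o for o in inside if o[1] > m]
                return cell_cost(left) + cell_cost(right)
            m = min(planes, key=cost, default=None)
            if m is None or cost(m) >= cell_cost(inside):
                return (lo, hi, inside)          # no split improves the estimate
            return (subdivide(inside, lo, m, depth + 1, max_depth),
                    subdivide(inside, m, hi, depth + 1, max_depth))

        print(subdivide([(0.0, 2.0), (3.0, 5.0), (8.0, 9.0)], 0.0, 10.0))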

  10. Improved algorithms and advanced features of the CAD to MC conversion tool McCad

    International Nuclear Information System (INIS)

    Lu, L.; Fischer, U.; Pereslavtsev, P.

    2014-01-01

    Highlights: •The latest improvements of the McCad conversion approach, including decomposition and void-filling algorithms, are presented. •An advanced interface for materials editing and assignment has been developed and added to the McCad GUI. •These improvements have been tested and successfully applied to DEMO and ITER NBI (Neutral Beam Injector) applications. •The performance of the CAD model conversion process is shown to be significantly improved. -- Abstract: McCad is a geometry conversion tool developed at KIT to enable the automatic bi-directional conversion of CAD models into the Monte Carlo (MC) geometries utilized for neutronics calculations (CAD to MC) and, in reverse (MC to CAD), for visualization purposes. The paper presents the latest improvements of the conversion algorithms, including improved decomposition, void filling, and an advanced interface for materials editing and assignment. The new implementations and features were tested on fusion neutronics applications to the DEMO and ITER NBI (Neutral Beam Injector) models. The results demonstrate greater stability and enhanced efficiency of the McCad conversion process.

  11. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
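    The quoted numbers are self-consistent; a quick check of the speed-up and the implied parallel efficiency:

      serial_h, cloud_min, nodes = 2.58, 3.3, 100
      speedup = serial_h * 60 / cloud_min        # (2.58 h = 154.8 min) / 3.3 min
      print(f"speed-up ~= {speedup:.0f}x")       # ~47x, matching the abstract
      print(f"parallel efficiency ~= {speedup / nodes:.0%}")  # ~47% on 100 nodes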

  12. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  13. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  14. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    Science.gov (United States)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context: Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, one that makes the simulation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to simpler, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
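    A sketch of the Green's-function idea: once the MC response g(E_out | E_in) to monoenergetic injections has been tabulated, a synthetic spectrum for any continuum follows from a single matrix-vector product. The response table below is a made-up Gaussian placeholder standing in for the published tables:

      import numpy as np

      E = np.linspace(10.0, 60.0, 200)                 # keV energy grid
      dE = E[1] - E[0]

      # Placeholder Green's function table: G[i, j] ~ g(E_out=E[i] | E_in=E[j]),
      # here a narrow Gaussian redistribution standing in for the MC responses.
      G = np.exp(-0.5 * ((E[:, None] - E[None, :]) / 1.5) ** 2)
      G /= G.sum(axis=0, keepdims=True) * dE           # normalize each column

      continuum = E ** -1.5 * np.exp(-E / 15.0)        # arbitrary input continuum
      spectrum = G @ continuum * dE                    # emergent synthetic spectrum
      print(spectrum[:5])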

  15. McKenzie River Subbasin Assessment, Summary Report 2000.

    Energy Technology Data Exchange (ETDEWEB)

    Alsea Geospatial, Inc.

    2000-02-01

    This document summarizes the findings of the McKenzie River Subbasin Assessment: Technical Report. The subbasin assessment tells a story about the McKenzie River watershed. What is the McKenzie's ecological history, how is the McKenzie doing today, and where is the McKenzie watershed headed ecologically? Knowledge is a good foundation for action. The more we know, the better prepared we are to make decisions about the future. These decisions involve both protecting good remaining habitat and repairing some of the parts that are broken in the McKenzie River watershed. The subbasin assessment is the foundation for conservation strategy and actions. It provides a detailed ecological assessment of the lower McKenzie River and floodplain, identifies conservation and restoration opportunities, and discusses the influence of some upstream actions and processes on the study area. The assessment identifies restoration opportunities at the reach level. In this study, a reach is a river segment from 0.7 to 2.7 miles long and is defined by changes in land forms, land use, stream junctions, and/or cultural features. The assessment also provides flexible tools for setting priorities and planning projects. The goal of this summary is to clearly and concisely extract the key issues, findings, and recommendations from the full-length Technical Report. The high priority recommended action items highlight areas that the McKenzie Watershed Council can significantly influence, and that will likely yield the greatest ecological benefit. People are encouraged to read the full Technical Report if they are interested in the detailed methods, findings, and references used in this study.

  16. Multi-Scale Coupling Between Monte Carlo Molecular Simulation and Darcy-Scale Flow in Porous Media

    KAUST Repository

    Saad, Ahmed Mohamed; Kadoura, Ahmad Salim; Sun, Shuyu

    2016-01-01

    In this work, an efficient coupling between Monte Carlo (MC) molecular simulation and Darcy-scale flow in porous media is presented. The cell-centered finite difference method with a non-uniform rectangular mesh was used to discretize the simulation

  17. PETSTEP: Generation of synthetic PET lesions for fast evaluation of segmentation methods

    Science.gov (United States)

    Berthon, Beatrice; Häggström, Ida; Apte, Aditya; Beattie, Bradley J.; Kirov, Assen S.; Humm, John L.; Marshall, Christopher; Spezi, Emiliano; Larsson, Anne; Schmidtlein, C. Ross

    2016-01-01

    Purpose: This work describes PETSTEP (PET Simulator of Tracers via Emission Projection): a faster and more accessible alternative to Monte Carlo (MC) simulation generating realistic PET images, for studies assessing image features and segmentation techniques. Methods: PETSTEP was implemented within Matlab as open source software. It allows generating three-dimensional PET images from PET/CT data or synthetic CT and PET maps, with user-drawn lesions and user-set acquisition and reconstruction parameters. PETSTEP was used to reproduce images of the NEMA body phantom acquired on a GE Discovery 690 PET/CT scanner, and simulated with MC for the GE Discovery LS scanner, and to generate realistic Head and Neck scans. Finally the sensitivity (S) and Positive Predictive Value (PPV) of three automatic segmentation methods were compared when applied to the scanner-acquired and PETSTEP-simulated NEMA images. Results: PETSTEP produced 3D phantom and clinical images within 4 and 6 min, respectively, on a single core 2.7 GHz computer. PETSTEP images of the NEMA phantom had mean intensities within 2% of the scanner-acquired image for both background and largest insert, and 16% larger background Full Width at Half Maximum. Similar results were obtained when comparing PETSTEP images to MC simulated data. The S and PPV obtained with simulated phantom images were statistically significantly lower than for the original images, but led to the same conclusions with respect to the evaluated segmentation methods. Conclusions: PETSTEP allows fast simulation of synthetic images reproducing scanner-acquired PET data and shows great promise for the evaluation of PET segmentation methods. PMID:26321409

  18. Monte Carlo simulation and gaussian broaden techniques for full energy peak of characteristic X-ray in EDXRF

    International Nuclear Information System (INIS)

    Li Zhe; Liu Min; Shi Rui; Wu Xuemei; Tuo Xianguo

    2012-01-01

    Background: Non-standard analysis (NSA) is one of the most important development directions of energy dispersive X-ray fluorescence (EDXRF). Purpose: This NSA technique is mainly based on Monte Carlo (MC) simulation and full-energy-peak broadening, which were studied preliminarily in this paper. Methods: An MC model was established for a Si-PIN based EDXRF setup, and flux spectra were obtained for an iron ore sample. The flux spectra were then broadened using Gaussian broadening parameters calculated by a new method proposed in this paper, and the broadened spectra were compared with measured energy spectra. Results: The MC method can be used to simulate EDXRF measurements and automatically corrects the matrix effects among elements. Peak intensities can be obtained accurately by using the proposed Gaussian broadening technique. Conclusions: This study provides a key technique for EDXRF to achieve advanced NSA technology. (authors)
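    A minimal sketch of full-energy-peak Gaussian broadening with an energy-dependent resolution model; the FWHM coefficients a and b below are placeholders, where detector-specific values would be fitted as in the paper's method:

      import numpy as np

      def broaden(E, flux, a=0.05, b=0.01):
          """Convolve a simulated flux spectrum with an energy-dependent
          Gaussian: FWHM(E) = a + b*sqrt(E) (illustrative model, keV)."""
          out = np.zeros_like(flux)
          for Ej, fj in zip(E, flux):
              sigma = (a + b * np.sqrt(Ej)) / 2.355        # FWHM -> sigma
              kernel = np.exp(-0.5 * ((E - Ej) / sigma) ** 2)
              out += fj * kernel / kernel.sum()            # redistribute counts
          return out

      E = np.linspace(1.0, 10.0, 500)                      # keV grid
      flux = np.zeros_like(E)
      flux[np.searchsorted(E, 6.40)] = 1.0                 # Fe K-alpha line
      spectrum = broaden(E, flux)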

  19. NMR diffusion simulation based on conditional random walk.

    Science.gov (United States)

    Gudbjartsson, H; Patz, S

    1995-01-01

    The authors introduce here a new, very fast simulation method for free diffusion in a linear magnetic field gradient, which is an extension of the conventional Monte Carlo (MC) method or the convolution method described by Wong et al. (in 12th SMRM, New York, 1993, p. 10). In earlier NMR-diffusion simulation methods, such as the finite difference (FD) method, the Monte Carlo method, and the deterministic convolution method, the outcome of the calculations depends on the simulation time step. In the authors' method, however, the results are independent of the time step, whereas in the convolution method the step size has to be large enough for spins to diffuse to adjacent grid points. By always selecting the largest possible time step, the computation time can therefore be reduced. Finally, the authors point out that in simple geometric configurations their simulation algorithm can be used to reduce computation time in the simulation of restricted diffusion.
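    The step-size independence for free diffusion can be illustrated with a small sketch: per step, the displacement and its time integral are drawn jointly from their exact Gaussian law, so the echo attenuation is the same whatever the step count. This is a generic illustration of the principle, not the authors' conditional-random-walk algorithm, and all parameter values are arbitrary:

      import numpy as np

      rng = np.random.default_rng(1)
      gamma, G, D, T = 2.675e8, 1e-2, 2.0e-9, 0.05   # rad/s/T, T/m, m^2/s, s

      def signal(n_steps, n_spins=20000):
          """Free diffusion in a constant gradient; returns |signal|."""
          dt = T / n_steps
          x = np.zeros(n_spins)
          phase = np.zeros(n_spins)
          for _ in range(n_steps):
              xi1 = rng.normal(size=n_spins)
              xi2 = rng.normal(size=n_spins)
              dx = np.sqrt(2 * D * dt) * xi1
              # Time integral of the in-step displacement, sampled from its
              # exact conditional Gaussian distribution given dx.
              integral = 0.5 * dt * dx + np.sqrt(2 * D * dt**3 / 12) * xi2
              phase += gamma * G * (x * dt + integral)
              x += dx
          return abs(np.mean(np.exp(1j * phase)))

      # Attenuation should match exp(-gamma^2 G^2 D T^3 / 3) ~ 0.55 for any n.
      for n in (1, 10, 100):
          print(n, round(signal(n), 3))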

  20. On the thermodynamics of the McMillan-Mayer state function

    DEFF Research Database (Denmark)

    Mollerup, Jørgen; Breil, Martin Peter

    2009-01-01

    The McMillan-Mayer framework is developed in a classical thermodynamic context, for which the relationship between the state function of the McMillan-Mayer framework and the Helmholtz state function is established. A Taylor expansion method can be applied to the osmotic pressure of a solution which is dilute

  1. Occurrence of the Microcystins MC-LW and MC-LF in Dutch Surface Waters and Their Contribution to Total Microcystin Toxicity

    Directory of Open Access Journals (Sweden)

    Elisabeth J. Faassen

    2013-07-01

    Full Text Available Microcystins (MCs) are the most frequently found cyanobacterial toxins in freshwater systems. Many MC variants have been identified, and variants differ in their toxicity. Recent studies showed that the variants MC-LW and MC-LF might be more toxic than MC-LR, the variant that is most abundant and mostly used for risk assessments. As little is known about the presence of these two variants in The Netherlands, we determined their occurrence by analyzing 88 water samples and 10 scum samples for eight MC variants ((dm-7)-MC-RR, MC-YR, (dm-7)-MC-LR, MC-LY, MC-LW and MC-LF) by liquid chromatography with tandem mass spectrometry detection. All analyzed MC variants were detected, and MC-LW and/or MC-LF were present in 32% of the MC-containing water samples. When MC-LW and MC-LF were present, they contributed nearly 10% of the total MC concentrations, but due to their suspected high toxicity, their average contribution to the total MC toxicity was estimated to be at least 45%. Given the frequent occurrence and possible high toxicity of MC-LW and MC-LF, it seems better to base health risk assessments on the toxicity contributions of different MC variants than on MC-LR concentrations alone.

  2. Simulations of NMR pulse sequences during equilibrium and non-equilibrium chemical exchange

    International Nuclear Information System (INIS)

    Helgstrand, Magnus; Haerd, Torleif; Allard, Peter

    2000-01-01

    The McConnell equations combine the differential equations for a simple two-state chemical exchange process with the Bloch differential equations for a classical description of the behavior of nuclear spins in a magnetic field. This equation system provides a useful starting point for the analysis of slow, intermediate and fast chemical exchange studied using a variety of NMR experiments. The McConnell equations are in the mathematical form of an inhomogeneous system of first-order differential equations. Here we rewrite the McConnell equations in a homogeneous form in order to facilitate fast and simple numerical calculation of the solution to the equation system. The McConnell equations can only treat equilibrium chemical exchange. We therefore also present a homogeneous equation system that can handle both equilibrium and non-equilibrium chemical processes correctly, as long as the kinetics is of first-order. Finally, the same method of rewriting the inhomogeneous form of the McConnell equations into a homogeneous form is applied to a quantum mechanical treatment of a spin system in chemical exchange. In order to illustrate the homogeneous McConnell equations, we have simulated pulse sequences useful for measuring exchange rates in slow, intermediate and fast chemical exchange processes. A stopped-flow NMR experiment was simulated using the equations for non-equilibrium chemical exchange. The quantum mechanical treatment was tested by the simulation of a sensitivity-enhanced 15N-HSQC with pulsed field gradients during slow chemical exchange and by the simulation of the transfer efficiency of a two-dimensional heteronuclear cross-polarization based experiment as a function of both chemical shift difference and exchange rate constants
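    The homogeneous rewriting described above is the standard trick of augmenting the state vector so that the constant inhomogeneous term is absorbed into an enlarged system matrix (generic notation; the paper's own symbols may differ):

      \frac{d\mathbf{M}}{dt} = \mathbf{A}\,\mathbf{M} + \mathbf{b}
      \quad\Longrightarrow\quad
      \frac{d}{dt}\begin{pmatrix}\mathbf{M}\\ 1\end{pmatrix}
      = \begin{pmatrix}\mathbf{A} & \mathbf{b}\\ \mathbf{0}^{\top} & 0\end{pmatrix}
      \begin{pmatrix}\mathbf{M}\\ 1\end{pmatrix},
      \qquad
      \begin{pmatrix}\mathbf{M}(t)\\ 1\end{pmatrix}
      = \exp\!\left[\,t\begin{pmatrix}\mathbf{A} & \mathbf{b}\\ \mathbf{0}^{\top} & 0\end{pmatrix}\right]
      \begin{pmatrix}\mathbf{M}(0)\\ 1\end{pmatrix}.

    A single matrix exponential then propagates the magnetization through each delay of a pulse sequence, which is what makes fast and simple numerical evaluation possible.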

  3. Metal ion-mediated agonism and agonist enhancement in melanocortin MC1 and MC4 receptors

    DEFF Research Database (Denmark)

    Holst, Birgitte; Elling, Christian E; Schwartz, Thue W

    2002-01-01

    An endogenous metal-ion site in the melanocortin MC1 and MC4 receptors was characterized mainly in transiently transfected COS-7 cells. ZnCl(2) alone stimulated signaling through the Gs pathway with a potency of 11 and 13 microM and an efficacy of 50 and 20% of that of alpha-melanocortin stimulating hormone (alpha-MSH) in the MC1 and MC4 receptors, respectively. In the presence of peptide agonist, Zn(II) acted as an enhancer on both receptors, because it shifted the dose-response curves to the left: most pronounced was a 6-fold increase in alpha-MSH potency on the MC1 receptor. The affinities and profiles were similar for a number of the 2,2'-bipyridine and 1,10-phenanthroline analogs in complex with Zn(II) in the MC1 and MC4 receptors. However, the potencies and efficacies of the metal-ion complexes were very different in the two receptors, and close to full agonism was obtained

  4. McGill wetland model: evaluation of a peatland carbon simulator developed for global assessments

    Directory of Open Access Journals (Sweden)

    F. St-Hilaire

    2010-11-01

    Full Text Available We developed the McGill Wetland Model (MWM) based on the general structure of the Peatland Carbon Simulator (PCARS) and the Canadian Terrestrial Ecosystem Model. Three major changes were made to PCARS: (1) the light use efficiency model of photosynthesis was replaced with a biogeochemical description of photosynthesis; (2) the description of autotrophic respiration was changed to be consistent with the formulation of photosynthesis; and (3) the cohort, multilayer soil respiration model was changed to a simple one-box peat decomposition model, divided into oxic and anoxic zones by an effective water table, plus a one-year residence time litter pool. MWM was then evaluated by comparing its output to the estimates of net ecosystem production (NEP), gross primary production (GPP) and ecosystem respiration (ER) from 8 years of continuous measurements at the Mer Bleue peatland, a raised ombrotrophic bog located in southern Ontario, Canada (index of agreement [dimensionless]: NEP = 0.80, GPP = 0.97, ER = 0.97; systematic RMSE [g C m−2 d−1]: NEP = 0.12, GPP = 0.07, ER = 0.14; unsystematic RMSE: NEP = 0.15, GPP = 0.27, ER = 0.23). Simulated moss NPP approximates what would be expected for a bog peatland, but shrub NPP appears to be underestimated. Sensitivity analysis revealed that the model output did not change greatly due to variations in water table because of offsetting responses in production and respiration, but that even a modest temperature increase could lead to converting the bog from a sink to a source of CO2. General weaknesses and further developments of MWM are discussed.

  5. McSustainability and McJustice: Certification, Alternative Food and Agriculture, and Social Change

    Directory of Open Access Journals (Sweden)

    Maki Hatanaka

    2014-11-01

    Full Text Available Alternative food and agriculture movements increasingly rely on market-based approaches, particularly voluntary standards and certification, to advance environmental sustainability and social justice. Using a case study of an ecological shrimp project in Indonesia that became certified organic, this paper raises concerns regarding the impacts of certification on alternative food and agriculture movements, and their aims of furthering sustainability and justice. Drawing on George Ritzer’s McDonaldization framework, I argue that the ecological shrimp project became McDonaldized with the introduction of voluntary standards and certification. Specifically, efficiency, calculability, predictability, and control became key characteristics of the shrimp project. While the introduction of such characteristics increased market access, it also entailed significant costs, including an erosion of trust and marginalization and alienation of farmers. Given such tradeoffs, in concluding I propose that certification is producing particular forms of environmental sustainability and social justice, what I term McSustainability and McJustice. While enabling the expansion of alternative food and agriculture, McSustainability and McJustice tend to allow little opportunity for farmer empowerment and food sovereignty, as well as exclude aspects of sustainable farming or ethical production that are not easily measured, standardized, and validated.

  6. TH-A-18C-04: Ultrafast Cone-Beam CT Scatter Correction with GPU-Based Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Y [UT Southwestern Medical Center, Dallas, TX (United States); Southern Medical University, Guangzhou (China); Bai, T [UT Southwestern Medical Center, Dallas, TX (United States); Xi' an Jiaotong University, Xi' an (China); Yan, H; Ouyang, L; Wang, J; Pompos, A; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Zhou, L [Southern Medical University, Guangzhou (China)

    2014-06-15

    Purpose: Scatter artifacts severely degrade image quality of cone-beam CT (CBCT). We present an ultrafast scatter correction framework using GPU-based Monte Carlo (MC) simulation and a prior patient CT image, aiming to finish the whole process, including both scatter correction and reconstruction, automatically within 30 seconds. Methods: The method consists of six steps: 1) FDK reconstruction using raw projection data; 2) rigid registration of the planning CT to the FDK results; 3) MC scatter calculation at sparse view angles using the planning CT; 4) interpolation of the calculated scatter signals to other angles; 5) removal of scatter from the raw projections; 6) FDK reconstruction using the scatter-corrected projections. In addition to using the GPU to accelerate MC photon simulations, we also use a small number of photons and a down-sampled CT image in the simulation to further reduce computation time. A novel denoising algorithm is used to eliminate MC scatter noise caused by low photon numbers. The method is validated on head-and-neck cases with simulated and clinical data. Results: We have studied the impacts of photon histories and volume down-sampling factors on the accuracy of scatter estimation. A Fourier analysis was conducted to show that scatter images calculated at 31 angles are sufficient to restore those at all angles with <0.1% error. For the simulated case with a resolution of 512×512×100, we simulated 10M photons per angle. The total computation time is 23.77 seconds on an Nvidia GTX Titan GPU. The scatter-induced shading/cupping artifacts are substantially reduced, and the average HU error of a region-of-interest is reduced from 75.9 to 19.0 HU. Similar results were found for a real patient case. Conclusion: A practical ultrafast MC-based CBCT scatter correction scheme is developed. The whole process of scatter correction and reconstruction is accomplished within 30 seconds. This study is supported in part by NIH (1R01CA154747-01), The Core Technology Research
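    A sketch of the angular-interpolation core of steps 3-5: MC scatter estimated at sparse gantry angles is denoised, interpolated to all projection angles, and subtracted from the raw projections. Array shapes, the placeholder data, and the Gaussian smoothing choice are illustrative assumptions:

      import numpy as np
      from scipy.ndimage import gaussian_filter

      n_angles, ny, nx = 360, 64, 64
      raw = np.random.rand(n_angles, ny, nx) + 1.0     # stand-in raw projections

      sparse_idx = np.arange(0, n_angles, 12)          # ~30 MC-simulated angles
      scatter_sparse = 0.2 * np.ones((len(sparse_idx), ny, nx))  # MC estimates

      # Denoise the low-photon MC scatter, then interpolate over angle.
      scatter_sparse = gaussian_filter(scatter_sparse, sigma=(0, 2, 2))
      scatter = np.empty((n_angles, ny, nx))
      for iy in range(ny):
          for ix in range(nx):
              scatter[:, iy, ix] = np.interp(
                  np.arange(n_angles), sparse_idx, scatter_sparse[:, iy, ix],
                  period=n_angles)                     # angles wrap around 360 deg

      corrected = np.clip(raw - scatter, 0.0, None)    # step 5: remove scatter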

  7. Advances in time series methods and applications the A. Ian McLeod festschrift

    CERN Document Server

    Stanford, David; Yu, Hao

    2016-01-01

    This volume reviews and summarizes some of A. I. McLeod's significant contributions to time series analysis. It also contains original contributions to the field and to related areas by participants of the festschrift held in June 2014 and friends of Dr. McLeod. Covering a diverse range of state-of-the-art topics, this volume well balances applied and theoretical research across fourteen contributions by experts in the field. It will be of interest to researchers and practitioners in time series, econometricians, and graduate students in time series or econometrics, as well as environmental statisticians, data scientists, statisticians interested in graphical models, and researchers in quantitative risk management.

  8. Assessing the health impact of transnational corporations: a case study on McDonald's Australia.

    Science.gov (United States)

    Anaf, Julia; Baum, Frances E; Fisher, Matt; Harris, Elizabeth; Friel, Sharon

    2017-02-06

    The practices of transnational corporations (TNCs) affect population health through production methods, shaping social determinants of health, or influencing the regulatory structures governing their activities. There has been limited research on community exposures to TNC policies and practices. Our pilot research used McDonald's Australia to test methods for assessing the health impacts of one TNC within Australia. We adapted existing Health Impact Assessment methods to assess McDonald's activities. Data identifying potential impacts were sourced through document analysis, including McDonald's corporate literature, media analysis, and semi-structured interviews. We commissioned a spatial and socioeconomic analysis of McDonald's restaurants in Australia through Geographic Information System technology. The data were mapped against a corporate health impact assessment (CHIA) framework which included McDonald's Australia's political and business practices; products and marketing; workforce, social, environmental and economic conditions; and consumers' health-related behaviours. We identified both positive and detrimental aspects of McDonald's Australian operations across the scope of the CHIA framework. We found that McDonald's outlets were slightly more likely to be located in areas of lower socioeconomic status. McDonald's workplace conditions were found to be more favourable than those in many other countries, which reflects compliance with Australian employment regulations. The breadth of findings revealed the need for governments to strengthen regulatory mechanisms that are conducive to health; the opportunity for McDonald's to augment their corporate social responsibility initiatives and bolster reputational endorsement; and civil society actors to inform their advocacy towards health and equity outcomes from TNC operations. Our study indicates that undertaking a corporate health impact assessment is possible, with the different methods revealing sufficient information to

  9. Flat-histogram methods in quantum Monte Carlo simulations: Application to the t-J model

    International Nuclear Information System (INIS)

    Diamantis, Nikolaos G.; Manousakis, Efstratios

    2016-01-01

    We show that flat-histogram techniques can be applied to the sampling in quantum Monte Carlo simulations in order to improve the statistical quality of the results at long imaginary times or low excitation energies. Typical imaginary-time correlation functions calculated in quantum Monte Carlo are subject to exponentially growing errors as the range of imaginary time grows, and this smears the information on the low-energy excitations. We show that we can extract the low-energy physics by modifying the Monte Carlo sampling technique to one in which configurations that contribute to making the histogram of certain quantities flat are promoted. We apply the diagrammatic Monte Carlo (diag-MC) method to the motion of a single hole in the t-J model and show that the implementation of flat-histogram techniques allows us to calculate the Green's function over a wide range of imaginary time. In addition, we show that applying the flat-histogram technique alleviates the "sign" problem associated with the simulation of the single-hole Green's function at long imaginary time. (paper)
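    For reference, the flat-histogram idea in its simplest (Wang-Landau-style) form: configurations are accepted with a probability that suppresses already-visited bins, driving the histogram of the chosen quantity flat. This is a generic classical sketch with a toy observable, not the diag-MC implementation:

      import numpy as np

      rng = np.random.default_rng(2)
      n_bins, f = 50, 1.0                  # bins of the flattened quantity; ln-f step
      ln_g = np.zeros(n_bins)              # running estimate of ln(weights)
      hist = np.zeros(n_bins)

      def observable_bin(x):               # toy observable mapped to a bin
          return int((np.sin(x) + 1) / 2 * (n_bins - 1))

      x = 0.0
      for step in range(200000):
          x_new = x + rng.normal(0, 0.5)
          b_old, b_new = observable_bin(x), observable_bin(x_new)
          # Flat-histogram acceptance: ratio of inverse running weights.
          if rng.random() < np.exp(ln_g[b_old] - ln_g[b_new]):
              x, b_old = x_new, b_new
          ln_g[b_old] += f                 # penalize the visited bin
          hist[b_old] += 1
          if step % 20000 == 0:
              f *= 0.5                     # gradually freeze the weights

      print("histogram flatness:", hist.min() / hist.mean())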

  10. Estimation of absorbed doses from paediatric cone-beam CT scans: MOSFET measurements and Monte Carlo simulations.

    Science.gov (United States)

    Kim, Sangroh; Yoshizumi, Terry T; Toncheva, Greta; Frush, Donald P; Yin, Fang-Fang

    2010-03-01

    The purpose of this study was to establish a dose estimation tool with Monte Carlo (MC) simulations. A 5-y-old paediatric anthropomorphic phantom was computed tomography (CT) scanned to create a voxelised phantom and used as an input for the abdominal cone-beam CT in a BEAMnrc/EGSnrc MC system. An X-ray tube model of the Varian On-Board Imager® was built in the MC system. To validate the model, the absorbed doses at each organ location for standard-dose and low-dose modes were measured in the physical phantom with MOSFET detectors; effective doses were also calculated. In the results, the MC simulations were comparable to the MOSFET measurements. This voxelised phantom approach could produce a more accurate dose estimation than the stylised phantom method. This model can be easily applied to multi-detector CT dosimetry.

  11. McDonald's Recipe for Success

    Science.gov (United States)

    Weinstein, Margery

    2012-01-01

    Who isn't familiar with McDonald's? Its golden arches are among the most recognizable brand icons in the U.S. What many are less familiar with is the methodical and distinguished learning and development that supports that brand. Training that begins by preparing employees to serve customers at the counter, and extends to programs that help…

  12. Estimating misclassification error: a closer look at cross-validation based methods

    Directory of Open Access Journals (Sweden)

    Ounpraseuth Songthip

    2012-11-01

    Full Text Available Abstract Background: To estimate a classifier's error in predicting future observations, bootstrap methods have been proposed as reduced-variation alternatives to traditional cross-validation (CV) methods based on sampling without replacement. Monte Carlo (MC) simulation studies aimed at estimating the true misclassification error conditional on the training set are commonly used to compare CV methods. We conducted an MC simulation study to compare a new method of bootstrap CV (BCV) to k-fold CV for estimating classification error. Findings: For the low-dimensional conditions simulated, the modest positive bias of k-fold CV contrasted sharply with the substantial negative bias of the new BCV method. This behavior was corroborated using a real-world dataset of prognostic gene-expression profiles in breast cancer patients. Our simulation results demonstrate some extreme characteristics of variance and bias that can occur due to a fault in the design of CV exercises aimed at estimating the true conditional error of a classifier, and that appear not to have been fully appreciated in previous studies. Although CV is a sound practice for estimating a classifier's generalization error, using CV to estimate the fixed misclassification error of a trained classifier conditional on the training set is problematic. While MC simulation of this estimation exercise can correctly represent the average bias of a classifier, it will overstate the between-run variance of the bias. Conclusions: We recommend k-fold CV over the new BCV method for estimating a classifier's generalization error. The extreme negative bias of BCV is too high a price to pay for its reduced variance.
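    A minimal k-fold CV estimate of misclassification error, of the kind being compared above; scikit-learn is used here with a synthetic dataset as a stand-in for the gene-expression data:

      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      # Synthetic stand-in for a real training set.
      X, y = make_classification(n_samples=200, n_features=10, random_state=0)

      # 10-fold CV: average held-out accuracy, reported as error.
      acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=10)
      print(f"10-fold CV misclassification error: {1 - acc.mean():.3f}")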

  13. MC++: A parallel, portable, Monte Carlo neutron transport code in C++

    International Nuclear Information System (INIS)

    Lee, S.R.; Cummings, J.C.; Nolen, S.D.

    1997-01-01

    MC++ is an implicit multi-group Monte Carlo neutron transport code written in C++ and based on the Parallel Object-Oriented Methods and Applications (POOMA) class library. MC++ runs in parallel on and is portable to a wide variety of platforms, including MPPs, SMPs, and clusters of UNIX workstations. MC++ is being developed to provide transport capabilities to the Accelerated Strategic Computing Initiative (ASCI). It is also intended to form the basis of the first transport physics framework (TPF), which is a C++ class library containing appropriate abstractions, objects, and methods for the particle transport problem. The transport problem is briefly described, as well as the current status and algorithms in MC++ for solving the transport equation. The alpha version of the POOMA class library is also discussed, along with the implementation of the transport solution algorithms using POOMA. Finally, a simple test problem is defined and performance and physics results from this problem are discussed on a variety of platforms

  14. User and programmers guide to the neutron ray-tracing package McStas, version 1.2

    DEFF Research Database (Denmark)

    Nielsen, K.; Lefmann, K.

    2000-01-01

    to handle more unusual needs. This report constitutes the reference manual for McStas, and contains full documentation for all aspects of the program. It covers the various ways to compile and run simulations; a description of the meta-language used to define simulations; a full description of all algorithms

  15. Subcarrier Group Assignment for MC-CDMA Wireless Networks

    Directory of Open Access Journals (Sweden)

    Le-Ngoc Tho

    2007-01-01

    Full Text Available Two interference-based subcarrier group assignment strategies in dynamic resource allocation are proposed for MC-CDMA wireless systems to achieve high throughput in a multicell environment. Least interfered group assignment (LIGA) selects for each session the subcarrier group on which the user receives the minimum interference, while best channel ratio group assignment (BCRGA) chooses the subcarrier group with the largest channel response-to-interference ratio. Both an analytical framework and a simulation model are developed for evaluation of the throughput distribution of the proposed schemes. An iterative approach is devised to handle the complex interdependency between multicell interference profiles in the throughput analysis. Illustrative results show significant throughput improvement offered by the interference-based assignment schemes for MC-CDMA multicell wireless systems. In particular, under low loading conditions, LIGA renders the best performance. However, as the load increases BCRGA tends to offer superior performance.
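    A toy sketch of the two assignment rules for one session, given per-group interference and channel-response measurements; the group count and values are illustrative:

      import numpy as np

      rng = np.random.default_rng(3)
      n_groups = 8
      interference = rng.uniform(0.1, 1.0, n_groups)   # measured per group
      channel_gain = rng.uniform(0.5, 2.0, n_groups)   # channel response per group

      liga = int(np.argmin(interference))                  # least interfered group
      bcrga = int(np.argmax(channel_gain / interference))  # best response-to-interference
      print(f"LIGA picks group {liga}, BCRGA picks group {bcrga}")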

  16. Subcarrier Group Assignment for MC-CDMA Wireless Networks

    Directory of Open Access Journals (Sweden)

    Tho Le-Ngoc

    2007-12-01

    Full Text Available Two interference-based subcarrier group assignment strategies in dynamic resource allocation are proposed for MC-CDMA wireless systems to achieve high throughput in a multicell environment. Least interfered group assignment (LIGA) selects for each session the subcarrier group on which the user receives the minimum interference, while best channel ratio group assignment (BCRGA) chooses the subcarrier group with the largest channel response-to-interference ratio. Both an analytical framework and a simulation model are developed for evaluation of the throughput distribution of the proposed schemes. An iterative approach is devised to handle the complex interdependency between multicell interference profiles in the throughput analysis. Illustrative results show significant throughput improvement offered by the interference-based assignment schemes for MC-CDMA multicell wireless systems. In particular, under low loading conditions, LIGA renders the best performance. However, as the load increases BCRGA tends to offer superior performance.

  17. Influence of periodontal ligament simulation on bond strength and fracture resistance of roots restored with fiber posts

    Directory of Open Access Journals (Sweden)

    Ana Maria Estivalete MARCHIONATTI

    2014-10-01

    Full Text Available Objective: Considering that periodontal ligament simulation may influence the stress distribution over teeth restored with intraradicular retainers, this study aimed to assess the combined effect of mechanical cycling and periodontal ligament simulation on both the bond strength between fiber posts and root dentin and the fracture resistance of teeth restored using glass fiber posts. Material and Methods: Ninety roots were randomly distributed into 3 groups (n=10) (C-MC: control; P-MC: polyether; AS-MC: addition silicone) to test bond strength and 6 groups (n=10) (C: control; P: polyether; AS: addition silicone, without mechanical cycling; and C-MC, P-MC and AS-MC, with mechanical cycling) to test fracture strength, according to the material used to simulate the periodontal ligament. For the bond strength test, fiber posts were cemented, cores were built, mechanical cycling was applied (2×10^6 cycles, 88 N, 2.2 Hz, 45° incline), and the teeth were cut into 3 slices (2 mm), which were then subjected to the push-out test at 1 mm/min. For the fracture strength test, fiber posts were cemented, cores were built, and half of the groups received mechanical cycling, followed by the compressive strength test (45° to the long axis, 1 mm/min) performed on all groups. Results: Periodontal ligament simulation did not affect the bond strength (p=0.244) between post and dentin. Simulation of the periodontal ligament (p=0.153) and application of mechanical cycling (p=0.97) did not affect fracture resistance. Conclusions: The materials used to simulate the periodontal ligament did not affect fracture or bond strength, therefore periodontal ligament simulation using the tested materials could be considered optional in the conditions of the study.

  18. Electric Vehicle Performance at McMurdo Station (Antarctica) and Comparison with McMurdo Station Conventional Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Sears, T.; Lammert, M.; Colby, K.; Walter, R.

    2014-09-01

    This report examines the performance of two electric vehicles (EVs) at McMurdo, Antarctica (McMurdo). The study examined the performance of two e-ride Industries EVs initially delivered to McMurdo on February 16, 2011, and compared their performance and fuel use with that of conventional vehicles that have a duty cycle similar to that of the EVs used at McMurdo.

  19. Molecular dynamics simulation for PBR pebble tracking simulation via a random walk approach using Monte Carlo simulation.

    Science.gov (United States)

    Lee, Kyoung O; Holmes, Thomas W; Calderon, Adan F; Gardner, Robin P

    2012-05-01

    Using a Monte Carlo (MC) simulation, random walks were used for pebble tracking in a two-dimensional geometry in the presence of a biased gravity field. We investigated the effect of viscous damping in the presence of random Gaussian fluctuations. The particle tracks were generated by Molecular Dynamics (MD) simulation for a Pebble Bed Reactor. The MD simulations were conducted with noncohesive Hertz-Mindlin contact interactions, and the random-walk MC simulation was correlated with the MD simulation. This treatment can easily be extended to include the generation of transient gamma-ray spectra from a single pebble that contains a radioactive tracer. An inverse analysis could then be made to determine the uncertainty in realistic measurements of the transient positions of that pebble by any radiation detection system designed for that purpose. Copyright © 2011 Elsevier Ltd. All rights reserved.
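    A compact sketch of the random-walk ingredient: an overdamped Langevin step combining viscous damping, a gravity bias, and Gaussian fluctuations. All parameter values are arbitrary illustrations, not fitted to the MD data:

      import numpy as np

      rng = np.random.default_rng(4)
      dt, gamma, g, kT = 1e-3, 5.0, 9.81, 0.05   # s, 1/s, m/s^2, toy temperature

      pos = np.zeros((1000, 2))                   # 2-D positions of 1000 pebbles
      for _ in range(500):
          drift = np.array([0.0, -g / gamma]) * dt          # gravity bias, damped
          noise = rng.normal(0.0, np.sqrt(2 * kT / gamma * dt), pos.shape)
          pos += drift + noise                              # biased random walk
          pos[:, 1] = np.maximum(pos[:, 1], 0.0)            # floor of the bed

      print("mean height:", pos[:, 1].mean())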

  20. Alex McQueen : power

    Index Scriptorium Estoniae

    1998-01-01

    On A. McQueen's activities outside fashion: American Express commissioned a credit card design from him. From the summer of 1998 he has been an assistant editor of the magazine 'Dazed & Confused'. A. McQueen has agreed to act as artistic director of a video by Björk (Iceland).

  1. Dynamic PET simulator via tomographic emission projection for kinetic modeling and parametric image studies.

    Science.gov (United States)

    Häggström, Ida; Beattie, Bradley J; Schmidtlein, C Ross

    2016-06-01

    To develop and evaluate a fast and simple tool called dpetstep (Dynamic PET Simulator of Tracers via Emission Projection), for dynamic PET simulations as an alternative to Monte Carlo (MC), useful for educational purposes and for evaluating the effects of the clinical environment, postprocessing choices, etc., on dynamic and parametric images. The tool was developed in Matlab using both new and previously reported modules of petstep (PET Simulator of Tracers via Emission Projection). Time-activity curves are generated for each voxel of the input parametric image, whereby effects of imaging system blurring, counting noise, scatters, randoms, and attenuation are simulated for each frame. Each frame is then reconstructed into images according to the user-specified method, settings, and corrections. Reconstructed images were compared to MC data and to simple Gaussian-noised time-activity curves (GAUSS). dpetstep was 8000 times faster than MC. Dynamic images from dpetstep had a root mean square error that was within 4% on average of that of MC images, whereas the GAUSS images were within 11%. The average bias in dpetstep and MC images was the same, while GAUSS differed by 3 percentage points. Noise profiles in dpetstep images conformed well to MC images, confirmed visually by scatter plot histograms, and statistically by tumor region-of-interest histogram comparisons that showed no significant differences. The authors developed dpetstep to simulate dynamic PET and parametric images, and demonstrated that it generates both images and subsequent parametric images with noise properties very similar to those of MC images, in a fraction of the time. They believe dpetstep to be very useful for generating fast, simple, and realistic results; however, since it uses simple scatter and random models it may not be suitable for studies investigating these phenomena. dpetstep can be downloaded free of cost from https://github.com/CRossSchmidtlein/dPETSTEP.
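    A sketch of the per-voxel simulation loop described above: generate a time-activity curve from kinetic parameters, then add frame-wise counting noise. The one-tissue-compartment model, input function, and constants here are illustrative stand-ins for the user-supplied parametric maps:

      import numpy as np
      from scipy.integrate import odeint

      def tac(t, K1=0.1, k2=0.05):
          """One-tissue-compartment time-activity curve for a toy input function."""
          c_in = lambda t: np.exp(-0.1 * t)                  # plasma input (toy)
          dC = lambda C, t: K1 * c_in(t) - k2 * C
          return odeint(dC, 0.0, t).ravel()

      frames = np.linspace(0, 60, 31)                        # minutes, frame ends
      activity = tac(frames)                                 # noiseless voxel TAC
      counts = np.random.poisson(activity * 1e4)             # frame counting noise
      noisy_tac = counts / 1e4                               # back to activity units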

  2. Multilevel and Multi-index Monte Carlo methods for the McKean–Vlasov equation

    KAUST Repository

    Haji-Ali, Abdul-Lateef

    2017-09-12

    We address the approximation of functionals depending on a system of particles, described by stochastic differential equations (SDEs), in the mean-field limit when the number of particles approaches infinity. This problem is equivalent to estimating the weak solution of the limiting McKean-Vlasov SDE. To that end, our approach uses systems with finite numbers of particles and a time-stepping scheme. In this case, there are two discretization parameters: the number of time steps and the number of particles. Based on these two parameters, we consider different variants of the Monte Carlo and Multilevel Monte Carlo (MLMC) methods and show that, in the best case, the optimal work complexity of MLMC, to estimate the functional in one typical setting with an error tolerance of TOL, is O(TOL^{-3}) when using the partitioning estimator and the Milstein time-stepping scheme. We also consider a method that uses the recent Multi-index Monte Carlo method and show an improved work complexity of O(TOL^{-2} log(TOL^{-1})^{2}) in the same typical setting. Our numerical experiments are carried out on the so-called Kuramoto model, a system of coupled oscillators.

  3. Analytical model of the binary multileaf collimator of tomotherapy for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Sterpin, E; Vynckier, S; Salvat, F; Olivera, G H

    2008-01-01

    Helical Tomotherapy (HT) delivers intensity-modulated radiotherapy by means of many configurations of the binary multi-leaf collimator (MLC). The aim of the present study was to devise a method, which we call the 'transfer function' (TF) method, to perform the transport of particles through the MLC much faster than time-consuming Monte Carlo (MC) simulation and with no significant loss of accuracy. The TF method consists of calculating, for each photon in the phase-space file, the attenuation factor for each leaf (up to three) that the photon passes, assuming straight propagation through closed leaves, and storing these factors in a modified phase-space file. To account for the transport through the MLC in a given configuration, the weight of a photon is simply multiplied by the attenuation factors of the leaves that are intersected by the photon ray and are closed. The TF method was combined with the PENELOPE MC code and validated against measurements for the three static field sizes available (40×5, 40×2.5 and 40×1 cm^2) and for some MLC patterns. The TF method allows a large reduction in computation time without introducing appreciable deviations from the result of full MC simulations
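    A sketch of the transfer-function bookkeeping: each phase-space photon carries precomputed per-leaf attenuation factors, so applying an MLC configuration reduces to a weight multiplication. The data layout and random placeholder values are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(5)
      n_photons, n_leaves = 100000, 64

      # Modified phase-space file: up to 3 intersected leaves per photon, with
      # the attenuation factor each closed leaf would impose (precomputed once).
      leaf_ids = rng.integers(0, n_leaves, size=(n_photons, 3))
      atten = rng.uniform(0.02, 0.1, size=(n_photons, 3))   # straight-ray factors
      weights = np.ones(n_photons)

      leaf_open = rng.random(n_leaves) < 0.5                # one MLC configuration

      # TF method: multiply each photon weight by the attenuation factors of the
      # closed leaves its ray intersects; open leaves leave the weight unchanged.
      closed = ~leaf_open[leaf_ids]                         # (n_photons, 3) mask
      weights *= np.where(closed, atten, 1.0).prod(axis=1)

      print("mean transmitted weight:", weights.mean())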

  4. MC-PDFT can calculate singlet-triplet splittings of organic diradicals

    Science.gov (United States)

    Stoneburner, Samuel J.; Truhlar, Donald G.; Gagliardi, Laura

    2018-02-01

    The singlet-triplet splittings of a set of diradical organic molecules are calculated using multiconfiguration pair-density functional theory (MC-PDFT), and the results are compared with those obtained by Kohn-Sham density functional theory (KS-DFT) and complete active space second-order perturbation theory (CASPT2) calculations. We found that MC-PDFT, even with small and systematically defined active spaces, is competitive in accuracy with CASPT2, and it yields results with greater accuracy and precision than Kohn-Sham DFT with the parent functional. MC-PDFT also avoids the challenges associated with spin contamination in KS-DFT. It is also shown that MC-PDFT is much less computationally expensive than CASPT2 when applied to larger active spaces, and this illustrates the promise of this method for larger diradical organic systems.

  5. Simulation of substrate degradation in composting of sewage sludge

    International Nuclear Information System (INIS)

    Zhang Jun; Gao Ding; Chen Tongbin; Zheng Guodi; Chen Jun; Ma Chuang; Guo Songlin; Du Wei

    2010-01-01

    To simulate the substrate degradation kinetics of the composting process, this paper develops a mathematical model with a first-order reaction assumption and heat/mass balance equations. A pilot-scale composting test with a mixture of sewage sludge and wheat straw was conducted in an insulated reactor. The BVS (biodegradable volatile solids) degradation process, matrix mass, MC (moisture content), DM (dry matter) and VS (volatile solids) were simulated numerically with the model and experimental data. The numerical simulation offered a method for simulating k (the first-order rate constant) and estimating k20 (the first-order rate constant at 20 °C). After comparison with experimental values, the relative error of the simulated value of the mass of the compost at maturity was 0.22%, MC 2.9%, DM 4.9% and VS 5.2%, which means that the simulation fits well. The k of the sewage sludge was simulated, and k20 and k20s (the first-order rate coefficient of the slow fraction of BVS at 20 °C) of the sewage sludge were estimated as 0.082 and 0.015 d^-1, respectively.
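    The first-order assumption in a few lines: BVS decays at rate k, with k corrected from its 20 °C reference value by a temperature function. The 1.066^(T-20) correction is a common Haug-style composting choice assumed here for illustration, not taken from the paper:

      import numpy as np

      def k_T(k20, T):
          """Temperature-corrected first-order rate constant (per day);
          1.066**(T-20) is a widely used composting correction factor."""
          return k20 * 1.066 ** (T - 20.0)

      dt, days = 0.1, 10
      bvs = 100.0                      # kg biodegradable volatile solids
      T = 40.0                         # deg C (illustrative, held constant)
      for _ in range(int(days / dt)):
          bvs -= k_T(0.082, T) * bvs * dt   # dBVS/dt = -k(T) * BVS
      print(f"BVS remaining after {days} d: {bvs:.1f} kg")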

  6. McUniversities Revisited: A Comparison of University and McDonald's Casual Employee Experiences in Australia

    Science.gov (United States)

    Nadolny, Andrew; Ryan, Suzanne

    2015-01-01

    The McDonaldization of higher education refers to the transformation of universities from knowledge generators to rational service organizations or "McUniversities". This is reflected in the growing dependence on a casualized academic workforce. The article explores the extent to which the McDonaldization thesis applies to universities…

  7. Simulation of hydrogen bubble growth in tungsten by a hybrid model

    International Nuclear Information System (INIS)

    Sang, Chaofeng; Sun, Jizhong; Bonnin, Xavier; Wang, L.; Wang, Dezhen

    2015-01-01

    A two dimensional hybrid code (HIIPC-MC) joining rate-theory and Monte Carlo (MC) methods is developed in this work. We evaluate the cascade-coalescence mechanism contribution to the bubble growth by MC. First, effects of the starting radius and solute deuterium concentration on the bubble growth are studied; then the impacts of the wall temperature and implantation ion flux on the bubble growth are assessed. The simulation indicates that the migration-coalescence of the bubbles and the high pressure inside the bubbles are the main driving forces for the bubble growth, and that neglect of the migration and coalescence would lead to an underestimation of the bubble growth or blistering

  8. Performance Analysis of HF Band FB-MC-SS

    Energy Technology Data Exchange (ETDEWEB)

    Hussein Moradi; Stephen Andrew Laraway; Behrouz Farhang-Boroujeny

    2016-01-01

    Abstract—In a recent paper [1] the filter bank multicarrier spread spectrum (FB-MC-SS) waveform was proposed for wideband spread spectrum HF communications. A significant benefit of this waveform is robustness against narrowband and partial-band interference. Simulation results in [1] demonstrated good performance in a wideband HF channel over a wide range of conditions. In this paper we present a theoretical analysis of the bit error probability for this system. Our analysis tailors the results from [2], where BER performance was analyzed for maximum ratio combining systems, accounting for correlation between subcarriers and channel estimation error. Equations are given for BER that closely match the simulated performance in most situations.

  9. GEM simulation methods development

    International Nuclear Information System (INIS)

    Tikhonov, V.; Veenhof, R.

    2002-01-01

    A review of methods used in the simulation of processes in gas electron multipliers (GEMs) and in the accurate calculation of detector characteristics is presented. Detector characteristics such as effective gas gain, transparency, charge collection and losses have been calculated and optimized for a number of GEM geometries and compared with experiment. A method and a new dedicated program for calculating detector macro-characteristics, such as the signal response in a real detector readout structure and the spatial and time resolution of detectors, have been developed and used for detector optimization. Detailed modeling of signal induction on the readout electrodes and of the electronics characteristics is included in the new program. A method for the simulation of charging-up effects in GEM detectors is described. All methods show good agreement with experiment.

  10. Isolation of chlamydia in irradiated and non-irradiated McCoy cells

    International Nuclear Information System (INIS)

    Johnson, L.; Harper, I.A.

    1975-01-01

    Specimens from eye and genital tract were cultured in parallel in irradiated and non-irradiated McCoy cells and the frequency of isolation of chlamydia using these culture methods was compared. There was a significant difference between the frequencies of isolation; irradiated McCoy cells produced a greater number of positive results. (author)

  11. Groundwater availability in the Crouch Branch and McQueen Branch aquifers, Chesterfield County, South Carolina, 1900-2012

    Science.gov (United States)

    Campbell, Bruce G.; Landmeyer, James E.

    2014-01-01

    Chesterfield County is located in the northeastern part of South Carolina along the southern border of North Carolina and is primarily underlain by unconsolidated sediments of Late Cretaceous age and younger of the Atlantic Coastal Plain. Approximately 20 percent of Chesterfield County is in the Piedmont Physiographic Province, and this area of the county is not included in this study. These Atlantic Coastal Plain sediments compose two productive aquifers: the Crouch Branch aquifer that is present at land surface across most of the county and the deeper, semi-confined McQueen Branch aquifer. Most of the potable water supplied to residents of Chesterfield County is produced from the Crouch Branch and McQueen Branch aquifers by a well field located near McBee, South Carolina, in the southwestern part of the county. Overall, groundwater availability is good to very good in most of Chesterfield County, especially the area around and to the south of McBee, South Carolina. The eastern part of Chesterfield County does not have as abundant groundwater resources but resources are generally adequate for domestic purposes. The primary purpose of this study was to determine groundwater-flow rates, flow directions, and changes in water budgets over time for the Crouch Branch and McQueen Branch aquifers in the Chesterfield County area. This goal was accomplished by using the U.S. Geological Survey finite-difference MODFLOW groundwater-flow code to construct and calibrate a groundwater-flow model of the Atlantic Coastal Plain of Chesterfield County. The model was created with a uniform grid size of 300 by 300 feet to facilitate a more accurate simulation of groundwater-surface-water interactions. The model consists of 617 rows from north to south extending about 35 miles and 884 columns from west to east extending about 50 miles, yielding a total area of about 1,750 square miles. However, the active part of the modeled area, or the part where groundwater flow is simulated

  12. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: Predicted PET images based on an analytical filtering approach for proton range verification have been successfully developed and validated using the FLUKA Monte Carlo (MC) code and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification against the GATE/GEANT4 Monte Carlo simulation code. Methods: We performed two experiments to validate the β+-isotope yields predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluated the accuracy of the predicted β+-yields as a function of irradiated proton energy. In the second experiment, we simulated homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The β+-yield distributions filtered by the analytical model were compared with the MC-simulated β+-yields in the proximal and distal fall-off regions. Results: The filtered and MC-simulated β+-yield distributions were compared under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered and MC-simulated β+-yields in the distal fall-off region are within 1.5 mm for all materials used. These findings validate the usefulness of the analytical filtering model for range verification in proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields with the GATE code, especially in the proximal region; this discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the large discrepancies observed between the MC-simulated and predicted β+-yield distributions, the study proves the effectiveness of the analytical filtering model for proton range verification using

  13. Detector Simulation: Data Treatment and Analysis Methods

    CERN Document Server

    Apostolakis, J

    2011-01-01

    Detector Simulation in 'Data Treatment and Analysis Methods', part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B1: Detectors for Particles and Radiation. Part 1: Principles and Methods'. This document is part of Part 1 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '4.1 Detector Simulation' of Chapter '4 Data Treatment and Analysis Methods' with the content: 4.1 Detector Simulation 4.1.1 Overview of simulation 4.1.1.1 Uses of detector simulation 4.1.2 Stages and types of simulation 4.1.2.1 Tools for event generation and detector simulation 4.1.2.2 Level of simulation and computation time 4.1.2.3 Radiation effects and background studies 4.1.3 Components of detector simulation 4.1.3.1 Geometry modeling 4.1.3.2 External fields 4.1.3.3 Intro...

  14. Monte Carlo simulation for the transport beamline

    Energy Technology Data Exchange (ETDEWEB)

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania (Italy); Attili, A.; Marchetto, F.; Russo, G. [INFN, Sezione di Torino, Via P.Giuria, 1 10125 Torino (Italy); Cirrone, G. A. P.; Schillaci, F.; Scuderi, V. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Institute of Physics Czech Academy of Science, ELI-Beamlines project, Na Slovance 2, Prague (Czech Republic); Carpinelli, M. [INFN Sezione di Cagliari, c/o Dipartimento di Fisica, Università di Cagliari, Cagliari (Italy); Tramontana, A. [INFN, Laboratori Nazionali del Sud, Via Santa Sofia 62, Catania, Italy and Università di Catania, Dipartimento di Fisica e Astronomia, Via S. Sofia 64, Catania (Italy)

    2013-07-26

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery.

  15. Monte Carlo simulation for the transport beamline

    International Nuclear Information System (INIS)

    Romano, F.; Cuttone, G.; Jia, S. B.; Varisano, A.; Attili, A.; Marchetto, F.; Russo, G.; Cirrone, G. A. P.; Schillaci, F.; Scuderi, V.; Carpinelli, M.; Tramontana, A.

    2013-01-01

    In the framework of the ELIMED project, Monte Carlo (MC) simulations are widely used to study the physical transport of charged particles generated by laser-target interactions and to preliminarily evaluate fluence and dose distributions. An energy selection system and the experimental setup for the TARANIS laser facility in Belfast (UK) have already been simulated with the GEANT4 (GEometry ANd Tracking) MC toolkit. Preliminary results are reported here. Future developments are planned to implement MC-based 3D treatment planning in order to optimize the number of shots and the dose delivery.

  16. Accelerated prompt gamma estimation for clinical proton therapy simulations

    Science.gov (United States)

    Huisman, Brent F. B.; Létang, J. M.; Testa, É.; Sarrut, D.

    2016-11-01

    There is interest in the particle therapy community in using prompt gammas (PGs), a natural byproduct of particle treatment, for range verification and eventually dose control. However, PG production is a rare process, and therefore the estimation of PGs exiting a patient during a proton treatment plan executed by a Monte Carlo (MC) simulation converges slowly. Recently, different approaches to accelerating the estimation of PG yield have been presented. Sterpin et al (2015 Phys. Med. Biol. 60 4915-46) described a fast analytic method, which is still sensitive to heterogeneities. El Kanawati et al (2015 Phys. Med. Biol. 60 8067-86) described a variance reduction method (pgTLE) that accelerates the PG estimation by precomputing PG production probabilities as a function of energy and target materials, but with the drawback that it is limited to analytical phantoms. We present a two-stage variance reduction method, named voxelized pgTLE (vpgTLE), that extends pgTLE to voxelized volumes. As a preliminary step, PG production probabilities are precomputed once and stored in a database. In stage 1, we simulate the interactions between the treatment plan and the patient CT with low-statistics MC to obtain the spatial and spectral distribution of the PGs. As primary particles are propagated throughout the patient CT, the PG yields are computed in each voxel from the initial database, as a function of the current energy of the primary, the material in the voxel and the step length. The result is a voxelized image of PG yield, normalized to a single primary. The second stage uses this intermediate PG image as a source to generate and propagate the number of PGs throughout the rest of the scene geometry, e.g. into a detection device, corresponding to the number of primaries desired. We achieved a gain of around 10³ for both a geometrically heterogeneous phantom and a complete patient CT treatment plan with respect to analog MC, at a convergence level of 2% relative
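    As a rough illustration of the stage-1 accumulation described above, the Python sketch below scores the expected PG yield per voxel along a primary's track with a track-length estimator. The yield table, material map and constant stopping power are synthetic stand-ins for illustration, not the precomputed database of the paper.

```python
import numpy as np

# Hypothetical precomputed database: expected PG yield per mm of proton
# path, indexed by material ID and proton energy bin (the preliminary step).
N_MATERIALS, N_EBINS = 3, 250
rng = np.random.default_rng(0)
pg_yield_per_mm = rng.uniform(1e-8, 1e-6, size=(N_MATERIALS, N_EBINS))

def score_step(pg_image, voxel, material_id, energy_mev, step_mm):
    """Stage 1: while transporting a primary, add the expected PG yield of
    each step to the voxel it traverses (a track-length estimator)."""
    ebin = min(int(energy_mev), N_EBINS - 1)   # 1 MeV-wide bins, illustrative
    pg_image[voxel] += pg_yield_per_mm[material_id, ebin] * step_mm

# Toy usage: a 1D "patient" of 100 voxels and one primary proton.
pg_image = np.zeros(100)
materials = np.zeros(100, dtype=int)           # stand-in for CT material IDs
energy, voxel = 160.0, 0
while energy > 0.0 and voxel < 100:
    score_step(pg_image, voxel, materials[voxel], energy, step_mm=2.0)
    energy -= 1.6                              # crude constant stopping power
    voxel += 1
# pg_image now holds the expected PG yield per primary; stage 2 would sample
# PG starting points from this image and propagate them to the detector.
```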

  17. Human reliability-based MC and A models for detecting insider theft

    International Nuclear Information System (INIS)

    Duran, Felicia Angelica; Wyss, Gregory Dane

    2010-01-01

    Material control and accounting (MC and A) safeguards operations that track and account for critical assets at nuclear facilities provide a key protection approach for defeating insider adversaries. These activities, however, have been difficult to characterize in ways that are compatible with the probabilistic path analysis methods that are used to systematically evaluate the effectiveness of a site's physical protection (security) system (PPS). MC and A activities have many similar characteristics to operator procedures performed in a nuclear power plant (NPP) to check for anomalous conditions. This work applies human reliability analysis (HRA) methods and models for human performance of NPP operations to develop detection probabilities for MC and A activities. This has enabled the development of an extended probabilistic path analysis methodology in which MC and A protections can be combined with traditional sensor data in the calculation of PPS effectiveness. The extended path analysis methodology provides an integrated evaluation of a safeguards and security system that addresses its effectiveness for attacks by both outside and inside adversaries.

  18. 75 FR 27286 - McKelvie Geographic Area Range Allotment Management Planning on the Samuel R. McKelvie National...

    Science.gov (United States)

    2010-05-14

    ... range allotment management planning on the McKelvie Geographic Area, Samuel R. McKelvie National Forest ... DEPARTMENT OF AGRICULTURE, Forest Service: McKelvie Geographic Area Range Allotment Management Planning on the Samuel R. McKelvie National Forest, Bessey Ranger District in Nebraska. AGENCY: Forest...

  19. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    International Nuclear Information System (INIS)

    Thing, Rune S.; Bernchou, Uffe; Brink, Carsten; Mainegra-Hing, Ernesto

    2013-01-01

    Purpose: Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from being fully implemented in a clinical setting. This study investigates the combination of fast MC simulations to predict scatter distributions with a ray tracing algorithm to allow calibration between simulated and clinical CBCT images. Material and methods: An EGSnrc-based user code (egs_cbct) was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60 keV x-ray source was used, and air kerma was scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results: Scatter distributions for the brain, thorax and pelvis scans were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging were obtained within two hours per patient, using a full simulation of the clinical CBCT geometry. Conclusions: This study shows that the use of MC-based scatter corrections in CBCT imaging has great potential to improve CBCT image quality. By use of powerful VRTs to predict scatter distributions and a ray tracing algorithm to calculate the primary signal, it is possible to obtain the necessary data for patient-specific MC scatter correction within two hours per patient.

  20. Direct Monte Carlo Simulation Methods for Nonreacting and Reacting Systems at Fixed Total Internal Energy or Enthalpy

    Czech Academy of Sciences Publication Activity Database

    Smith, W.; Lísal, Martin

    2002-01-01

    Roč. 66, č. 1 (2002), s. 011104-1 - 011104-1 ISSN 1063-651X R&D Projects: GA ČR GA203/02/0805 Grant - others:NSERC(CA) OGP1041 Keywords : MC * simulation * reaction Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.397, year: 2002

  1. Optimisation of 12 MeV electron beam simulation using variance reduction technique

    International Nuclear Information System (INIS)

    Jayamani, J; Aziz, M Z Abdul; Termizi, N A S Mohd; Kamarulzaman, F N Mohd

    2017-01-01

    Monte Carlo (MC) simulation for electron beam radiotherapy consumes a long computation time. A variance reduction technique (VRT) was implemented in MC to speed up the calculation. This work focused on optimisation of the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC code was used to simulate (BEAMnrc code) and validate (DOSXYZnrc code) the Siemens Primus linear accelerator model without VRT. The validated MC model simulation was then repeated applying electron range rejection, controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10⁷ particle histories. The 5 MeV range rejection generated the fastest MC simulation, with a 50% reduction in computation time compared to the non-VRT simulation. Thus, 5 MeV electron range rejection was utilized in the particle history analysis, which ranged from 7.5 × 10⁷ to 20 × 10⁷ histories. In this study, with a 5 MeV electron cut-off and 10 × 10⁷ particle histories, the simulation was four times faster than the non-VRT calculation with 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation duration while preserving its accuracy. (paper)
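    To make the range-rejection idea concrete, here is a minimal Python sketch under assumed names and a toy water-like range table (not the EGSnrc implementation): an electron below the global cut-off whose residual range cannot reach the region of interest is discarded and its energy deposited locally.

```python
# Illustrative electron range-rejection test (not the EGSnrc code): an
# electron below the global cut-off whose residual range cannot reach the
# region of interest is terminated and its energy deposited locally.
def survives_range_rejection(energy_mev, distance_to_roi_cm,
                             ecut_global_mev, csda_range_cm):
    """csda_range_cm is a callable giving the residual range in the current
    medium; ecut_global_mev plays the role of the 1/2/5 MeV cut-off above."""
    if energy_mev > ecut_global_mev:
        return True                    # above the cut-off: always transport
    return csda_range_cm(energy_mev) >= distance_to_roi_cm

# Rough water-like range scaling (~0.5 cm per MeV), purely for illustration.
toy_range = lambda e_mev: 0.5 * e_mev
print(survives_range_rejection(2.0, 3.0, 5.0, toy_range))  # False: rejected
print(survives_range_rejection(2.0, 0.5, 5.0, toy_range))  # True: transported
```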

  2. Automatic insertion of simulated microcalcification clusters in a software breast phantom

    Science.gov (United States)

    Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.

    2014-03-01

    An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
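    A minimal sketch of the two placement strategies compared above, assuming a hypothetical integer tissue coding (0 = adipose, 1 = fibroglandular); the actual pipeline's pdf machinery is richer than this uniform sampling.

```python
import numpy as np

def sample_cluster_center(tissue_labels, strategy="directed", rng=None):
    """Pick a seed voxel for a simulated microcalcification cluster.

    tissue_labels: 3D int array with a hypothetical coding (0 = adipose,
    1 = fibroglandular). 'undirected' samples uniformly over the phantom;
    'directed' restricts the pdf to fibroglandular voxels, mirroring the
    assumption that calcifications arise from epithelial tissue."""
    rng = rng or np.random.default_rng()
    mask = tissue_labels == 1 if strategy == "directed" else tissue_labels >= 0
    candidates = np.argwhere(mask)
    return tuple(candidates[rng.integers(len(candidates))])

phantom = np.random.default_rng(1).integers(0, 2, size=(32, 32, 32))
print(sample_cluster_center(phantom, "directed"))
```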

  3. Efficient hybrid non-equilibrium molecular dynamics--Monte Carlo simulations with symmetric momentum reversal.

    Science.gov (United States)

    Chen, Yunjie; Roux, Benoît

    2014-09-21

    Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct
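    The following Python sketch shows the shape of one hybrid neMD-MC move with the symmetric two-ends momentum reversal described above; nemd_propagate and total_energy are hypothetical stand-ins for the driven integrator and the Hamiltonian, and the toy harmonic system is only there to make the example runnable.

```python
import numpy as np

def hybrid_nemd_mc_step(x, beta, nemd_propagate, total_energy, rng):
    """One hybrid neMD-MC move with symmetric two-ends momentum reversal:
    the momentum sign is randomized at both the beginning and the end of
    the driven trajectory, and the candidate configuration is accepted
    with the usual Metropolis probability."""
    p = rng.normal(size=np.shape(x))        # fresh Maxwell-Boltzmann momenta
    if rng.random() < 0.5:                  # symmetric reversal: start
        p = -p
    e_old = total_energy(x, p)
    x_new, p_new = nemd_propagate(x, p)     # driven, non-equilibrium segment
    if rng.random() < 0.5:                  # symmetric reversal: end
        p_new = -p_new
    d_e = total_energy(x_new, p_new) - e_old
    if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
        return x_new                        # accept
    return x                                # reject: keep old configuration

# Toy 1D harmonic oscillator, just to show the call pattern.
rng = np.random.default_rng(0)
energy = lambda x, p: 0.5 * x**2 + 0.5 * p**2
drive = lambda x, p: (x + 0.1 * p, p - 0.1 * x)   # crude driven update
x = 1.0
for _ in range(1000):
    x = hybrid_nemd_mc_step(x, 1.0, drive, energy, rng)
```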

  4. Efficient hybrid non-equilibrium molecular dynamics - Monte Carlo simulations with symmetric momentum reversal

    Science.gov (United States)

    Chen, Yunjie; Roux, Benoît

    2014-09-01

    Hybrid schemes combining the strength of molecular dynamics (MD) and Metropolis Monte Carlo (MC) offer a promising avenue to improve the sampling efficiency of computer simulations of complex systems. A number of recently proposed hybrid methods consider new configurations generated by driving the system via a non-equilibrium MD (neMD) trajectory, which are subsequently treated as putative candidates for Metropolis MC acceptance or rejection. To obey microscopic detailed balance, it is necessary to alter the momentum of the system at the beginning and/or the end of the neMD trajectory. This strict rule then guarantees that the random walk in configurational space generated by such hybrid neMD-MC algorithm will yield the proper equilibrium Boltzmann distribution. While a number of different constructs are possible, the most commonly used prescription has been to simply reverse the momenta of all the particles at the end of the neMD trajectory ("one-end momentum reversal"). Surprisingly, it is shown here that the choice of momentum reversal prescription can have a considerable effect on the rate of convergence of the hybrid neMD-MC algorithm, with the simple one-end momentum reversal encountering particularly acute problems. In these neMD-MC simulations, different regions of configurational space end up being essentially isolated from one another due to a very small transition rate between regions. In the worst-case scenario, it is almost as if the configurational space does not constitute a single communicating class that can be sampled efficiently by the algorithm, and extremely long neMD-MC simulations are needed to obtain proper equilibrium probability distributions. To address this issue, a novel momentum reversal prescription, symmetrized with respect to both the beginning and the end of the neMD trajectory ("symmetric two-ends momentum reversal"), is introduced. Illustrative simulations demonstrate that the hybrid neMD-MC algorithm robustly yields a correct

  5. Fast Monte Carlo-simulator with full collimator and detector response modelling for SPECT

    International Nuclear Information System (INIS)

    Sohlberg, A.O.; Kajaste, M.T.

    2012-01-01

    Monte Carlo (MC) simulations have proved to be a valuable tool in studying single photon emission computed tomography (SPECT) reconstruction algorithms. Despite their popularity, the use of Monte Carlo simulations is still often limited by their large computation demand. This is especially true in situations where full collimator and detector modelling with septal penetration, scatter and X-ray fluorescence needs to be included. This paper presents a rapid and simple MC simulator, which can effectively reduce the computation times. The simulator was built on the convolution-based forced detection principle, which can markedly lower the number of simulated photons. Full collimator and detector response look-up tables are pre-simulated and then later used in the actual MC simulations to model the system response. The developed simulator was validated by comparing it against 123I point source measurements made with a clinical gamma camera system and against 99mTc software phantom simulations made with the SIMIND MC package. The results showed good agreement between the new simulator, the measurements and the SIMIND package. The new simulator provided near noise-free projection data in approximately 1.5 min per projection with 99mTc, which was less than one-tenth of SIMIND's time. The developed MC simulator can markedly decrease the simulation time without sacrificing image quality. (author)

  6. Corporate communication or McCommunication? Considering a McDonaldization of corporate communication hypothesis

    NARCIS (Netherlands)

    Verhoeven, P.

    2015-01-01

    In this essay the perspective of Ritzer's McDonaldization of Society Thesis is the starting point for developing hypotheses about corporate communication (CorpCom). The central idea of McDonaldization is that increasing numbers of organizations are run as fast food restaurants, focusing on:

  7. Estimation of computed tomography dose index in cone beam computed tomography: MOSFET measurements and Monte Carlo simulations.

    Science.gov (United States)

    Kim, Sangroh; Yoshizumi, Terry; Toncheva, Greta; Yoo, Sua; Yin, Fang-Fang; Frush, Donald

    2010-05-01

    To address the lack of an accurate dose estimation method in cone beam computed tomography (CBCT), we performed point dose metal oxide semiconductor field-effect transistor (MOSFET) measurements and Monte Carlo (MC) simulations. A Varian On-Board Imager (OBI) was employed to measure point doses in polymethyl methacrylate (PMMA) CT phantoms with MOSFETs for standard and low dose modes. An MC model of the OBI x-ray tube was developed using the BEAMnrc/EGSnrc MC system and validated by the half value layer, the x-ray spectrum, and lateral and depth dose profiles. We compared the weighted computed tomography dose index (CTDIw) between MOSFET measurements and MC simulations. From the MOSFET measurements, the CTDIw was found to be 8.39 cGy for the head scan and 4.58 cGy for the body scan in standard dose mode, and 1.89 cGy for the head and 1.11 cGy for the body in low dose mode. The CTDIw from MC agreed with the MOSFET measurements within 5%. In conclusion, an MC model for Varian CBCT has been established, and this approach may be easily extended from the CBCT geometry to multi-detector CT geometry.
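    For reference, the weighted index combines the center and peripheral measurements of the standard CTDI phantom as CTDIw = (1/3)·CTDI_center + (2/3)·CTDI_periphery; a one-line check with illustrative numbers (not the paper's measurements):

```python
# Weighted CTDI from point measurements in a PMMA CT dose phantom.
# Input values below are illustrative, not the measurements of the study.
def ctdi_w(center_cgy, periphery_cgy):
    periphery_mean = sum(periphery_cgy) / len(periphery_cgy)
    return center_cgy / 3.0 + 2.0 * periphery_mean / 3.0

print(f"CTDIw = {ctdi_w(7.9, [8.5, 8.6, 8.4, 8.7]):.2f} cGy")  # ~8.33 cGy
```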

  8. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    International Nuclear Information System (INIS)

    Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy

    2016-01-01

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study was aimed at investigating the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 and 2304 cores. In the GPU simulations, each core tracks one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly shorter than on the CPU. Simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained starting from 10⁸ histories and photon energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
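    The one-photon-per-core mapping can be mimicked on a CPU with array programming: instead of looping over photons, a whole batch is advanced at once. A toy numpy stand-in (free-path sampling through a slab, with made-up parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n_photons = 1_000_000          # one 'thread' per photon, advanced together
mu = 0.2                       # toy attenuation coefficient [1/cm]
slab = 5.0                     # toy slab thickness [cm]

# Sample exponential free paths for the whole batch at once.
path = -np.log(rng.random(n_photons)) / mu
print(f"transmitted fraction: {(path > slab).mean():.4f}")  # ~exp(-1) = 0.368
```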

  9. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Suprijadi [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Haryanto, Freddy [Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia)]

    2016-03-11

    Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study was aimed at investigating the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 and 2304 cores. In the GPU simulations, each core tracks one photon, so a large number of photons are calculated simultaneously. Results show that the simulation times on the GPU were significantly shorter than on the CPU. Simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained starting from 10⁸ histories and photon energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.

  10. Michel Trottier-McDonald

    Indian Academy of Sciences (India)

    Articles written in Pramana – Journal of Physics. Volume 79 Issue 5 November 2012 pp 1337-1340 Poster Presentations. Tau reconstruction, energy calibration and identification at ATLAS · Michel Trottier-McDonald on behalf of the ATLAS ...

  11. New methods in plasma simulation

    International Nuclear Information System (INIS)

    Mason, R.J.

    1990-01-01

    The development of implicit methods of particle-in-cell (PIC) computer simulation in recent years, and their merger with older hybrid methods, has created a new arsenal of simulation techniques for the treatment of complex practical problems in plasma physics. The new implicit hybrid codes are aimed at transitional problems that lie somewhere between the long time scale, high density regime associated with MHD modeling and the short time scale, low density regime appropriate to explicit PIC techniques. This transitional regime arises in ICF coronal plasmas, in pulsed power plasma switches, in Z-pinches, and in foil implosions. Here, we outline how such a merger of implicit and hybrid methods has been carried out, specifically in the ANTHEM computer code, and demonstrate the utility of implicit hybrid simulation in applications. 25 refs., 5 figs

  12. Nancy McCormick Rambusch: A Reflection

    Science.gov (United States)

    Povell, Phyllis

    2005-01-01

    Fall of 2005 marks the 12th anniversary of Nancy McCormick Rambusch's death. As the founder of the American Montessori Society and as its first president, Rambusch reintroduced Maria Montessori to America at a time--1960--when education for the young was floundering, and a second look at the Montessori method, which had changed the early childhood…

  13. A Fast Soft Bit Error Rate Estimation Method

    Directory of Open Access Journals (Sweden)

    Ait-Idir Tarik

    2010-01-01

    We suggested in a previous publication a method to estimate the Bit Error Rate (BER) of a digital communications system as an alternative to the classical Monte Carlo (MC) simulation. That method was based on estimating the probability density function (pdf) of the observed soft samples, using the kernel method for the pdf estimation. In this paper, we suggest using a Gaussian Mixture (GM) model instead. The Expectation-Maximization algorithm is used to estimate the parameters of this mixture, and the optimal number of Gaussians is computed using mutual information theory. The analytical expression of the BER then follows directly from the estimated parameters of the Gaussian mixture. Simulation results are presented to compare the three methods: Monte Carlo, kernel and Gaussian mixture. We analyze the performance of the proposed BER estimator in the framework of a multiuser code division multiple access system and show that attractive performance is achieved compared with conventional MC or kernel-aided techniques. The results show that the GM method can drastically reduce the number of samples needed to estimate the BER, and hence the required simulation run-time, even at very low BER.
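    A small sketch of the GM-based estimator on synthetic data (scikit-learn for the EM fit; the mixture order is fixed at 2 here rather than chosen by mutual information as in the paper). With soft samples normalized to a transmitted '+1', an error is a sample below 0, so the fitted mixture gives the closed form BER = Σ_k w_k Φ(-μ_k/σ_k):

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
soft = 1.0 + 0.5 * rng.normal(size=20_000)     # toy soft samples, bit '+1'

# EM fit of a two-component Gaussian mixture to the soft observations.
gm = GaussianMixture(n_components=2, random_state=0).fit(soft.reshape(-1, 1))
w, mu = gm.weights_, gm.means_.ravel()
sigma = np.sqrt(gm.covariances_.ravel())

ber_gm = float(np.sum(w * norm.cdf(-mu / sigma)))   # P(sample < 0)
print(f"GM-based BER : {ber_gm:.2e}")               # close to Phi(-2) ~ 2.3e-2
print(f"MC-based BER : {(soft < 0).mean():.2e}")
```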

  14. MC21/CTF and VERA multiphysics solutions to VERA core physics benchmark progression problems 6 and 7

    Directory of Open Access Journals (Sweden)

    Daniel J. Kelly, III

    2017-09-01

    The continuous energy Monte Carlo neutron transport code, MC21, was coupled to the CTF subchannel thermal-hydraulics code using a combination of Consortium for Advanced Simulation of Light Water Reactors (CASL) tools and in-house Python scripts. An MC21/CTF solution for VERA Core Physics Benchmark Progression Problem 6 demonstrated good agreement with MC21/COBRA-IE and VERA solutions. The MC21/CTF solution for VERA Core Physics Benchmark Progression Problem 7, Watts Bar Unit 1 at beginning of cycle hot full power equilibrium xenon conditions, is the first published coupled Monte Carlo neutronics/subchannel T-H solution for this problem. MC21/CTF predicted a critical boron concentration of 854.5 ppm, yielding a critical eigenvalue of 0.99994 ± 6.8E-6 (95% confidence interval). Excellent agreement with a VERA solution of Problem 7 was also demonstrated for integral and local power and temperature parameters.

  15. Monte Carlo simulation of a medical linear accelerator for generation of phase spaces

    International Nuclear Information System (INIS)

    Oliveira, Alex C.H.; Santana, Marcelo G.; Lima, Fernando R.A.; Vieira, Jose W.

    2013-01-01

    Radiotherapy uses various techniques and equipment for local treatment of cancer. The equipment most often used in radiotherapy for patient irradiation is the linear accelerator (Linac), which produces X-ray beams in the range of 5-30 MeV. Among the many algorithms developed over recent years for the evaluation of dose distributions in radiotherapy planning, algorithms based on Monte Carlo (MC) methods have proven to be very promising in terms of accuracy by providing more realistic results. MC methods allow simulating the transport of ionizing radiation in complex configurations, such as detectors, Linacs, phantoms, etc. MC simulations for applications in radiotherapy are divided into two parts. In the first, the production of the radiation beam by the Linac is simulated and the phase space is generated. The phase space contains information such as the energy, position and direction of millions of particles (photons, electrons, positrons). In the second part, the transport of particles sampled from the phase space through certain irradiation field configurations is simulated to assess the dose distribution in the patient (or phantom). The objective of this work is to create a computational model of a 6 MeV Linac using the MC code Geant4 for the generation of phase spaces. From the phase space, information was obtained to assess beam quality (photon and electron spectra and the two-dimensional energy distribution) and to analyze the physical processes involved in producing the beam. (author)
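    The quantities listed above map naturally onto a per-particle record; a simplified illustration of such a layout (real formats, e.g. IAEA phase-space files, carry additional flags and bookkeeping):

```python
from dataclasses import dataclass

@dataclass
class PhaseSpaceParticle:
    """Simplified phase-space record (illustrative field set only)."""
    kind: int          # e.g. 0 = photon, 1 = electron, 2 = positron
    energy_mev: float  # kinetic energy
    x_cm: float        # position in the scoring plane
    y_cm: float
    u: float           # direction cosines, u**2 + v**2 + w**2 == 1
    v: float
    w: float
    weight: float      # statistical weight from variance reduction

p = PhaseSpaceParticle(0, 1.25, 0.3, -0.1, 0.01, 0.02, 0.99975, 1.0)
print(p.energy_mev)
```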

  16. Advanced sources and optical components for the McStas neutron scattering instrument simulation package

    DEFF Research Database (Denmark)

    Farhi, E.; Monzat, C.; Arnerin, R.

    2014-01-01

    -up, including lenses and prisms. A new library for McStas adds the ability to describe any geometrical arrangement as a set of polygons. This feature has been implemented in most sample scattering components such as Single_crystal, Incoherent, Isotropic_Sqw (liquids/amorphous/powder), PowderN as well...

  17. MC EMiNEM maps the interaction landscape of the Mediator.

    Directory of Open Access Journals (Sweden)

    Theresa Niederberger

    The Mediator is a highly conserved, large multiprotein complex that is involved essentially in the regulation of eukaryotic mRNA transcription. It acts as a general transcription factor by integrating regulatory signals from gene-specific activators or repressors to the RNA Polymerase II. The internal network of interactions between Mediator subunits that conveys these signals is largely unknown. Here, we introduce MC EMiNEM, a novel method for the retrieval of functional dependencies between proteins that have pleiotropic effects on mRNA transcription. MC EMiNEM is based on Nested Effects Models (NEMs), a class of probabilistic graphical models that extends the idea of hierarchical clustering. It combines mode-hopping Monte Carlo (MC) sampling with an Expectation-Maximization (EM) algorithm for NEMs to increase sensitivity compared to existing methods. A meta-analysis of four Mediator perturbation studies in Saccharomyces cerevisiae, three of which are unpublished, provides new insight into the Mediator signaling network. In addition to the known modular organization of the Mediator subunits, MC EMiNEM reveals a hierarchical ordering of its internal information flow, which is putatively transmitted through structural changes within the complex. We identify the N-terminus of Med7 as a peripheral entity, entailing only local structural changes upon perturbation, while the C-terminus of Med7 and Med19 appear to play a central role. MC EMiNEM associates Mediator subunits to most directly affected genes, which, in conjunction with gene set enrichment analysis, allows us to construct an interaction map of Mediator subunits and transcription factors.

  18. MC EMiNEM maps the interaction landscape of the Mediator.

    Science.gov (United States)

    Niederberger, Theresa; Etzold, Stefanie; Lidschreiber, Michael; Maier, Kerstin C; Martin, Dietmar E; Fröhlich, Holger; Cramer, Patrick; Tresch, Achim

    2012-01-01

    The Mediator is a highly conserved, large multiprotein complex that is involved essentially in the regulation of eukaryotic mRNA transcription. It acts as a general transcription factor by integrating regulatory signals from gene-specific activators or repressors to the RNA Polymerase II. The internal network of interactions between Mediator subunits that conveys these signals is largely unknown. Here, we introduce MC EMiNEM, a novel method for the retrieval of functional dependencies between proteins that have pleiotropic effects on mRNA transcription. MC EMiNEM is based on Nested Effects Models (NEMs), a class of probabilistic graphical models that extends the idea of hierarchical clustering. It combines mode-hopping Monte Carlo (MC) sampling with an Expectation-Maximization (EM) algorithm for NEMs to increase sensitivity compared to existing methods. A meta-analysis of four Mediator perturbation studies in Saccharomyces cerevisiae, three of which are unpublished, provides new insight into the Mediator signaling network. In addition to the known modular organization of the Mediator subunits, MC EMiNEM reveals a hierarchical ordering of its internal information flow, which is putatively transmitted through structural changes within the complex. We identify the N-terminus of Med7 as a peripheral entity, entailing only local structural changes upon perturbation, while the C-terminus of Med7 and Med19 appear to play a central role. MC EMiNEM associates Mediator subunits to most directly affected genes, which, in conjunction with gene set enrichment analysis, allows us to construct an interaction map of Mediator subunits and transcription factors.

  19. Comparison of tracheal intubation using the Airtraq® and Mc Coy laryngoscope in the presence of rigid cervical collar simulating cervical immobilisation for traumatic cervical spine injury

    Directory of Open Access Journals (Sweden)

    Padmaja Durga

    2012-01-01

    Background: It is difficult to visualise the larynx using conventional laryngoscopy in the presence of cervical spine immobilisation. Airtraq® provides for easy and successful intubation in the neutral neck position. Objective: To evaluate the effectiveness of Airtraq in comparison with the Mc Coy laryngoscope when performing tracheal intubation in patients with neck immobilisation using a hard cervical collar and manual in-line axial cervical spine stabilisation. Methods: A randomised, cross-over, open-labelled study was undertaken in 60 ASA I and II patients aged between 20 and 50 years, belonging to either gender, scheduled to undergo elective surgical procedures. Following induction and adequate muscle relaxation, they were intubated using either of the techniques first, followed by the other. Intubation time and Intubation Difficulty Score (IDS) were noted using the Mc Coy laryngoscope and Airtraq. The anaesthesiologist was asked to grade the ease of intubation on a Visual Analogue Scale (VAS) of 1-10. The Chi-square test was used for comparison of categorical data between the groups and the paired sample t-test for comparison of continuous data. IDS score and VAS were compared using the Wilcoxon signed rank test. Results: The mean intubation time was 33.27 sec (13.25) for laryngoscopy and 28.95 sec (18.53) for Airtraq (P=0.32). The median IDS values were 4 (interquartile range (IQR) 1-6) and 0 (IQR 0-1) for laryngoscopy and Airtraq, respectively (P=0.007). The median Cormack Lehane glottic view grade was 3 (IQR 2-4) and 1 (IQR 1-1) for laryngoscopy and Airtraq, respectively (P=0.003). The ease of intubation on VAS was graded as 4 (IQR 3-5) for laryngoscopy and 2 (IQR 2-2) for Airtraq (P=0.033). There were two failures to intubate with the Airtraq. Conclusion: Airtraq improves the ease of intubation significantly when compared to the Mc Coy blade in patients immobilised with a cervical collar and manual in-line stabilisation simulating cervical spine injury.

  20. Mesoscopic simulation of recrystallization and grain growth

    International Nuclear Information System (INIS)

    Rollett, A.D.

    2000-01-01

    A brief summary of simulation techniques for recrystallization and grain growth is given. The available methods include surface evolver, front tracking (including finite element methods and vertex methods), networks of curves, phase field, cellular automata, and Monte Carlo. Two of the models that use a regular lattice, the Potts model and the Cellular Automaton (CA) model, have proved to be very useful. Microstructure is represented on a discrete lattice where the value of the field at each point represents the local orientation of the material and boundaries exist between points of unlike orientation. Two issues are discussed: one is a hybrid approach to combining the standard Monte Carlo and cellular automata algorithms for recrystallization modeling. The second is adaptation of the MC method for modeling grain growth (and recrystallization) with physically based boundary properties. Both models have significant limitations in their standard forms. The CA model is very useful and efficient for simulating recrystallization with deterministic motion of the recrystallization fronts. It can be adapted to simulate curvature driven migration provided that multiple sub-lattices are used with a probabilistic switching rule. The Potts model is very successful in modeling curvature driven boundary migration and grain growth. It does not simulate the proportionality between boundary velocity and a stored energy driving force, however, unless rather restricted conditions of stored energy (in relation to the grain boundary energy) and lattice temperature are satisfied. A new approach based on a hybrid of the Potts model (MC) and the Cellular Automaton (CA) model has been developed to obtain the desired limiting behavior for both curvature-driven and stored energy-driven grain boundary migration. The combination of methods is achieved by interleaving the two different types of reorientation event in time. The results show that the hybrid algorithm models the Gibbs
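    For concreteness, a minimal 2D Potts-model sweep of the standard kind discussed above (Metropolis acceptance on the boundary-energy change; illustrative parameters, not the hybrid MC/CA algorithm itself):

```python
import numpy as np

Q, L, KT = 32, 64, 0.0        # orientations, lattice size, lattice temperature
rng = np.random.default_rng(0)
spins = rng.integers(0, Q, size=(L, L))

def unlike_neighbors(s, i, j, q):
    """Boundary energy of site (i, j) if it had orientation q."""
    nbrs = (s[(i - 1) % L, j], s[(i + 1) % L, j],
            s[i, (j - 1) % L], s[i, (j + 1) % L])
    return sum(q != n for n in nbrs)

for _ in range(10 * L * L):   # a few Monte Carlo steps
    i, j = rng.integers(L, size=2)
    new = int(rng.integers(Q))
    d_e = (unlike_neighbors(spins, i, j, new)
           - unlike_neighbors(spins, i, j, spins[i, j]))
    # Metropolis rule; at KT = 0 only energy-lowering or neutral flips succeed
    if d_e <= 0 or (KT > 0 and rng.random() < np.exp(-d_e / KT)):
        spins[i, j] = new     # curvature-driven coarsening emerges over time
```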

  1. The Americleft Project: A Modification of Asher-McDade Method for Rating Nasolabial Esthetics in Patients With Unilateral Cleft Lip and Palate Using Q-sort.

    Science.gov (United States)

    Stoutland, Alicia; Long, Ross E; Mercado, Ana; Daskalogiannakis, John; Hathaway, Ronald R; Russell, Kathleen A; Singer, Emily; Semb, Gunvor; Shaw, William C

    2017-11-01

    The purpose of this study was to investigate ways to improve rater reliability and satisfaction in nasolabial esthetic evaluations of patients with complete unilateral cleft lip and palate (UCLP), by modifying the Asher-McDade method with use of Q-sort methodology. Blinded ratings of cropped photographs of one hundred forty-nine 5- to 7-year-old consecutively treated patients with complete UCLP from 4 different centers were used in a rating of frontal and profile nasolabial esthetic outcomes by 6 judges involved in the Americleft Project's intercenter outcome comparisons. Four judges rated in previous studies using the original Asher-McDade approach. For the Q-sort modification, rather than projection of images, each judge had cards with frontal and profile photographs of each patient and rated them on a scale of 1 to 5 for vermillion border, nasolabial frontal, and profile, using the Q-sort method with placement of cards into categories 1 to 5. Inter- and intrarater reliabilities were calculated using the Weighted Kappa (95% confidence interval). For 4 raters, the reliabilities were compared with those in previous studies. There was no significant improvement in inter-rater reliabilities using the new method. Intrarater reliability consistently improved. All raters preferred the Q-sort method with rating cards rather than a PowerPoint of photos, which improved internal consistency in rating compared to previous studies using the original Asher-McDade method. All raters preferred this method because of the ability to continuously compare photos and adjust relative ratings between patients.

  2. Collaborative simulation method with spatiotemporal synchronization process control

    Science.gov (United States)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronic system, such as a high-speed train, it is relatively difficult to effectively simulate the entire system's dynamic behavior because it involves multi-disciplinary subsystems. Currently, the most practical approach for multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal desynchronization among the multi-directional coupling simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronic system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can be used to simulate subsystem interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in the design and development of Chinese high-speed trains, demonstrating that it can be applied to a wide range of engineering systems with improved efficiency and effectiveness.
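    The interface-based coupling idea can be sketched as a coupler loop in which subsystems advance with their own internal substeps but exchange interface data only at shared synchronization points. The two first-order "subsystems" below are toy models for illustration, not a train dynamics code:

```python
class Subsystem:
    """Toy first-order model standing in for a disciplinary solver."""
    def __init__(self, name, state, rate):
        self.name, self.state, self.rate = name, state, rate

    def advance(self, dt_sync, coupling, n_sub=10):
        dt = dt_sync / n_sub                 # internal substepping
        for _ in range(n_sub):
            self.state += dt * self.rate * (coupling - self.state)

a = Subsystem("A", 1.0, 0.5)
b = Subsystem("B", 0.0, 0.8)
for _ in range(100):                         # synchronization loop
    xa, xb = a.state, b.state                # exchange frozen interface data
    a.advance(0.1, coupling=xb)
    b.advance(0.1, coupling=xa)
print(round(a.state, 3), round(b.state, 3))  # states have relaxed together
```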

  3. Software for simulation and design of neutron scattering instrumentation

    DEFF Research Database (Denmark)

    Bertelsen, Mads

    designed using the software. The Union components use a new approach to the simulation of samples in McStas. The properties of a sample are split into geometrical and material parts, simplifying user input and allowing the construction of complicated geometries such as sample environments. Multiple scattering...... from conventional choices. Simulation of neutron scattering instrumentation is used when designing instrumentation, but also to understand instrumental effects on the measured scattering data. The Monte Carlo ray-tracing package McStas is among the most popular, capable of simulating the path of each...... neutron through the instrument using an easy-to-learn language. The subject of the defended thesis is contributions to the McStas language in the form of the software package guide_bot and the Union components. The guide_bot package simplifies the process of optimizing neutron guides by writing the Mc...

  4. Employing a Monte Carlo algorithm in Newton-type methods for restricted maximum likelihood estimation of genetic parameters.

    Directory of Open Access Journals (Sweden)

    Kaarina Matilainen

    Estimation of variance components by Monte Carlo (MC) expectation maximization (EM) restricted maximum likelihood (REML) is computationally efficient for large data sets and complex linear mixed effects models. However, efficiency may be lost due to the need for a large number of iterations of the EM algorithm. To decrease the computing time we explored the use of faster converging Newton-type algorithms within MC REML implementations. The implemented algorithms were: MC Newton-Raphson (NR), where the information matrix was generated via sampling; MC average information (AI), where the information was computed as an average of observed and expected information; and MC Broyden's method, where the zero of the gradient was searched using a quasi-Newton-type algorithm. Performance of these algorithms was evaluated using simulated data. The final estimates were in good agreement with corresponding analytical ones. MC NR REML and MC AI REML enhanced convergence compared to MC EM REML and gave standard errors for the estimates as a by-product. MC NR REML required a larger number of MC samples, while each MC AI REML iteration demanded extra solving of mixed model equations by the number of parameters to be estimated. MC Broyden's method required the largest number of MC samples with our small data and did not give standard errors for the parameters directly. We studied the performance of three different convergence criteria for the MC AI REML algorithm. Our results indicate the importance of defining a suitable convergence criterion and critical value in order to obtain an efficient Newton-type method utilizing a MC algorithm. Overall, use of a MC algorithm with Newton-type methods proved feasible and the results encourage testing of these methods with different kinds of large-scale problem settings.
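    The structure of such an iteration can be sketched generically: both the score and the information are Monte Carlo averages over sampled missing data, and a Newton step is taken on the parameter. The toy target below (root at θ = 1) only illustrates the pattern, not the REML equations:

```python
import numpy as np

def mc_newton(theta, n_iters=20, n_samples=5_000, seed=0):
    """Newton iteration whose score and information are MC averages over
    sampled 'missing data' u; solves E[theta * u**2 - 1] = 0, root theta = 1."""
    rng = np.random.default_rng(seed)
    for _ in range(n_iters):
        u = rng.normal(size=n_samples)       # MC samples of the missing data
        score = np.mean(theta * u**2 - 1.0)  # MC estimate of the gradient
        info = np.mean(u**2)                 # MC estimate of the information
        theta -= score / info                # one Newton step
    return theta

print(f"estimate: {mc_newton(5.0):.3f}")     # approximately 1.0
```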

  5. Idealized Simulations of a Squall Line from the MC3E Field Campaign Applying Three Bin Microphysics Schemes: Dynamic and Thermodynamic Structure

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Lulin [National Center for Atmospheric Research, Boulder, Colorado; Fan, Jiwen [Pacific Northwest National Laboratory, Richland, Washington; Lebo, Zachary J. [University of Wyoming, Laramie, Wyoming; Wu, Wei [National Center for Atmospheric Research, Boulder, Colorado; University of Illinois at Urbana–Champaign, Urbana, Illinois; Morrison, Hugh [National Center for Atmospheric Research, Boulder, Colorado; Grabowski, Wojciech W. [National Center for Atmospheric Research, Boulder, Colorado; Chu, Xia [University of Wyoming, Laramie, Wyoming; Geresdi, István [University of Pécs, Pécs, Hungary; North, Kirk [McGill University, Montréal, Québec, Canada; Stenz, Ronald [University of North Dakota, Grand Forks, North Dakota; Gao, Yang [Pacific Northwest National Laboratory, Richland, Washington; Lou, Xiaofeng [Chinese Academy of Meteorological Sciences, Beijing, China; Bansemer, Aaron [National Center for Atmospheric Research, Boulder, Colorado; Heymsfield, Andrew J. [National Center for Atmospheric Research, Boulder, Colorado; McFarquhar, Greg M. [National Center for Atmospheric Research, Boulder, Colorado; University of Illinois at Urbana–Champaign, Urbana, Illinois; Rasmussen, Roy M. [National Center for Atmospheric Research, Boulder, Colorado

    2017-12-01

    The squall line event on May 20, 2011, during the Midlatitude Continental Convective Clouds (MC3E) field campaign has been simulated by three bin (spectral) microphysics schemes coupled into the Weather Research and Forecasting (WRF) model. Semi-idealized three-dimensional simulations driven by temperature and moisture profiles acquired by a radiosonde released in the pre-convection environment at 1200 UTC in Morris, Oklahoma show that each scheme produced a squall line with features broadly consistent with the observed storm characteristics. However, substantial differences in the details of the simulated dynamic and thermodynamic structure are evident. These differences are attributed to different algorithms and numerical representations of microphysical processes, assumptions of the hydrometeor processes and properties, especially ice particle mass, density, and terminal velocity relationships with size, and the resulting interactions between the microphysics, cold pool, and dynamics. This study shows that different bin microphysics schemes, designed to be conceptually more realistic and thus arguably more accurate than bulk microphysics schemes, still simulate a wide spread of microphysical, thermodynamic, and dynamic characteristics of a squall line, qualitatively similar to the spread of squall line characteristics using various bulk schemes. Future work may focus on improving the representation of ice particle properties in bin schemes to reduce this uncertainty and using the similar assumptions for all schemes to isolate the impact of physics from numerics.

  6. Simulation of Rossi-α method with analog Monte-Carlo method

    International Nuclear Information System (INIS)

    Lu Yuzhao; Xie Qilin; Song Lingli; Liu Hangang

    2012-01-01

    An analog Monte Carlo code for simulating the Rossi-α method, based on Geant4, was developed. The prompt neutron decay constants α of six metal uranium configurations at Oak Ridge National Laboratory were calculated. α was also calculated by the burst-neutron method, and the result was consistent with that of the Rossi-α method. There is a difference between the results of the analog Monte Carlo simulation and the experiment, and the reason for the difference is the gaps between uranium layers. The influence of the gaps decreases as the sub-criticality deepens. The relative difference between the results of the analog Monte Carlo simulation and the experiment changes from 19% to 0.19%. (authors)
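    In a Rossi-α analysis, the histogram of counts at time t after a trigger count is fitted to c(t) = A + B·exp(-αt), where the flat term comes from accidental pairs and the decaying term from correlated fission chains. A sketch of such a fit on synthetic data (not the ORNL measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Rossi-alpha fit: c(t) = A + B * exp(-alpha * t) on a synthetic histogram.
def rossi_alpha(t_us, a, b, alpha_per_us):
    return a + b * np.exp(-alpha_per_us * t_us)

rng = np.random.default_rng(0)
t_us = np.linspace(0.0, 100.0, 200)              # time-bin centers [µs]
counts = rng.poisson(rossi_alpha(t_us, 50.0, 400.0, 0.08))

popt, _ = curve_fit(rossi_alpha, t_us, counts, p0=(40.0, 300.0, 0.05))
print(f"prompt neutron decay constant: {popt[2]:.3f} per µs")
```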

  7. Simulation of temperature distribution in tumor Photothermal treatment

    Science.gov (United States)

    Zhang, Xiyang; Qiu, Shaoping; Wu, Shulian; Li, Zhifang; Li, Hui

    2018-02-01

    Light transport in biological tissue and the optical properties of biological tissue are important research topics in biomedical photonics, with great theoretical and practical significance for medical diagnosis and light-based therapy of disease. In this paper, a temperature feedback controller is presented for monitoring photothermal treatment in real time. Two-dimensional Monte Carlo (MC) simulation and the diffusion approximation were compared and analyzed. The results demonstrate that the diffusion approximation with extrapolated boundary conditions, solved by the finite element method, is a good approximation to the MC simulation. Then, to minimize thermal damage, real-time temperature monitoring with a proportional-integral-derivative (PID) controller was evaluated for the photothermal treatment process.
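    A discrete PID temperature loop of the kind referred to above can be sketched in a few lines; the plant model (first-order heating and Newtonian cooling) and the gains are assumptions for illustration, not the controller of the paper:

```python
def simulate_pid(setpoint=45.0, t_ambient=37.0, dt=0.1, steps=600,
                 kp=2.0, ki=0.5, kd=0.1):
    """Discrete PID loop driving laser power; toy first-order tissue model."""
    temp, integral = t_ambient, 0.0
    prev_err = setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        power = max(0.0, kp * err + ki * integral + kd * deriv)  # actuator
        # plant: heating proportional to power, Newtonian cooling to ambient
        temp += (0.05 * power - 0.02 * (temp - t_ambient)) * dt
        prev_err = err
    return temp

print(f"tissue temperature after 60 s: {simulate_pid():.1f} °C")
```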

  8. Construction of the quantitative analysis environment using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Shirakawa, Seiji; Ushiroda, Tomoya; Hashimoto, Hiroshi; Tadokoro, Masanori; Uno, Masaki; Tsujimoto, Masakazu; Ishiguro, Masanobu; Toyama, Hiroshi

    2013-01-01

    A thoracic phantom image was acquired in axial sections to construct the source and density maps for Monte Carlo (MC) simulation. The phantom was a Heart/Liver Type HL (Kyoto Kagaku Co., Ltd.); the single photon emission CT (SPECT)/CT machine was a Symbia T6 (Siemens) with the LMEGP (low-medium energy general purpose) collimator. The maps were constructed from CT images with in-house software written in Visual Studio C# (Microsoft). The SIMIND (simulation of imaging nuclear detectors) code was used for the MC simulation, the Prominence processor (Nihon Medi-Physics) for filter processing and image reconstruction, and a DELL Precision T7400 environment for all image processing. For the actual experiment, the phantom was given 15 MBq of 99mTc in its myocardial portion, assuming 2% uptake of a 740 MBq dose, and the SPECT image was acquired and reconstructed with a Butterworth filter and the filtered back-projection method. CT images were similarly obtained in 0.3 mm thick slices, filed in DICOM (Digital Imaging and Communications in Medicine) format, and then processed for input to SIMIND for mapping the source and density. Physical and measurement factors were examined in ideal images by sequential exclusion and simulation of factors such as attenuation, scattering, spatial resolution deterioration and statistical fluctuation. The gamma energy spectrum, SPECT projections and reconstructed images given by the simulation were found to agree well with the actual data, and the precision of the MC simulation was confirmed. Physical and measurement factors were found to be individually evaluable, suggesting the usefulness of the simulation for assessing the precision of their correction. (T.T.)

  9. The Effects of Spatial Diversity and Imperfect Channel Estimation on Wideband MC-DS-CDMA and MC-CDMA

    Science.gov (United States)

    2009-10-01

    In our previous work, we compared the theoretical bit error rates of multi-carrier direct sequence code division multiple access (MC-DS-CDMA) and ... consider only those cases where MC-CDMA has higher frequency diversity than MC-DS-CDMA. Since increases in diversity yield diminishing gains, we conclude

  10. The heart of McDonaldization: the perspective of the other.

    OpenAIRE

    Brimm, Morissa

    2010-01-01

    This dissertation aims to investigate the effect of organisational culture change on job satisfaction and work motivation from the perspective of the employees. McDonald’s Restaurants have been chosen due to having embraced organisational change in the last five years. One particular branch has been chosen – Castle Marina McDonald’s, Nottingham – for analysis. Triangulation was used to collect the data needed for my research; methods used were both qualitative and quantitative to provide rich...

  11. Monte Carlo-based simulation of dynamic jaws tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S. [Department of Molecular Imaging, Radiotherapy and Oncology, Universite Catholique de Louvain, 54 Avenue Hippocrate, 1200 Brussels, Belgium and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 (United States); 21st Century Oncology, 1240 D'Onofrio, Madison, Wisconsin 53719 (United States); TomoTherapy Inc., 1240 Deming Way, Madison, Wisconsin 53717 and Department of Medical Physics, University of Wisconsin-Madison, Madison, Wisconsin 53705 (United States); Department of Radiotherapy and Oncology, Universite Catholique de Louvain, St-Luc University Hospital, 10 Avenue Hippocrate, 1200 Brussels (Belgium)

    2011-09-15

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC-based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis).

  12. Monte Carlo-based simulation of dynamic jaws tomotherapy

    International Nuclear Information System (INIS)

    Sterpin, E.; Chen, Y.; Chen, Q.; Lu, W.; Mackie, T. R.; Vynckier, S.

    2011-01-01

    Purpose: Original TomoTherapy systems may involve a trade-off between conformity and treatment speed, the user being limited to three slice widths (1.0, 2.5, and 5.0 cm). This could be overcome by allowing the jaws to define arbitrary fields, including very small slice widths (<1 cm), which are challenging for a beam model. The aim of this work was to incorporate the dynamic jaws feature into a Monte Carlo (MC) model called TomoPen, based on the MC code PENELOPE, previously validated for the original TomoTherapy system. Methods: To keep the general structure of TomoPen and its efficiency, the simulation strategy introduces several techniques: (1) weight modifiers to account for any jaw settings using only the 5 cm phase-space file; (2) a simplified MC-based model called FastStatic to compute the modifiers faster than pure MC; (3) actual simulation of dynamic jaws. Weight modifiers computed with both FastStatic and pure MC were compared. Dynamic jaws simulations were compared with the convolution/superposition (C/S) of TomoTherapy in the "cheese" phantom for a plan with two targets longitudinally separated by a gap of 3 cm. Optimization was performed in two modes: asymmetric jaws-constant couch speed ("running start stop," RSS) and symmetric jaws-variable couch speed ("symmetric running start stop," SRSS). Measurements with EDR2 films were also performed for RSS for the formal validation of TomoPen with dynamic jaws. Results: Weight modifiers computed with FastStatic were equivalent to pure MC within statistical uncertainties (0.5% for three standard deviations). Excellent agreement was achieved between TomoPen and C/S for both asymmetric jaw opening/constant couch speed and symmetric jaw opening/variable couch speed, with deviations well within 2%/2 mm. For the RSS procedure, agreement between C/S and measurements was within 2%/2 mm for 95% of the points and 3%/3 mm for 98% of the points, where dose is greater than 30% of the prescription dose (gamma analysis).

  13. McCarthy variations in a modal key

    NARCIS (Netherlands)

    van Benthem, J.

    2011-01-01

    We take a fresh look at some major strands in John McCarthy's work from a logician's perspective. First, we re-analyze circumscription in dynamic logics of belief change under hard and soft information. Next, we re-analyze the regression method in the Situation Calculus in terms of update axioms for

  14. The McDonaldization of Higher Education.

    Science.gov (United States)

    Hayes, Dennis, Ed.; Wynyard, Robin, Ed.

    The essays in this collection discuss the future of the university in the context of the "McDonaldization" of society and of academia. The idea of McDonaldization, a term coined by G. Ritzer (1998), provides a tool for looking at the university and its inevitable changes. The chapters are: (1) "Enchanting McUniversity: Toward a…

  15. Fast "coalescent" simulation

    Directory of Open Access Journals (Sweden)

    Wall Jeff D

    2006-03-01

    Background: The amount of genome-wide molecular data is increasing rapidly, as is interest in developing methods appropriate for such data. There is a consequent increasing need for methods that are able to efficiently simulate such data. In this paper we implement the sequentially Markovian coalescent algorithm described by McVean and Cardin and present a further modification to that algorithm which slightly improves the closeness of the approximation to the full coalescent model. The algorithm ignores a class of recombination events known to affect the behavior of the genealogy of the sample, but which do not appear to affect the behavior of generated samples to any substantial degree. Results: We show that our software is able to simulate large chromosomal regions, such as those appropriate in a consideration of genome-wide data, in a way that is several orders of magnitude faster than existing coalescent algorithms. Conclusion: This algorithm provides a useful resource for those needing to simulate large quantities of data for chromosomal-length regions using an approach that is much more efficient than traditional coalescent models.
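
    The full model that such algorithms approximate is easy to sketch: under the standard (Kingman) coalescent, while k lineages remain they coalesce at rate k(k-1)/2 in units of 2N generations. The few lines below simulate that directly (this is the plain coalescent, not the authors' sequentially Markovian code):

        import numpy as np

        def coalescent_times(n, rng):
            """Successive coalescence times for a sample of n lineages."""
            times, t = [], 0.0
            for k in range(n, 1, -1):
                rate = k * (k - 1) / 2.0           # pairwise coalescence rate
                t += rng.exponential(1.0 / rate)
                times.append(t)
            return times                            # times[-1] is T_MRCA

        rng = np.random.default_rng(1)
        tmrca = [coalescent_times(20, rng)[-1] for _ in range(10000)]
        print("mean T_MRCA:", np.mean(tmrca))       # theory: 2*(1 - 1/20) = 1.9

    The comparison against the closed-form expectation E[T_MRCA] = 2(1 - 1/n) gives a quick sanity check of the sampler.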

  16. Monte Carlo simulations of Higgs-boson production at the LHC with the KrkNLO method

    Energy Technology Data Exchange (ETDEWEB)

    Jadach, S.; Skrzypek, M. [Polish Academy of Sciences, Institute of Nuclear Physics, Krakow (Poland); Nail, G. [University of Manchester, Particle Physics Group, School of Physics and Astronomy, Manchester (United Kingdom); Karlsruhe Institute of Technology, Institute for Theoretical Physics, Karlsruhe (Germany); Placzek, W. [Jagiellonian University, Marian Smoluchowski Institute of Physics, Krakow (Poland); Sapeta, S.; Siodmok, A. [Polish Academy of Sciences, Institute of Nuclear Physics, Krakow (Poland); Theoretical Physics Department, CERN, Geneva (Switzerland)

    2017-03-15

    We present numerical tests and predictions of the KrkNLO method for matching NLO QCD corrections to hard processes with LO parton-shower Monte Carlo generators (NLO+PS). This method was described in detail in our previous publications, where it was also compared with other NLO+PS matching approaches (MC@NLO and POWHEG) as well as fixed-order NLO and NNLO calculations. Here we concentrate on presenting numerical results (cross sections and distributions) for Z/γ* (Drell-Yan) and Higgs-boson production processes at the LHC. The Drell-Yan process is used mainly to validate the KrkNLO implementation in the Herwig 7 program with respect to the previous implementation in Sherpa. We also show predictions for this process with the new, complete, MC-scheme parton distribution functions and compare them with our previously published results. Then we present the first results of the KrkNLO method for Higgs production in gluon-gluon fusion at the LHC and compare them with MC@NLO and POWHEG predictions from Herwig 7, fixed-order results from HNNLO and a resummed calculation from HqT, as well as with experimental data from the ATLAS collaboration. (orig.)

  17. Is McLeod's Patent Pending Naturoptic Method for Restoring Healthy Vision Easy and Verifiable?

    Science.gov (United States)

    Niemi, Paul; McLeod, David; McLeod, Roger

    2006-10-01

    RDM asserts that he and people he has trained can assign visual tasks from standard vision assessment charts, or better replacements, proceeding through incremental changes and such rapid improvements that healthy vision can be restored. McLeod predicts that in visual tasks with pupil diameter changes, wavelengths change proportionally. A longer, quasimonochromatic wavelength interval is coincident with foveal cones and rods. A shorter, partially overlapping interval separately aligns with extrafoveal cones. Wavelengths follow the Airy disk radius formula. Niemi can evaluate whether it is true that visual health merely requires triggering and facilitating the demands of possibly overridden feedback signals. The method and process are designed so that potential Naturopathic and other select graduate students should be able to self-fund their higher-level educations from preferential franchising arrangements of earnings while they are in certain programs.

  18. The MC21 Monte Carlo Transport Code

    International Nuclear Information System (INIS)

    Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H

    2007-01-01

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or "tool of last resort" and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.

  19. McSnow: A Monte-Carlo Particle Model for Riming and Aggregation of Ice Particles in a Multidimensional Microphysical Phase Space

    Science.gov (United States)

    Brdar, S.; Seifert, A.

    2018-01-01

    We present a novel Monte-Carlo ice microphysics model, McSnow, to simulate the evolution of ice particles due to deposition, aggregation, riming, and sedimentation. The model is an application and extension of the super-droplet method of Shima et al. (2009) to the more complex problem of rimed ice particles and aggregates. For each individual super-particle, the ice mass, rime mass, rime volume, and the number of monomers are predicted establishing a four-dimensional particle-size distribution. The sensitivity of the model to various assumptions is discussed based on box model and one-dimensional simulations. We show that the Monte-Carlo method provides a feasible approach to tackle this high-dimensional problem. The largest uncertainty seems to be related to the treatment of the riming processes. This calls for additional field and laboratory measurements of partially rimed snowflakes.

  20. Study on influences of TiN capping layer on time-dependent dielectric breakdown characteristic of ultra-thin EOT high-k metal gate NMOSFET with kMC TDDB simulations

    International Nuclear Information System (INIS)

    Xu Hao; Yang Hong; Luo Wei-Chun; Xu Ye-Feng; Wang Yan-Rong; Tang Bo; Wang Wen-Wu; Qi Lu-Wei; Li Jun-Feng; Yan Jiang; Zhu Hui-Long; Zhao Chao; Chen Da-Peng; Ye Tian-Chun

    2016-01-01

    The thickness effect of the TiN capping layer on the time-dependent dielectric breakdown (TDDB) characteristic of an ultra-thin EOT high-k metal-gate NMOSFET is investigated in this paper. Based on experimental results, it is found that the device with a thicker TiN layer has a more promising reliability characteristic than that with a thinner TiN layer. Charge pumping measurements and secondary ion mass spectroscopy (SIMS) analysis indicate that the sample with the thicker TiN layer introduces more Cl passivation at the IL/Si interface and exhibits a lower interface trap density. In addition, the influence of the interface-to-bulk trap density ratio N_it/N_ot is studied by TDDB simulations combining percolation theory with the kinetic Monte Carlo (kMC) method. The lifetime reduction and Weibull slope lowering are explained by interface trap effects for TiN capping layers with different thicknesses. (paper)
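
    The interplay of stochastic trap generation and percolation testing can be caricatured in a few lines; the grid size, generation rate, and nearest-neighbour percolation rule below are invented stand-ins for the paper's calibrated kMC model:

        import numpy as np

        def time_to_breakdown(nx=20, nz=5, rate_per_cell=1.0, rng=None):
            """Traps appear with exponential waiting times; breakdown occurs when
            trapped cells percolate from gate (z=0) to substrate (z=nz-1)."""
            rng = rng or np.random.default_rng()
            occ = np.zeros((nx, nz), dtype=bool)
            t = 0.0
            while True:
                empty = np.argwhere(~occ)
                t += rng.exponential(1.0 / (rate_per_cell * len(empty)))
                i, j = empty[rng.integers(len(empty))]
                occ[i, j] = True
                stack = [(a, 0) for a in range(nx) if occ[a, 0]]   # DFS from the gate side
                seen = set(stack)
                while stack:
                    x, z = stack.pop()
                    if z == nz - 1:
                        return t                     # percolating path => breakdown
                    for dx, dz in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        p = ((x + dx) % nx, z + dz)  # periodic laterally
                        if 0 <= p[1] < nz and occ[p] and p not in seen:
                            seen.add(p)
                            stack.append(p)

        rng = np.random.default_rng(11)
        tbd = [time_to_breakdown(rng=rng) for _ in range(200)]
        print("median time-to-breakdown (a.u.): %.3f" % np.median(tbd))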

  1. Comparison of film measurements and Monte Carlo simulations of dose delivered with very high-energy electron beams in a polystyrene phantom

    Energy Technology Data Exchange (ETDEWEB)

    Bazalova-Carter, Magdalena; Liu, Michael; Palma, Bianey; Koong, Albert C.; Maxim, Peter G., E-mail: Peter.Maxim@Stanford.edu, E-mail: BWLoo@Stanford.edu; Loo, Billy W., E-mail: Peter.Maxim@Stanford.edu, E-mail: BWLoo@Stanford.edu [Department of Radiation Oncology, Stanford University, Stanford, California 94305-5847 (United States); Dunning, Michael; McCormick, Doug; Hemsing, Erik; Nelson, Janice; Jobe, Keith; Colby, Eric; Tantawi, Sami; Dolgashev, Valery [SLAC National Accelerator Laboratory, Menlo Park, California 94025 (United States)

    2015-04-15

    Purpose: To measure radiation dose in a water-equivalent medium from very high-energy electron (VHEE) beams and make comparisons to Monte Carlo (MC) simulation results. Methods: Dose in a polystyrene phantom delivered by an experimental VHEE beam line was measured with Gafchromic films for three 50 MeV and two 70 MeV Gaussian beams of 4.0–6.9 mm FWHM and compared to corresponding MC-simulated dose distributions. MC dose in the polystyrene phantom was calculated with the EGSnrc/BEAMnrc and DOSXYZnrc codes based on the experimental setup. Additionally, the effect of 2% beam energy measurement uncertainty and possible non-zero beam angular spread on MC dose distributions was evaluated. Results: MC simulated percentage depth dose (PDD) curves agreed with measurements within 4% for all beam sizes at both 50 and 70 MeV VHEE beams. Central axis PDD at 8 cm depth ranged from 14% to 19% for the 5.4–6.9 mm 50 MeV beams and it ranged from 14% to 18% for the 4.0–4.5 mm 70 MeV beams. MC simulated relative beam profiles of regularly shaped Gaussian beams evaluated at depths of 0.64 to 7.46 cm agreed with measurements to within 5%. A 2% beam energy uncertainty and 0.286° beam angular spread corresponded to a maximum 3.0% and 3.8% difference in depth dose curves of the 50 and 70 MeV electron beams, respectively. Absolute dose differences between MC simulations and film measurements of regularly shaped Gaussian beams were between 10% and 42%. Conclusions: The authors demonstrate that relative dose distributions for VHEE beams of 50–70 MeV can be measured with Gafchromic films and modeled with Monte Carlo simulations to an accuracy of 5%. The reported absolute dose differences likely caused by imperfect beam steering and subsequent charge loss revealed the importance of accurate VHEE beam control and diagnostics.

  2. Improving the capability of an integrated CA-Markov model to simulate spatio-temporal urban growth trends using an Analytical Hierarchy Process and Frequency Ratio

    Science.gov (United States)

    Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan

    2017-07-01

    The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, the three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps for 2020 and 2030 were created. The validation findings confirm that integrating the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process improved the simulation capability of the CA-MC model. This study provides a novel approach for improving the CA-MC model based on FR, which will give powerful support to planners and decision-makers in the development of future sustainable urban planning.
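
    The AHP weighting step is straightforward to reproduce: the criterion weights are the normalized principal eigenvector of a pairwise comparison matrix. The 3x3 judgment matrix below is made up for illustration, not the matrix elicited for Seremban:

        import numpy as np

        A = np.array([[1.0,  3.0,  5.0],
                      [1/3., 1.0,  2.0],
                      [1/5., 1/2., 1.0]])            # pairwise importance judgments

        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()                                 # criterion weights

        ci = (vals.real[k] - len(A)) / (len(A) - 1)  # Saaty consistency index
        print("weights:", np.round(w, 3), " CR = %.3f" % (ci / 0.58))  # RI = 0.58 for 3x3

    By convention, a consistency ratio (CR) below 0.1 indicates acceptably consistent judgments.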

  3. Experimental study of biotin-avidin pretargeting technique for anti-CEA McAb radioimmunoimaging

    International Nuclear Information System (INIS)

    Sun Jianzhong; Zhu Chengmo; Guan Liang; Li Biao; Zhang Jixian; Shi Ailan; Zhang Suyin

    1996-01-01

    The biotin-avidin pretargeting technique was used to improve the diagnostic efficacy of anti-CEA McAb radioimmunoimaging. CEA McAb was conjugated with biotin (B-McAb), streptavidin (SA) was labeled with 131I (131I-SA), and DTPA-biotin with 111In (111In-DTPA-B). Nude mice bearing experimental human colonic tumors were used. Two-step method: B-McAb was preinjected, followed by 131I-SA 48 h later; at 24, 48, 96, and 120 h postinjection, γ-imaging and biodistribution were studied. Three-step method: B-McAb was preinjected, followed by cold SA 24 h later and 111In-DTPA-B another 24 h later; at 2, 6, 24, and 48 h postinjection, γ-imaging and biodistribution were likewise studied. Two-step method: the T/NT of all organs in the experimental group was significantly increased compared with controls. The blood T/NT in the experimental and control groups at 24 and 120 h was 1.11:0.42 and 8.58:3.51, respectively. The %ID/g in all organs slightly decreased compared with the direct-labeling group. In γ-imaging, radioactivity had accumulated at the tumor site as early as 24 h, while the tumor was only slightly visualized or non-visualized in controls. Three-step method: in the experimental group the blood T/NT reached 4.19 at 2 h, whereas it was <1.37 at every phase in controls; the T/NT of all organs was also higher in the experimental group than in controls. The tumor %ID/g in the experimental group was 9.72% at 2 h and 3.65% at 48 h, whereas the %ID/g in controls at all phases was <3.07%. The tumor was clearly visualized at 2 h and even more clearly at 48 h in γ-imaging. In controls, the tumor was also slightly visualized at an early stage, but faded away later on. The biotin-avidin pretargeting technique can elevate the T/NT ratio and decrease the blood background, and early imaging was obtained with better image quality.

  4. Real-time hybrid simulation using the convolution integral method

    International Nuclear Information System (INIS)

    Kim, Sung Jig; Christenson, Richard E; Wojtkiewicz, Steven F; Johnson, Erik A

    2011-01-01

    This paper proposes a real-time hybrid simulation method that will allow complex systems to be tested within the hybrid test framework by employing the convolution integral (CI) method. The proposed CI method is potentially transformative for real-time hybrid simulation. The CI method can allow real-time hybrid simulation to be conducted regardless of the size and complexity of the numerical model and for numerical stability to be ensured in the presence of high frequency responses in the simulation. This paper presents the general theory behind the proposed CI method and provides experimental verification of the proposed method by comparing the CI method to the current integration time-stepping (ITS) method. Real-time hybrid simulation is conducted in the Advanced Hazard Mitigation Laboratory at the University of Connecticut. A seismically excited two-story shear frame building with a magneto-rheological (MR) fluid damper is selected as the test structure to experimentally validate the proposed method. The building structure is numerically modeled and simulated, while the MR damper is physically tested. Real-time hybrid simulation using the proposed CI method is shown to provide accurate results
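
    The core of the CI idea, obtaining the linear substructure's response by convolving its unit-impulse response with the force history, can be sketched for a single-degree-of-freedom oscillator (the parameters are arbitrary; the actual method operates on measured interface forces in real time):

        import numpy as np

        dt = 0.001
        t = np.arange(0.0, 5.0, dt)
        m, c, k = 1.0, 0.5, 400.0                    # mass, damping, stiffness
        wn = np.sqrt(k / m)
        zeta = c / (2.0 * np.sqrt(k * m))
        wd = wn * np.sqrt(1.0 - zeta**2)

        # unit-impulse response h(t) of the damped SDOF oscillator
        h = np.exp(-zeta * wn * t) * np.sin(wd * t) / (m * wd)

        force = np.sin(2 * np.pi * 2.0 * t)          # stand-in for a measured force
        x = np.convolve(force, h)[: t.size] * dt     # Duhamel: x(t) = (f * h)(t)
        print("peak displacement: %.4e" % np.abs(x).max())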

  5. Nested MC-Based Risk Measurement of Complex Portfolios: Acceleration and Energy Efficiency

    Directory of Open Access Journals (Sweden)

    Sascha Desmettre

    2016-10-01

    Risk analysis and management currently have a strong presence in financial institutions, where high performance and energy efficiency are key requirements for acceleration systems, especially when it comes to intraday analysis. In this regard, we approach the estimation of the widely employed portfolio risk metrics value-at-risk (VaR) and conditional value-at-risk (cVaR) by means of nested Monte Carlo (MC) simulations. We do so by combining theory and software/hardware implementation. This allows us for the first time to investigate their performance on heterogeneous compute systems and across different compute platforms, namely central processing unit (CPU), many integrated core (MIC) architecture XeonPhi, graphics processing unit (GPU), and field-programmable gate array (FPGA). To this end, the OpenCL framework is employed to generate portable code, and the size of the simulations is scaled in order to evaluate variations in performance. Furthermore, we assess different parallelization schemes, and the targeted platforms are evaluated and compared in terms of runtime and energy efficiency. Our implementation also allowed us to derive a new algorithmic optimization regarding the generation of the required random number sequences. Moreover, we provide specific guidelines on how to properly handle these sequences in portable code, and on how to efficiently implement nested MC-based VaR and cVaR simulations on heterogeneous compute systems.
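
    A bare-bones nested-MC estimator of VaR/cVaR looks like the following; the toy portfolio (a single European call), horizons, and sample sizes are all invented for illustration:

        import numpy as np

        rng = np.random.default_rng(7)
        S0, K, r, sigma = 100.0, 100.0, 0.01, 0.2
        Th, Tm = 10.0 / 250.0, 0.5                   # risk horizon, option maturity (yr)
        n_outer, n_inner = 5000, 200

        # outer loop: risk-factor scenarios at the horizon
        S_h = S0 * np.exp((r - 0.5 * sigma**2) * Th
                          + sigma * np.sqrt(Th) * rng.standard_normal(n_outer))

        # inner loop: MC revaluation of the call in each scenario
        tau = Tm - Th
        z = rng.standard_normal((n_outer, n_inner))
        S_T = S_h[:, None] * np.exp((r - 0.5 * sigma**2) * tau
                                    + sigma * np.sqrt(tau) * z)
        V_h = np.exp(-r * tau) * np.maximum(S_T - K, 0.0).mean(axis=1)

        V_0 = np.exp(-r * Th) * V_h.mean()           # crude time-0 value
        loss = np.sort(V_0 - np.exp(-r * Th) * V_h)  # discounted loss distribution
        alpha = 0.99
        var = loss[int(alpha * n_outer)]
        cvar = loss[int(alpha * n_outer):].mean()
        print("99%% VaR = %.3f, cVaR = %.3f" % (var, cvar))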

  6. The MC4 receptor and control of appetite

    NARCIS (Netherlands)

    Adan, R. A. H.; Tiesjema, B.; Hillebrand, J. J. G.; La Fleur, S. E.; Kas, M. J. H.; de Krom, M.

    2006-01-01

    Mutations in the human melanocortin (MC)4 receptor have been associated with obesity, which underscores the relevance of this receptor as a drug target to treat obesity. Infusion of MC4R agonists decreases food intake, whereas inhibition of MC receptor activity by infusion of an MC receptor antagonist increases food intake.

  7. Rapid simultaneous high-resolution mapping of myelin water fraction and relaxation times in human brain using BMC-mcDESPOT.

    Science.gov (United States)

    Bouhrara, Mustapha; Spencer, Richard G

    2017-02-15

    A number of central nervous system (CNS) diseases exhibit changes in myelin content and in the magnetic resonance longitudinal (T1) and transverse (T2) relaxation times, which therefore represent important biomarkers of CNS pathology. Among the methods applied for measurement of myelin water fraction (MWF) and relaxation times, the multicomponent driven equilibrium single pulse observation of T1 and T2 (mcDESPOT) approach is of particular interest. mcDESPOT permits whole-brain mapping of multicomponent T1 and T2, with data acquisition accomplished within a clinically realistic acquisition time. Unfortunately, previous studies have indicated the limited performance of mcDESPOT in the modest signal-to-noise range of high-resolution mapping, which is required for the depiction of small structures and to reduce partial volume effects. Recently, we showed that a new Bayesian Monte Carlo (BMC) analysis substantially improved determination of MWF from mcDESPOT imaging data. However, our previous study was limited in that it did not discuss determination of relaxation times. Here, we extend the BMC analysis to the simultaneous determination of whole-brain MWF and relaxation times using the two-component mcDESPOT signal model. Simulation analyses and in-vivo human brain studies indicate the overall greater performance of this approach compared to the stochastic region contraction (SRC) algorithm conventionally used to derive parameter estimates from mcDESPOT data. SRC estimates of the transverse relaxation time of the long-T2 fraction, T2,l, and the longitudinal relaxation time of the short-T1 fraction, T1,s, clustered towards the lower and upper parameter search space limits, respectively, indicating failure of the fitting procedure. We demonstrate that this effect is absent in the BMC analysis. Our results also showed improved parameter estimation for BMC as compared to SRC for high-resolution mapping. Overall we find that the combination of BMC analysis

  8. Barbara McClintock, Jumping Genes, and Transposition

    Science.gov (United States)

    Barbara McClintock's remarkable life spanned the history of genetics in the twentieth century. This resource page collects educational material and additional information on McClintock, the honors she received (including the dedication of a Famous Scientist stamp), and her discovery of the jumping genes.

  9. Maiorana-McFarland class: Degree optimization and algebraic properties

    DEFF Research Database (Denmark)

    Pasalic, Enes

    2006-01-01

    In this paper, we consider a subclass of the Maiorana-McFarland class used in the design of resilient nonlinear Boolean functions. We show that these functions allow a simple modification so that resilient Boolean functions of maximum algebraic degree may be generated, instead of the suboptimized degree of functions in the extended Maiorana-McFarland (MM) class (nonlinear resilient functions F : GF(2)^n -> GF(2)^m derived from linear codes). We also show that in the Boolean case, the same subclass seems not to have an optimized algebraic immunity, hence not providing a maximum resistance in the original class. Preserving the high nonlinearity immanent to the original construction method, together with the degree optimization, gives in many cases functions with cryptographic properties superior to all previously known construction methods. This approach is then used to increase the algebraic immunity.

  10. Matrix method for acoustic levitation simulation.

    Science.gov (United States)

    Andrade, Marco A B; Perez, Nicolas; Buiochi, Flavio; Adamowski, Julio C

    2011-08-01

    A matrix method is presented for simulating acoustic levitators. A typical acoustic levitator consists of an ultrasonic transducer and a reflector. The matrix method is used to determine the potential for acoustic radiation force that acts on a small sphere in the standing wave field produced by the levitator. The method is based on the Rayleigh integral and it takes into account the multiple reflections that occur between the transducer and the reflector. The potential for acoustic radiation force obtained by the matrix method is validated by comparing the matrix method results with those obtained by the finite element method when using an axisymmetric model of a single-axis acoustic levitator. After validation, the method is applied in the simulation of a noncontact manipulation system consisting of two 37.9-kHz Langevin-type transducers and a plane reflector. The manipulation system allows control of the horizontal position of a small levitated sphere from -6 mm to 6 mm, which is done by changing the phase difference between the two transducers. The horizontal position of the sphere predicted by the matrix method agrees with the horizontal positions measured experimentally with a charge-coupled device camera. The main advantage of the matrix method is that it allows simulation of non-symmetric acoustic levitators without requiring much computational effort.
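
    The Rayleigh-integral building block of such a simulation reduces to a sum over discretized source elements. The toy version below radiates from a single 37.9-kHz piston and omits the transducer-reflector multiple reflections that the full matrix method accounts for (the piston radius and grid resolution are invented):

        import numpy as np

        f, c, rho = 37.9e3, 343.0, 1.2               # frequency (Hz), air sound speed, density
        k = 2 * np.pi * f / c
        a, u0 = 0.01, 1.0                            # piston radius (m), surface velocity

        n = 60                                       # discretize the piston face
        xs = np.linspace(-a, a, n)
        X, Y = np.meshgrid(xs, xs)
        mask = X**2 + Y**2 <= a**2
        dS = (xs[1] - xs[0])**2

        def pressure(pt):
            """Rayleigh integral: p = j*rho*c*k/(2*pi) * sum u0*exp(-j*k*R)/R * dS."""
            R = np.sqrt((pt[0] - X[mask])**2 + (pt[1] - Y[mask])**2 + pt[2]**2)
            return 1j * rho * c * k / (2 * np.pi) * np.sum(u0 * np.exp(-1j * k * R) / R) * dS

        print("on-axis |p| at z = 5 cm: %.3f Pa" % abs(pressure((0.0, 0.0, 0.05))))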

  11. ExMC Technology Watch

    Science.gov (United States)

    Krihak, M.; Barr, Y.; Watkins, S.; Fung, P.; McGrath, T.; Baumann, D.

    2012-01-01

    The Technology Watch (Tech Watch) project is a NASA endeavor conducted under the Human Research Program's (HRP) Exploration Medical Capability (ExMC) element, and focusing on ExMC technology gaps. The project involves several NASA centers, including the Johnson Space Center (JSC), Glenn Research Center (GRC), Ames Research Center (ARC), and the Langley Research Center (LaRC). The objective of Tech Watch is to identify emerging, high-impact technologies that augment current NASA HRP technology development efforts. Identifying such technologies accelerates the development of medical care and research capabilities for the mitigation of potential health issues encountered during human space exploration missions. The aim of this process is to leverage technologies developed by academia, industry and other government agencies and to identify the effective utilization of NASA resources to maximize the HRP return on investment. The establishment of collaborations with these entities is beneficial to technology development, assessment and/or insertion and further NASA's goal to provide a safe and healthy environment for human exploration. In 2011, the major focus areas for Tech Watch included information dissemination, education outreach and public accessibility to technology gaps and gap reports. The dissemination of information was accomplished through site visits to research laboratories and/or companies, and participation at select conferences where Tech Watch objectives and technology gaps were presented. Presentation of such material provided researchers with insights on NASA ExMC needs for space exploration and an opportunity to discuss potential areas of common interest. The second focus area, education outreach, was accomplished via two mechanisms. First, several senior student projects, each related to an ExMC technology gap, were sponsored by the various NASA centers. These projects presented ExMC related technology problems firsthand to collegiate laboratories

  12. Parallel Algorithms for Monte Carlo Particle Transport Simulation on Exascale Computing Architectures

    Science.gov (United States)

    Romano, Paul Kollath

    Monte Carlo particle transport methods are being considered as a viable option for high-fidelity simulation of nuclear reactors. While Monte Carlo methods offer several potential advantages over deterministic methods, there are a number of algorithmic shortcomings that would prevent their immediate adoption for full-core analyses. In this thesis, algorithms are proposed both to ameliorate the degradation in parallel efficiency typically observed for large numbers of processors and to offer a means of decomposing large tally data that will be needed for reactor analysis. A nearest-neighbor fission bank algorithm was proposed and subsequently implemented in the OpenMC Monte Carlo code. A theoretical analysis of the communication pattern shows that the expected cost is O(√N) whereas traditional fission bank algorithms are O(N) at best. The algorithm was tested on two supercomputers, the Intrepid Blue Gene/P and the Titan Cray XK7, and demonstrated nearly linear parallel scaling up to 163,840 processor cores on a full-core benchmark problem. An algorithm for reducing network communication arising from tally reduction was analyzed and implemented in OpenMC. The proposed algorithm groups only particle histories on a single processor into batches for tally purposes; in doing so it prevents all network communication for tallies until the very end of the simulation. The algorithm was tested, again on a full-core benchmark, and shown to reduce network communication substantially. A model was developed to predict the impact of load imbalances on the performance of domain decomposed simulations. The analysis demonstrated that load imbalances in domain decomposed simulations arise from two distinct phenomena: non-uniform particle densities and non-uniform spatial leakage. The dominant performance penalty for domain decomposition was shown to come from these physical effects rather than insufficient network bandwidth or high latency. The model predictions were verified with

  13. An NPT Monte Carlo Molecular Simulation-Based Approach to Investigate Solid-Vapor Equilibrium: Application to Elemental Sulfur-H2S System

    KAUST Repository

    Kadoura, Ahmad Salim; Salama, Amgad; Sun, Shuyu; Sherik, Abdelmounam

    2013-01-01

    In this work, a method to estimate solid elemental sulfur solubility in pure gases and gas mixtures using Monte Carlo (MC) molecular simulation is proposed. The method is based on the isobaric-isothermal (NPT) ensemble and the Widom insertion technique.
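
    The Widom step itself is compact: the excess chemical potential follows from Boltzmann-averaging test-particle insertion energies, mu_ex = -kT ln<exp(-beta*dU)>. The sketch below (reduced Lennard-Jones units) uses a frozen random configuration as a stand-in for the NPT-MC snapshots an actual run would average over:

        import numpy as np

        rng = np.random.default_rng(3)
        N, L, beta = 100, 6.0, 1.0                   # particles, box length, 1/kT
        pos = rng.uniform(0.0, L, (N, 3))            # stand-in configuration

        def insertion_energy(trial, pos, L):
            d = pos - trial
            d -= L * np.round(d / L)                 # minimum-image convention
            r2 = np.maximum(np.sum(d * d, axis=1), 1e-12)   # r=0 guard; weight ~0 anyway
            inv6 = 1.0 / r2**3
            return np.sum(4.0 * (inv6**2 - inv6))    # Lennard-Jones energy of insertion

        boltz = [np.exp(-beta * insertion_energy(rng.uniform(0.0, L, 3), pos, L))
                 for _ in range(20000)]
        mu_ex = -np.log(np.mean(boltz)) / beta       # Widom estimator
        print("excess chemical potential (reduced units): %.3f" % mu_ex)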

  14. H1 Grid production tool for large scale Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lobodzinski, B; Wissing, Ch [DESY, Hamburg (Germany); Bystritskaya, E; Vorobiew, M [ITEP, Moscow (Russian Federation); Karbach, T M [University of Dortmund (Germany); Mitsyn, S [JINR, Moscow (Russian Federation); Mudrinic, M, E-mail: bogdan.lobodzinski@desy.d [VINS, Belgrad (Serbia)

    2010-04-01

    The H1 Collaboration at HERA has entered the period of high-precision analyses based on the final data sample. These analyses require massive production of simulated Monte Carlo (MC) events. The H1 MC framework (H1MC) is a software framework created by the H1 Collaboration for mass MC production on the LCG Grid infrastructure and on a local batch system. The aim of the tool is full automation of the MC production workflow, including management of the MC jobs on the Grid down to copying of the resulting files from the Grid to the H1 mass storage tape device. The H1 MC framework has a modular structure, delegating a specific task to each module, including tasks specific to the H1 experiment: automatic building of steering and input files, simulation of the H1 detector, reconstruction of particle tracks, and post-processing calculations. Each module provides data or functionality needed by other modules via a local database. The Grid jobs created for detector simulation and reconstruction from generated MC input files are fully independent and fault-tolerant on 32- and 64-bit LCG Grid architectures, and while running on the Grid they can be continuously monitored using the Relational Grid Monitoring Architecture (R-GMA) service. To monitor the full production chain and detect potential problems, regular checks of the job state are performed using the local database and the Service Availability Monitoring (SAM) framework. The improved stability of the system has resulted in a dramatic increase in the production rate, which exceeded two billion MC events in 2008.

  15. Performance Analysis of Wavelet Based MC-CDMA System with Implementation of Various Antenna Diversity Schemes

    OpenAIRE

    Islam, Md. Matiqul; Kabir, M. Hasnat; Ullah, Sk. Enayet

    2012-01-01

    The impact of using a wavelet-based technique on the performance of an MC-CDMA wireless communication system has been investigated. The system under study incorporates Walsh-Hadamard codes to discriminate the message signal for each individual user. A computer program written in Matlab is developed, and the simulation study is made with implementation of various antenna diversity schemes and fading (Rayleigh and Rician) channels. Computer simulation results demonstrate that the p...

  16. Detection of Echinococcus multilocularis by MC-PCR: evaluation of diagnostic sensitivity and specificity without gold standard

    Directory of Open Access Journals (Sweden)

    Helene Wahlström

    2016-03-01

    Introduction: A semi-automated magnetic capture probe-based DNA extraction and real-time PCR method (MC-PCR), allowing for more efficient large-scale surveillance of Echinococcus multilocularis occurrence, has been developed. The test sensitivity has previously been evaluated using the sedimentation and counting technique (SCT) as a gold standard. However, as the sensitivity of the SCT is not 1, the test characteristics of the MC-PCR were also evaluated using latent class analysis, a methodology that does not require a gold standard. Materials and methods: Test results (MC-PCR and SCT) from a previous evaluation of the MC-PCR, using 177 foxes shot in the spring (n=108) and autumn (n=69) of 2012 in high-prevalence areas in Switzerland, were used. Latent class analysis was used to estimate the test characteristics of the MC-PCR. Although it was not the primary aim of this study, estimates of the test characteristics of the SCT were also obtained. Results and discussion: This study showed that the sensitivity of the MC-PCR was 0.88 (95% posterior credible interval (PCI) 0.80–0.93), which was not significantly different from that of the SCT, 0.83 (95% PCI 0.76–0.88), currently considered the gold standard. The specificity of both tests was high: 0.98 (95% PCI 0.94–0.99) for the MC-PCR and 0.99 (95% PCI 0.99–1) for the SCT. In a previous study, using fox scats from a low-prevalence area, the specificity of the MC-PCR was higher, 0.999 (95% PCI 0.997–1). One reason for the lower estimate of the specificity in this study could be that the MC-PCR detects DNA from infected but non-infectious rodents eaten by foxes. When using the MC-PCR in low-prevalence areas or areas free from the parasite, a positive result in the MC-PCR should be regarded as a true positive. Conclusion: The sensitivity of the MC-PCR (0.88) was comparable to that of the SCT (0.83).

  17. Sensitivity of Cirrus and Mixed-phase Clouds to the Ice Nuclei Spectra in McRAS-AC: Single Column Model Simulations

    Science.gov (United States)

    Betancourt, R. Morales; Lee, D.; Oreopoulos, L.; Sud, Y. C.; Barahona, D.; Nenes, A.

    2012-01-01

    The salient features of mixed-phase and ice clouds in a GCM cloud scheme are examined using the ice formation parameterizations of Liu and Penner (LP) and Barahona and Nenes (BN). The performance of the LP and BN ice nucleation parameterizations was assessed in the GEOS-5 AGCM using the McRAS-AC cloud microphysics framework in single column mode. Four-dimensional assimilated data from the intensive observation period of the ARM TWP-ICE campaign were used to drive the fluxes and lateral forcing. Simulation experiments were established to test the impact of each parameterization on the resulting cloud fields. Three commonly used IN spectra were utilized in the BN parameterization to describe the availability of IN for heterogeneous ice nucleation. The results show large similarities in the cirrus cloud regime between all the schemes tested, in which ice crystal concentrations were within a factor of 10 regardless of the parameterization used. In mixed-phase clouds there are some persistent differences in cloud particle number concentration and size, as well as in cloud fraction, ice water mixing ratio, and ice water path. Contact freezing in the simulated mixed-phase clouds contributed to transferring liquid to ice efficiently, so that on average the clouds were fully glaciated at T ≈ 260 K, irrespective of the ice nucleation parameterization used. Comparison of simulated ice water path to available satellite-derived observations was also performed, finding that all the schemes tested with the BN parameterization predicted average values of IWP within ±15% of the observations.

  18. A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation

    Science.gov (United States)

    Byun, K.; Hamlet, A. F.

    2017-12-01

    There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (via the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50 yr). Using these T and P scenarios we simulate daily streamflow with the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and the non-stationary MC approach. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
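
    The super-ensemble construction can be sketched directly with scipy: give each year of the design lifespan its own GEV parameters (the linear drift in the location parameter below is purely illustrative) and generate many MC realizations of the lifetime maximum:

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        years = 50                                   # design lifespan
        loc = np.linspace(100.0, 130.0, years)       # drifting GEV location (e.g. mm/day)
        scale = np.full(years, 15.0)
        shape = -0.1                                 # scipy convention: c = -xi

        n_real = 10000                               # MC realizations of the lifespan
        annual = genextreme.rvs(shape, loc=loc, scale=scale,
                                size=(n_real, years), random_state=rng)
        lifetime_max = annual.max(axis=1)

        print("median lifetime maximum: %.1f" % np.percentile(lifetime_max, 50))
        print("99th-percentile annual extreme in the final year: %.1f"
              % genextreme.ppf(0.99, shape, loc=loc[-1], scale=scale[-1]))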

  19. Duplicating MC-15 Output with Python and MCNP

    Energy Technology Data Exchange (ETDEWEB)

    McSpaden, Alexander Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-23

    Two Python scripts have been written that process the output files of MCNP6 into a format that mimics the list-mode output of Los Alamos National Laboratory’s MC-15 and NPOD neutron detection systems. This report details the methods implemented in these scripts and instructions on their use.
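
    The report summary above does not spell out the file layouts, so the following is a purely hypothetical sketch of the conversion idea only: merge per-history detection times into a single sorted list-mode stream of absolute timestamps. File names, rates, and formats are all invented.

        import numpy as np

        rng = np.random.default_rng(2)
        n_hist = 1000
        start = np.sort(rng.uniform(0.0, 1.0, n_hist))         # source-event times (s)
        counts = rng.poisson(0.3, n_hist)                      # detections per history
        offsets = [rng.exponential(50e-6, c) for c in counts]  # die-away delays

        list_mode = np.sort(np.concatenate(
            [t0 + off for t0, off in zip(start, offsets)]))
        np.savetxt("list_mode.txt", list_mode, fmt="%.9f")     # one timestamp per line
        print("wrote %d list-mode events" % list_mode.size)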

  20. 2-d Simulations of Test Methods

    DEFF Research Database (Denmark)

    Thrane, Lars Nyholm

    2004-01-01

    One of the main obstacles for the further development of self-compacting concrete is to relate the fresh concrete properties to the form filling ability. Simulation of the form filling ability will therefore provide a powerful tool in reaching this goal. In this paper, a continuum mechanical approach is presented by showing initial results from 2-d simulations of the empirical test methods slump flow and L-box. This approach assumes a homogeneous material, which is expected to correspond to particle suspensions, e.g. concrete, when they remain stable. The simulations have been carried out using both a Newtonian and a Bingham model to characterise the rheological properties of the concrete. From the results, it is expected that both the slump flow and the L-box can be simulated quite accurately when the model is extended to 3-d and the concrete is characterised according to the Bingham model.
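
    For flow solvers, the Bingham law is commonly regularized so that the effective viscosity stays finite at vanishing shear rate; one standard choice is the Papanastasiou exponential form, sketched below with plausible self-compacting-concrete values (not the thesis' parameters):

        import numpy as np

        def bingham_effective_viscosity(gamma_dot, tau0=50.0, mu=100.0, m_reg=1000.0):
            """eta(gd) = mu + tau0*(1 - exp(-m*gd))/gd, with yield stress tau0 (Pa),
            plastic viscosity mu (Pa s), and regularization parameter m (s)."""
            gd = np.maximum(np.asarray(gamma_dot, dtype=float), 1e-12)
            return mu + tau0 * (1.0 - np.exp(-m_reg * gd)) / gd

        print(bingham_effective_viscosity([1e-4, 0.1, 1.0, 10.0]))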

  1. Multimagnetical simulations

    International Nuclear Information System (INIS)

    Hansmann, U.; Berg, B.A.; Florida State Univ., Tallahassee, FL; Neuhaus, T.

    1992-01-01

    We modified the recently proposed multicanonical MC algorithm for the case of a magnetic field driven order-order phase transition. We test this multimagnetic Monte Carlo algorithm for the D = 2 Ising model at β = 0.5 and simulate square lattices up to size 100 x 100. On these lattices with periodic boundary conditions it is possible to enhance the appearance of order-order interfaces during the simulation by many orders of magnitude as compared to the standard Monte Carlo simulation
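
    The multimagnetical idea can be condensed into a toy recursion on a small lattice: sample with magnetization-dependent weights w(M), then flatten the magnetization histogram via ln w <- ln w - ln H. This pedagogical sketch is far smaller than the 100 x 100 runs reported above:

        import numpy as np

        rng = np.random.default_rng(0)
        L, beta = 8, 0.5
        N = L * L
        spins = rng.choice([-1, 1], size=(L, L))
        logw = np.zeros(N + 1)                       # weight per magnetization bin (M+N)//2

        def local_field(s, i, j):
            return s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]

        M = int(spins.sum())
        for it in range(20):                         # weight-recursion iterations
            H = np.zeros(N + 1)
            for _ in range(20000):
                i, j = rng.integers(L, size=2)
                dE = 2 * spins[i, j] * local_field(spins, i, j)
                dM = -2 * spins[i, j]
                dlw = logw[(M + dM + N)//2] - logw[(M + N)//2]
                if np.log(rng.random()) < -beta * dE + dlw:   # Metropolis with weights
                    spins[i, j] *= -1
                    M += dM
                H[(M + N)//2] += 1
            visited = H > 0
            logw[visited] -= np.log(H[visited])      # flatten the visited bins
        print("magnetization bins visited in final iteration:", int(visited.sum()))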

  2. Radiotelemetry to estimate stream life of adult chum salmon in the McNeil River, Alaska

    Science.gov (United States)

    Peirce, Joshua M.; Otis, Edward O.; Wipfli, Mark S.; Follmann, Erich H.

    2011-01-01

    Estimating salmon escapement is one of the fundamental steps in managing salmon populations. The area-under-the-curve (AUC) method is commonly used to convert periodic aerial survey counts into annual salmon escapement indices. The AUC requires obtaining accurate estimates of stream life (SL) for target species. Traditional methods for estimating SL (e.g., mark–recapture) are not feasible for many populations. Our objective in this study was to determine the average SL of chum salmon Oncorhynchus keta in the McNeil River, Alaska, through radiotelemetry. During the 2005 and 2006 runs, 155 chum salmon were fitted with mortality-indicating radio tags as they entered the McNeil River and tracked until they died. A combination of remote data loggers, aerial surveys, and foot surveys were used to determine the location of fish and provide an estimate of time of death. Higher predation resulted in tagged fish below McNeil Falls having a significantly shorter SL (12.6 d) than those above (21.9 d). The streamwide average SL (13.8 d) for chum salmon at the McNeil River was lower than the regionwide value (17.5 d) previously used to generate AUC indices of chum salmon escapement for the McNeil River. We conclude that radiotelemetry is an effective tool for estimating SL in rivers not well suited to other methods.
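
    Once SL is known, the AUC index itself is a one-line computation: integrate the periodic survey counts over time (fish-days) and divide by stream life. The survey counts below are invented; the 13.8 d stream life is this study's streamwide estimate:

        import numpy as np

        day = np.array([0, 7, 14, 21, 28, 35])       # survey days
        count = np.array([0, 1200, 5400, 4800, 1500, 0])
        fish_days = np.trapz(count, day)             # area under the count curve
        stream_life = 13.8                           # d, from this study
        print("escapement index: %.0f fish" % (fish_days / stream_life))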

  3. The NASA/industry Design Analysis Methods for Vibrations (DAMVIBS) program: McDonnell-Douglas Helicopter Company achievements

    Science.gov (United States)

    Toossi, Mostafa; Weisenburger, Richard; Hashemi-Kia, Mostafa

    1993-01-01

    This paper presents a summary of some of the work performed by McDonnell Douglas Helicopter Company under NASA Langley-sponsored rotorcraft structural dynamics program known as DAMVIBS (Design Analysis Methods for VIBrationS). A set of guidelines which is applicable to dynamic modeling, analysis, testing, and correlation of both helicopter airframes and a large variety of structural finite element models is presented. Utilization of these guidelines and the key features of their applications to vibration modeling of helicopter airframes are discussed. Correlation studies with the test data, together with the development and applications of a set of efficient finite element model checkout procedures, are demonstrated on a large helicopter airframe finite element model. Finally, the lessons learned and the benefits resulting from this program are summarized.

  4. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques, and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of its basic principles are discussed. Finally, future prospects for using Monte Carlo simulations to optimize neutron scattering experiments are discussed. (R.P.)

  5. Workshop for development of formal MC and A plans

    International Nuclear Information System (INIS)

    Erkkila, B.H.; Hatcher, C.R.; Scott, S.C.; Thomas, K.E.

    1998-01-01

    Upgrades to both physical protection and material controls and accountability (MC and A) are progressing at many nuclear facilities in the Russian Federation. In general, Russian facilities are well prepared to address issues related to physical protection. The infrastructure to plan and implement physical protection upgrades is already in place in Russia. The infrastructure to integrate new and existing MC and A capabilities is not as well developed. The authors' experience has shown that working with Russian facility management and technical personnel to draft an MC and A plan provides a way of moving MC and A upgrades forward. Los Alamos has developed a workshop for Russian nuclear facilities to facilitate the preparation of their facility MC and A plans. The workshops have been successful in bringing together facility management, safeguards specialists, and operations personnel to initiate the process of drafting these MC and A plans. The MC and A plans provide the technical basis for scheduling future MC and A upgrades at the facilities. Although facility MC and A plans are site-specific, the workshop can be tailored to guide the development of an MC and A plan for any Russian nuclear site.

  6. McRunjob: A High Energy Physics Workflow Planner for Grid Production Processing

    CERN Document Server

    Graham, G E; Bertram, I; Graham, Gregory E.; Evans, Dave; Bertram, Iain

    2003-01-01

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job d...

  7. McDonald’s Corporation - 2015 (MCD)

    OpenAIRE

    Alen Badal

    2017-01-01

    McDonald’s Corporation, 2015 is aiming to enlighten the “Experience of the Future” for consumers, with a special focus on the ‘younger’ generation. Beginning in 2015 and moving forward, McDonald’s has operationalized the functions of its strategy to better serve consumers with such offerings as trial-testing a build-your-burger strategy with the order being served at the table, known as the “Create Your Taste” program. The restaurant chain has introduced the all-day breakfast menu and ‘McPic...

  8. Evaluation of full-scope simulator testing methods

    International Nuclear Information System (INIS)

    Feher, M.P.; Moray, N.; Senders, J.W.; Biron, K.

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs

  9. Evaluation of full-scope simulator testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Feher, M P; Moray, N; Senders, J W; Biron, K [Human Factors North Inc., Toronto, ON (Canada)

    1995-03-01

    This report discusses the use of full scope nuclear power plant simulators in licensing examinations for Unit First Operators of CANDU reactors. The existing literature is reviewed, and an annotated bibliography of the more important sources provided. Since existing methods are judged inadequate, conceptual bases for designing a system for licensing are discussed, and a method proposed which would make use of objective scoring methods based on data collection in full-scope simulators. A field trial of such a method is described. The practicality of such a method is critically discussed and possible advantages of subjective methods of evaluation considered. (author). 32 refs., 1 tab., 4 figs.

  10. Bit Error Rate Analysis for MC-CDMA Systems in Nakagami-m Fading Channels

    Directory of Open Access Journals (Sweden)

    Li Zexian

    2004-01-01

    Multicarrier code division multiple access (MC-CDMA) is a promising technique that combines orthogonal frequency division multiplexing (OFDM) with CDMA. In this paper, based on an alternative expression for the Gaussian Q-function, the characteristic function, and a Gaussian approximation, we present a new practical technique for determining the bit error rate (BER) of multiuser MC-CDMA systems in frequency-selective Nakagami-m fading channels. The results are applicable to systems employing coherent demodulation with maximal ratio combining (MRC) or equal gain combining (EGC). The analysis assumes that different subcarriers experience independent fading channels, which are not necessarily identically distributed. The final average BER is expressed in the form of a single finite-range integral with an integrand composed of tabulated functions, which can be easily computed numerically. The accuracy of the proposed approach is demonstrated with computer simulations.
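
    The paper's result is analytical, but a brute-force cross-check is easy: simulate BPSK over independent Nakagami-m branches with MRC, using the fact that the Nakagami-m power gain is Gamma-distributed. All parameters below are invented:

        import numpy as np

        rng = np.random.default_rng(5)
        m, L_br, snr_db, n = 1.5, 2, 8.0, 200000
        gbar = 10.0 ** (snr_db / 10.0)               # mean SNR per branch

        g = rng.gamma(m, 1.0 / m, size=(n, L_br))    # Nakagami-m power gain, E[g] = 1
        h = np.sqrt(g)                               # fading amplitude
        sigma = np.sqrt(1.0 / (2.0 * gbar))          # real-noise std for this mean SNR

        s = 2.0 * rng.integers(0, 2, n) - 1.0        # BPSK symbols +/-1
        y = h * s[:, None] + sigma * rng.standard_normal((n, L_br))
        decision = np.sign(np.sum(h * y, axis=1))    # maximal ratio combining
        print("simulated BER with %d-branch MRC: %.2e" % (L_br, np.mean(decision != s)))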

  11. A comparative study of history-based versus vectorized Monte Carlo methods in the GPU/CUDA environment for a simple neutron eigenvalue problem

    International Nuclear Information System (INIS)

    Liu, T.; Du, X.; Ji, W.; Xu, G.; Brown, F.B.

    2013-01-01

    For nuclear reactor analysis such as the neutron eigenvalue calculations, the time consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA), and tested on a NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence thus enhancing the warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to the memory access latency caused by the large amount of global memory transactions. Possible solutions to improve the code efficiency are discussed. (authors)

  12. A comparative study of history-based versus vectorized Monte Carlo methods in the GPU/CUDA environment for a simple neutron eigenvalue problem

    Science.gov (United States)

    Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.

    2014-06-01

    For nuclear reactor analysis such as the neutron eigenvalue calculations, the time consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA), and tested on a NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence thus enhancing the warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to the memory access latency caused by the large amount of global memory transactions. Possible solutions to improve the code efficiency are discussed.
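
    The control-flow difference at the heart of these two records can be caricatured in plain Python with a toy absorb-or-scatter particle game; the probability and counts are illustrative, and none of the CUDA specifics are reproduced, only the history-based versus event-based loop structure.

```python
# Toy contrast between history-based and event-based MC control flow for an
# absorb-or-scatter particle game (P_ABSORB and counts are illustrative).
import numpy as np

P_ABSORB = 0.3   # probability that a collision is an absorption

def history_based(n_particles, rng):
    """Follow one complete particle history at a time (thread-divergent on GPUs)."""
    scatters = 0
    for _ in range(n_particles):
        while rng.random() >= P_ABSORB:   # scatter until the particle is absorbed
            scatters += 1
    return scatters

def event_based(n_particles, rng):
    """Advance all surviving particles by one event per pass (vector-friendly)."""
    n_alive, scatters = n_particles, 0
    while n_alive > 0:
        survived = rng.random(n_alive) >= P_ABSORB
        n_alive = int(survived.sum())     # compact the bank to the survivors
        scatters += n_alive
    return scatters

print("history-based:", history_based(10_000, np.random.default_rng(0)))
print("event-based  :", event_based(10_000, np.random.default_rng(0)))
```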

  13. Adolescent Purchasing Behavior at McDonald's and Subway.

    Science.gov (United States)

    Lesser, Lenard I; Kayekjian, Karen C; Velasquez, Paz; Tseng, Chi-Hong; Brook, Robert H; Cohen, Deborah A

    2013-10-01

    To assess whether adolescents purchasing food at a restaurant marketed as "healthy" (Subway) purchase fewer calories than at a competing chain (McDonald's). We studied 97 adolescents who purchased a meal at both restaurants on different days, using each participant as his or her own control. We compared the difference in calories purchased by adolescents at McDonald's and Subway in a diverse area of Los Angeles, CA. Adolescents purchased an average of 1,038 calories (standard error of the mean [SEM]: 41) at McDonald's and 955 calories (SEM 39) at Subway. The difference of 83 calories (95% confidence interval [CI]: -20 to 186) was not statistically significant (p = .11). At McDonald's, participants purchased significantly more calories from drinks (151 vs. 61) and significantly fewer cups of vegetables (.15 vs. .57 cups) than at Subway. Although Subway meals had more vegetables, meals from both restaurants are likely to contribute to overeating. Copyright © 2013 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  14. Comparison Of Simulation Results When Using Two Different Methods For Mold Creation In Moldflow Simulation

    Directory of Open Access Journals (Sweden)

    Kaushikbhai C. Parmar

    2017-04-01

    Full Text Available Simulation can give different results when different methods are used for the same problem. The Autodesk Moldflow Simulation software provides two different facilities for creating the mold for the simulation of the injection molding process: the mold can be created inside Moldflow, or it can be imported as a CAD file. The aim of this paper is to study the differences in simulation results, such as mold temperature, part temperature, deflection in different directions, simulation time, and coolant temperature, for these two different methods.

  15. SNPs of melanocortin 4 receptor (MC4R) associated with body weight in Beagle dogs.

    Science.gov (United States)

    Zeng, Ruixia; Zhang, Yibo; Du, Peng

    2014-01-01

    Melanocortin 4 receptor (MC4R), which is associated with inherited human obesity, is involved in food intake and body weight of mammals. To study the relationships between MC4R gene polymorphism and body weight in Beagle dogs, we detected and compared the nucleotide sequence of the whole coding region and the 3'- and 5'-flanking regions of the dog MC4R gene (1214 bp). In 120 Beagle dogs, two SNPs (A420C, C895T) were identified and their relation with body weight was analyzed with the RFLP-PCR method. The results showed that the SNP at A420C, which changes amino acid 101 of the MC4R protein from asparagine to threonine, was significantly associated with the canine body weight trait, while body weight variations were significant in female dogs for the nonsense mutation at C895T. This suggested that the two SNPs might affect the function of the MC4R gene related to body weight in Beagle dogs. Therefore, MC4R is a candidate gene for selecting dogs of different sizes, with the MC4R SNPs (A420C, C895T) being potentially valuable as genetic markers.

  16. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
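
    A minimal sketch of this kind of simulation, under assumed parameters (600 s observation period, 10 s intervals, twenty 5 s event episodes): one synthetic event stream is scored with momentary time sampling (MTS), partial-interval recording (PIR) and whole-interval recording (WIR), and each estimate is compared with the true proportion of time the behavior occurred. The well-known biases (PIR overestimates, WIR underestimates) fall out directly.

```python
# Simulate interval sampling error on a synthetic behavior stream.
import numpy as np

def simulate_once(obs_len=600.0, interval=10.0, n_events=20,
                  event_dur=5.0, dt=0.1, seed=None):
    rng = np.random.default_rng(seed)
    starts = rng.uniform(0.0, obs_len - event_dur, size=n_events)
    t = np.arange(0.0, obs_len, dt)
    busy = np.zeros_like(t, dtype=bool)      # occupancy trace of the behavior
    for s in starts:
        busy |= (t >= s) & (t < s + event_dur)
    true_prop = busy.mean()

    per = int(interval / dt)                 # fine-grid samples per interval
    blocks = busy[: (len(t) // per) * per].reshape(-1, per)
    mts = blocks[:, -1].mean()               # sampled at each interval's end
    pir = blocks.any(axis=1).mean()          # event anywhere in the interval
    wir = blocks.all(axis=1).mean()          # event throughout the interval
    return true_prop, mts, pir, wir

true_p, mts, pir, wir = simulate_once(seed=7)
print(f"true={true_p:.3f}  MTS={mts:.3f}  PIR={pir:.3f}  WIR={wir:.3f}")
```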

  17. Application of Higher Order Fission Matrix for Real Variance Estimation in McCARD Monte Carlo Eigenvalue Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)

    2015-05-15

    In a Monte Carlo (MC) eigenvalue calculation, it is well known that the apparent variance of a local tally such as pin power differs considerably from the real variance. The MC method in eigenvalue calculations uses the power iteration method, in which the fission matrix (FM) and fission source density (FSD) are used as the operator and the solution. The FM is useful for estimating variances and covariances because it can be calculated from a few cycles, even inactive ones. Recently, S. Carney has implemented higher order fission matrix (HOFM) capabilities into the MCNP6 MC code in order to extend perturbation theory to second order. In this study, the HOFM capability based on the Hotelling deflation method was implemented into McCARD and used to predict the behavior of the real and apparent SD ratio. In the simple 1D slab problems, Endo's theoretical model predicts the real-to-apparent SD ratio well. It was noted that Endo's theoretical model with the McCARD higher-mode fission source solutions from the HOFM yields a much better real-to-apparent SD ratio than with the analytic solutions. In the near future, an application to a high dominance ratio problem such as the BEAVRS benchmark will be conducted.
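
    For intuition about the fission matrix formalism used here, the sketch below runs a power iteration on a made-up 5-region fission matrix (not McCARD output): F[i, j] is the expected number of fission neutrons born in region i per fission neutron born in region j, and the dominant eigenpair gives the multiplication factor and the fission source density.

```python
# Power iteration on a toy fission matrix: dominant eigenpair = (k, FSD).
import numpy as np

# Made-up 5-region fission matrix with nearest-neighbour coupling.
F = np.array([[0.9, 0.3, 0.0, 0.0, 0.0],
              [0.3, 0.9, 0.3, 0.0, 0.0],
              [0.0, 0.3, 0.9, 0.3, 0.0],
              [0.0, 0.0, 0.3, 0.9, 0.3],
              [0.0, 0.0, 0.0, 0.3, 0.9]])

def power_iteration(F, n_iter=200):
    s = np.ones(F.shape[0]) / F.shape[0]   # initial fission source density
    k = 1.0
    for _ in range(n_iter):
        s_new = F @ s
        k = s_new.sum() / s.sum()          # eigenvalue estimate
        s = s_new / s_new.sum()            # renormalized fission source
    return k, s

k, s = power_iteration(F)
print(f"k = {k:.4f}", " FSD =", np.round(s, 3))
```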

  18. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    KAUST Repository

    Rached, Nadhir B.; Kammoun, Abla; Alouini, Mohamed-Slim; Tempone, Raul

    2015-01-01

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of the Equal Gain Combining (EGC) and the Maximum Ratio Combining (MRC) receivers. In this case, it can be seen that this problem turns out to be that of computing the Cumulative Distribution Function (CDF) for the sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive Monte Carlo (MC) simulations would require high computational complexity. In this line, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies to arbitrary fading models, whereas the second one achieves the desirable bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.

  19. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity over Generalized Fading Channels

    KAUST Repository

    Rached, Nadhir B.

    2015-11-13

    The outage capacity (OC) is among the most important performance metrics of communication systems operating over fading channels. Of interest in the present paper is the evaluation of the OC at the output of the Equal Gain Combining (EGC) and the Maximum Ratio Combining (MRC) receivers. In this case, it can be seen that this problem turns out to be that of computing the Cumulative Distribution Function (CDF) for the sum of independent random variables. Since finding a closed-form expression for the CDF of the sum distribution is out of reach for a wide class of commonly used distributions, methods based on Monte Carlo (MC) simulations take pride of place. In order to allow for the estimation of the operating range of small outage probabilities, it is of paramount importance to develop fast and efficient estimation methods, as naive Monte Carlo (MC) simulations would require high computational complexity. In this line, we propose in this work two unified, yet efficient, hazard rate twisting Importance Sampling (IS) based approaches that efficiently estimate the OC of MRC or EGC diversity techniques over generalized independent fading channels. The first estimator is shown to possess the asymptotic optimality criterion and applies to arbitrary fading models, whereas the second one achieves the desirable bounded relative error property for the majority of the well-known fading variates. Moreover, the second estimator is shown to achieve the asymptotic optimality property under the particular Log-normal environment. Some selected simulation results are finally provided in order to illustrate the substantial computational gain achieved by the proposed IS schemes over naive MC simulations.
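
    A toy stand-in for the importance sampling mechanism these two records describe: estimating a small left-tail probability of a sum of i.i.d. exponential gains with simple exponential twisting (not the papers' hazard rate twisting). The tilt is chosen so that the tilted mean of the sum sits at the outage threshold; all parameters are illustrative.

```python
# Naive MC vs. exponentially twisted IS for P(sum of L Exp(1) gains < gamma).
import numpy as np

def naive_mc(gamma, L=4, n=10**6, seed=0):
    rng = np.random.default_rng(seed)
    s = rng.exponential(1.0, size=(n, L)).sum(axis=1)
    hits = s < gamma
    return hits.mean(), hits.std(ddof=1) / np.sqrt(n)

def twisted_is(gamma, L=4, n=10**5, seed=0):
    rng = np.random.default_rng(seed)
    theta = 1.0 - L / gamma            # tilt so the tilted mean of the sum = gamma
    rate = 1.0 - theta                 # Exp(1) tilted by theta is Exp(1 - theta)
    s = rng.exponential(1.0 / rate, size=(n, L)).sum(axis=1)
    # Likelihood ratio of the nominal density w.r.t. the tilted one.
    lr = np.exp(-theta * s) / (1.0 - theta) ** L
    w = np.where(s < gamma, lr, 0.0)
    return w.mean(), w.std(ddof=1) / np.sqrt(n)

print("naive MC:", naive_mc(gamma=0.5))   # same target probability,
print("twisted :", twisted_is(gamma=0.5)) # far smaller standard error
```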

  20. Atmosphere Re-Entry Simulation Using Direct Simulation Monte Carlo (DSMC) Method

    Directory of Open Access Journals (Sweden)

    Francesco Pellicani

    2016-05-01

    Full Text Available Aerothermodynamic investigations of hypersonic re-entry vehicles provide fundamental information to other important disciplines such as materials and structures, assisting the development of efficient, low-weight thermal protection systems (TPS). In the transitional flow regime, where thermal and chemical equilibrium is almost absent, a new numerical method for such studies has been introduced: the direct simulation Monte Carlo (DSMC) numerical technique. The acceptance and applicability of the DSMC method have increased significantly in the 50 years since its invention, thanks to increases in computer speed and to parallel computing. Nevertheless, further verification and validation efforts are needed to lead to its greater acceptance. In this study, the Monte Carlo simulators OpenFOAM and SPARTA have been studied and benchmarked against numerical and theoretical data for inert and chemically reactive flows, and the same will be done against experimental data in the near future. The results show the validity of the data found with the DSMC. The best settings of the fundamental parameters used by a DSMC simulator are presented for each software package and compared with the guidelines derived from the theory behind the Monte Carlo method. In particular, the number of particles per cell was found to be the most relevant parameter for achieving valid and optimized results. It is shown that a simulation with a mean value of one particle per cell gives sufficiently good results with very low computational resources. This achievement aims to reconsider the correct investigation method in the transitional regime, where both the direct simulation Monte Carlo (DSMC) and computational fluid dynamics (CFD) can work, but with different computational effort.

  1. STUDY ON SIMULATION METHOD OF AVALANCHE : FLOW ANALYSIS OF AVALANCHE USING PARTICLE METHOD

    OpenAIRE

    塩澤, 孝哉

    2015-01-01

    In this paper, modeling for the simulation of avalanches by a particle method is discussed. There are two kinds of snow avalanches: one is the surface avalanche, which shows a smoke-like flow, and the other is the total-layer avalanche, which shows a flow like a Bingham fluid. In the simulation of the surface avalanche, a particle method incorporating a rotation resistance model is used. A particle method based on Bingham fluid is used in the simulation of the total-layer avalanche. At t...

  2. McMYB10 regulates coloration via activating McF3'H and later structural genes in ever-red leaf crabapple.

    Science.gov (United States)

    Tian, Ji; Peng, Zhen; Zhang, Jie; Song, Tingting; Wan, Huihua; Zhang, Meiling; Yao, Yuncong

    2015-09-01

    The ever-red leaf trait, which is important for breeding ornamental plants with higher anthocyanin levels, rarely appears in the Malus family, and little is known about the regulation of anthocyanin biosynthesis in red leaves. In our study, HPLC analysis showed that the anthocyanin concentration in ever-red leaves, especially cyanidin, was significantly higher than that in evergreen leaves. The transcript level of McMYB10 was significantly correlated with anthocyanin synthesis between the 'Royalty' and evergreen leaf 'Flame' cultivars during leaf development. We also found that the ever-red leaf cultivar 'Royalty' contained the known R6:McMYB10 sequence previously reported in apple fruit, whereas the evergreen leaf cultivar 'Flame' did not. This distinction in the promoter region may be the main reason for the higher expression level of McMYB10 in the red-foliage crabapple cultivar. Furthermore, McMYB10 promoted anthocyanin biosynthesis in crabapple leaves and callus at low temperatures and during long-day treatments. Both heterologous expression in tobacco (Nicotiana tabacum) and the Arabidopsis pap1 mutant, and homologous expression in crabapple and apple, suggested that McMYB10 promotes anthocyanin synthesis and enhances anthocyanin accumulation in plants. Interestingly, electrophoretic mobility shift assays, coupled with yeast one-hybrid analysis, revealed that McMYB10 positively regulates McF3'H via direct binding to AACCTAAC and TATCCAACC motifs in the promoter. In summary, our results demonstrate that McMYB10 plays an important role in ever-red leaf coloration by positively regulating McF3'H in crabapple. Our work therefore provides new perspectives for ornamental fruit tree breeding. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  3. A particle-based method for granular flow simulation

    KAUST Repository

    Chang, Yuanzhang; Bao, Kai; Zhu, Jian; Wu, Enhua

    2012-01-01

    We present a new particle-based method for granular flow simulation. In the method, a new elastic stress term, which is derived from a modified form of Hooke's law, is included in the momentum governing equation to handle the friction of granular materials. A viscosity force is also added to simulate the dynamic friction, for the purpose of smoothing the velocity field and further maintaining the simulation stability. Benefiting from the Lagrangian nature of the SPH method, large flow deformations can be handled easily and naturally. In addition, a signed distance field is also employed to enforce the solid boundary condition. The experimental results show that the proposed method is effective and efficient for handling the flow of granular materials, and different kinds of granular behaviors can be well simulated by adjusting just one parameter. © 2012 Science China Press and Springer-Verlag Berlin Heidelberg.

  4. A particle-based method for granular flow simulation

    KAUST Repository

    Chang, Yuanzhang

    2012-03-16

    We present a new particle-based method for granular flow simulation. In the method, a new elastic stress term, which is derived from a modified form of Hooke's law, is included in the momentum governing equation to handle the friction of granular materials. A viscosity force is also added to simulate the dynamic friction, for the purpose of smoothing the velocity field and further maintaining the simulation stability. Benefiting from the Lagrangian nature of the SPH method, large flow deformations can be handled easily and naturally. In addition, a signed distance field is also employed to enforce the solid boundary condition. The experimental results show that the proposed method is effective and efficient for handling the flow of granular materials, and different kinds of granular behaviors can be well simulated by adjusting just one parameter. © 2012 Science China Press and Springer-Verlag Berlin Heidelberg.

  5. Modeling of the energy savings of variable recruitment McKibben muscle bundles

    Science.gov (United States)

    Meller, Michael A.; Chipka, Jordan B.; Bryant, Matthew J.; Garcia, Ephrahim

    2015-03-01

    McKibben artificial muscles are often utilized in mobile robotic applications that require compliant and lightweight actuation capable of producing large forces. In order to increase the endurance of these mobile robotic platforms, actuation efficiency must be addressed. Since pneumatic systems are rarely more than 30% efficient due to the compressibility of the working fluid, the McKibben muscles are hydraulically powered. Additionally, these McKibben artificial muscles utilize an inelastic bladder to reduce the energy losses associated with elastic energy storage in the usual rubber tube bladders. The largest energy losses in traditional valve-controlled hydraulic systems are found in the valving implementation to match the required loads. This is performed by throttling, which results in large pressure drops over the control valves and significant fluid power being wasted as heat. This paper discusses how these throttling losses are reduced by grouping multiple artificial muscles to form a muscle bundle where, like in skeletal muscle, more elements that make up the muscle bundle are recruited to match the load. This greatly lessens the pressure drops by effectively changing the actuator area, leading to much higher efficiencies over a broader operation envelope. Simulations of several different loading scenarios are discussed that reveal the benefits of such an actuation scheme.

  6. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. Reliability concepts are rarely used explicitly in the design of structures, but the problems of structural engineering are better understood through them. Some of the main methods for the estimation of the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulations of a large number of tests. The procedure of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
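
    A minimal sketch of the simulation method described above, under assumed (illustrative) distributions for resistance and load: the probability of failure of the limit state g = R - S is estimated by Monte Carlo, and the corresponding reliability index is recovered as beta = -Phi^(-1)(pf).

```python
# MC estimate of the failure probability and reliability index for g = R - S.
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(123)
n = 10**6
R = rng.lognormal(mean=np.log(500.0), sigma=0.10, size=n)  # resistance [kN]
S = rng.normal(loc=350.0, scale=50.0, size=n)              # load effect [kN]

pf = np.mean(R - S < 0.0)                 # probability of failure
se = np.sqrt(pf * (1.0 - pf) / n)         # MC standard error of pf
beta = -NormalDist().inv_cdf(pf)          # reliability index
print(f"pf = {pf:.2e} +/- {se:.1e},  beta = {beta:.2f}")
```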

  7. Real-time simulator for designing electron dual scattering foil systems.

    Science.gov (United States)

    Carver, Robert L; Hogstrom, Kenneth R; Price, Michael J; LeBlanc, Justin D; Pitcher, Garrett M

    2014-11-08

    The purpose of this work was to develop a user-friendly, accurate, real-time computer simulator to facilitate the design of dual foil scattering systems for electron beams on radiotherapy accelerators. The simulator allows for a relatively quick, initial design that can be refined and verified with subsequent Monte Carlo (MC) calculations and measurements. The simulator also is a powerful educational tool. The simulator consists of an analytical algorithm for calculating electron fluence and X-ray dose and a graphical user interface (GUI) C++ program. The algorithm predicts electron fluence using Fermi-Eyges multiple Coulomb scattering theory with the reduced Gaussian formalism for scattering powers. The simulator also estimates central-axis and off-axis X-ray dose arising from the dual foil system. Once the geometry of the accelerator is specified, the simulator allows the user to continuously vary primary scattering foil material and thickness, secondary scattering foil material and Gaussian shape (thickness and sigma), and beam energy. The off-axis electron relative fluence or total dose profile and central-axis X-ray dose contamination are computed and displayed in real time. The simulator was validated by comparison of off-axis electron relative fluence and X-ray percent dose profiles with those calculated using EGSnrc MC. Over the energy range 7-20 MeV, using present foils on an Elekta radiotherapy accelerator, the simulator was able to reproduce MC profiles to within 2% out to 20 cm from the central axis. The central-axis X-ray percent dose predictions matched measured data to within 0.5%. The calculation time was approximately 100 ms using a single Intel 2.93 GHz processor, which allows for real-time variation of foil geometrical parameters using slider bars. This work demonstrates how the user-friendly GUI and real-time nature of the simulator make it an effective educational tool for gaining a better understanding of the effects that various system

  8. SU-E-J-82: Intra-Fraction Proton Beam-Range Verification with PET Imaging: Feasibility Studies with Monte Carlo Simulations and Statistical Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lou, K [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Rice University, Houston, TX (United States); Mirkovic, D; Sun, X; Zhu, X; Poenisch, F; Grosshans, D; Shao, Y [U.T M.D. Anderson Cancer Center, Houston, TX (United States); Clark, J [Rice University, Houston, TX (United States)

    2014-06-01

    Purpose: To study the feasibility of intra-fraction proton beam-range verification with PET imaging. Methods: Two homogeneous cylindrical PMMA phantoms (290 mm axial length; 38 mm and 200 mm diameter, respectively) were studied using PET imaging: a small phantom using a mouse-sized PET scanner (61 mm diameter field of view (FOV)) and a larger phantom using a human brain-sized PET scanner (300 mm FOV). Monte Carlo (MC) simulations (MCNPX and GATE) were used to simulate 179.2 MeV proton pencil beams irradiating the two phantoms and being imaged by the two PET systems. A total of 50 simulations were conducted to generate 50 positron activity distributions and correspondingly 50 measured activity-ranges. The accuracy and precision of these activity-ranges were calculated under different conditions (including count statistics and other factors, such as crystal cross-section). Separate from the MC simulations, an activity distribution measured from a simulated PET image was modeled as a noiseless positron activity distribution corrupted by Poisson counting noise. The results from these two approaches were compared to assess the impact of count statistics on the accuracy and precision of activity-range calculations. Results: MC simulations show that the accuracy and precision of an activity-range are dominated by the number (N) of coincidence events in the reconstructed image. They improve with the number of events, with the uncertainty decreasing in proportion to 1/sqrt(N), which can be understood from the statistical modeling. MC simulations also indicate that the coincidence events acquired within the first 60 seconds with 10^9 protons (small phantom) and 10^10 protons (large phantom) are sufficient to achieve both sub-millimeter accuracy and precision. Conclusion: Under the current MC simulation conditions, the initial study indicates that the accuracy and precision of beam-range verification are dominated by count statistics, and intra-fraction PET image-based beam-range verification is
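
    The 1/sqrt(N) behaviour reported above can be reproduced with a toy 1-D model: a noiseless activity profile with a distal falloff is corrupted by Poisson noise at several count levels, and the spread of a 50%-of-plateau range estimate is measured. The profile shape, bin size and range definition are all assumptions made for illustration.

```python
# Poisson-corrupted 1-D activity profile: range precision vs. total counts.
import numpy as np

z = np.linspace(0.0, 200.0, 401)                      # depth [mm], 0.5 mm bins
profile = 1.0 / (1.0 + np.exp((z - 120.0) / 2.0))     # plateau, falloff at 120 mm

def activity_range(counts):
    sm = np.convolve(counts, np.ones(10) / 10.0, mode="same")  # light smoothing
    half = 0.5 * sm[20:60].mean()                     # 50% of the entrance plateau
    below = np.flatnonzero(sm[60:] < half)            # first crossing below 50%
    return z[60 + below[0]] if below.size else z[-1]

rng = np.random.default_rng(5)
for n_events in (1e3, 1e4, 1e5, 1e6):
    lam = profile / profile.sum() * n_events          # expected counts per bin
    ranges = [activity_range(rng.poisson(lam)) for _ in range(200)]
    print(f"N={n_events:>9.0f}  range std = {np.std(ranges):6.3f} mm")
```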

  9. The SO-20.3 MC and A modernization plan final report

    International Nuclear Information System (INIS)

    Longmire, V.L.; Ensslin, Norbert; Files, C.; Joseph, Joshua A. Jr.; Rudy, C.R.; Smith, M.K.; Russo, P.A.; Stevens, R.S.; Strittmatter, R.B.; Wilkey, D.D.; Pickett, C.; Brosey, W.; Swanson, J.

    2004-01-01

    Materials Control and Accountability (MC and A) provides assurance to the nation that nuclear materials are controlled in accordance with their strategic and economic importance and that the misuse, theft or diversion of these materials will be detected. MC and A plays an important part in security at nuclear facilities, especially in addressing threats such as theft of materials, environmental contamination, and nuclear safety incidents associated with nuclear materials. For this reason, it is important that MC and A take advantage of new technologies and methods in order to provide information on a site's nuclear materials in the most timely and useful manner possible. Within the U.S. Department of Energy (DOE), the Office of Security Policy, Policy Integration and Technical Support Program (SO-20.3) is responsible for the development of safeguards and security technology that enables DOE and NNSA facilities to safeguard their Special Nuclear Materials. The SO-20.3 Program tasked safeguards personnel at Los Alamos National Laboratory to lead a project with representatives from the Y-12 Plant, Lawrence Livermore National Laboratory, and the Savannah River Site to prepare an MC and A Modernization Plan that provides recommendations for developing new technologies and methodologies for MC and A in both new and existing DOE facilities. The team was tasked with taking into account new concerns about the protection of nuclear material following the attacks of September 11, 2001. Opportunities for applying new MC and A approaches and technologies that provide increased freedom of operation and increased security, with a potential for cost savings, in existing and new DOE facilities are discussed in this report.

  10. Association between MC4R rs17782313 Polymorphism and Overeating Behaviours

    Science.gov (United States)

    Yilmaz, Zeynep; Davis, Caroline; Loxton, Natalie J.; Kaplan, Allan S.; Levitan, Robert D.; Carter, Jacqueline C.; Kennedy, James L.

    2014-01-01

    Background/Objectives Melanocortins play a crucial role in appetite and weight regulation. Although the melanocortin 4 receptor (MC4R) gene has been repeatedly linked to obesity and antipsychotic-induced weight gain, the mechanism behind how it leads to this effect is still undetermined. The goal of this study was to conduct an in-depth and sophisticated analysis of MC4R polymorphisms, body mass index (BMI), eating behaviour, and depressed mood. Subjects/Methods We genotyped 328 individuals of European ancestry on the following MC4R markers based on the relevant literature on obesity and antipsychotic-induced weight gain: rs571312, rs17782313, rs489693, rs11872992, and rs8087522. Height and weight were measured, and information on depressed mood and overeating behaviours was obtained during the in-person assessment. Results BMI was associated with the rs17782313 C allele; however, this finding did not survive correction for multiple testing (p=0.018). Although rs17782313 was significantly associated with depressed mood and overeating behaviours, tests of indirect effects indicated that emotional eating and food cravings, rather than depressed mood, uniquely accounted for the effect of this marker on BMI (n=152). Conclusions To our knowledge, this is the first study to investigate the link between MC4R rs17782313, mood and overeating behaviour, as well as to demonstrate possible mechanisms behind MC4R's influence on body weight. If replicated in a larger sample, these results may have important clinical implications, including the potential use of MC4R agonists in the treatment of obesity and disordered eating. PMID:24827639

  11. The McClean Lake uranium project

    International Nuclear Information System (INIS)

    Blaise, J.R.

    2001-01-01

    The McClean Lake Uranium Project, located in the northern part of Saskatchewan, consists of five uranium deposits (Jeb, Sue A, Sue B, Sue C and McClean) scattered over three different locations on the mineral lease. On 16 March 1995, COGEMA Resources Inc and its partners, Denison Mines Ltd and OURD (Canada) Co Ltd, made the formal decision to develop the McClean Lake Project. Construction of the mine and mill started during summer 1995 and should be finished by mid 1997. Mining of the first deposit, Jeb, started in 1996, and ore is currently being mined. Yellowcake production is scheduled to start this fall. (author)

  12. An NPT Monte Carlo Molecular Simulation-Based Approach to Investigate Solid-Vapor Equilibrium: Application to Elemental Sulfur-H2S System

    KAUST Repository

    Kadoura, Ahmad Salim

    2013-06-01

    In this work, a method to estimate solid elemental sulfur solubility in pure gases and gas mixtures using Monte Carlo (MC) molecular simulation is proposed. This method is based on the Isobaric-Isothermal (NPT) ensemble and the Widom insertion technique for the gas phase, and a continuum model for the solid phase. The method avoids the difficulty of having to deal with the high rejection rates that are usually encountered when simulating with the Gibbs ensemble. The application of this method is tested on a system made of pure hydrogen sulfide gas (H2S) and solid elemental sulfur. However, this technique may be used for other solid-vapor systems provided the fugacity of the solid phase is known (e.g., through experimental work). Given the solid fugacity at the desired pressure and temperature, the mole fraction of the solid dissolved in gas that would be in chemical equilibrium with the solid phase can be obtained. In other words, a set of MC molecular simulation experiments is conducted on a single box, given the pressure and temperature, for different mole fractions of the solute. The fugacity of the gas mixture is determined using the Widom insertion method and is compared with that predetermined for the solid phase until one finds the mole fraction which achieves the required fugacity. In this work, several MC examples have been conducted and compared with experimental data. The Lennard-Jones parameters of the sulfur molecule model (ɛ, σ) have been optimized to achieve a better match with the experimental work.
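
    A minimal sketch of the Widom insertion step named above, assuming stored, pre-equilibrated configurations and Lennard-Jones interactions with illustrative parameters: the excess chemical potential is estimated as mu_ex = -kT * ln <exp(-dU/kT)>, where dU is the energy of inserting a ghost particle. The dummy configurations in the usage example are random and serve only to make the snippet runnable; they are not equilibrated NPT states.

```python
# Widom test-particle estimate of the excess chemical potential.
import numpy as np

def lj_energy(r2, eps=1.0, sig=1.0):
    s6 = (sig * sig / r2) ** 3
    return 4.0 * eps * (s6 * s6 - s6)

def widom_mu_ex(configs, box, kT=1.0, n_insert=500, seed=0):
    rng = np.random.default_rng(seed)
    boltz = []
    for pos in configs:                       # pos: (N, 3) particle coordinates
        for _ in range(n_insert):
            ghost = rng.uniform(0.0, box, size=3)
            d = pos - ghost
            d -= box * np.round(d / box)      # minimum-image convention
            r2 = (d * d).sum(axis=1)
            boltz.append(np.exp(-lj_energy(r2).sum() / kT))
    return -kT * np.log(np.mean(boltz))

# Usage with dummy (unequilibrated!) configurations, for illustration only:
rng = np.random.default_rng(1)
configs = [rng.uniform(0.0, 8.0, size=(100, 3)) for _ in range(10)]
print("mu_ex ~", widom_mu_ex(configs, box=8.0))
```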

  13. Implantation of MC2 computer code

    International Nuclear Information System (INIS)

    Seehusen, J.; Nair, R.P.K.; Becceneri, J.C.

    1981-01-01

    The implantation of the MC2 computer code on the CDC system is presented. The MC2 computer code calculates multigroup cross sections for typical compositions of fast reactors. The multigroup constants are calculated using solutions of the P1 or B1 approximations for a given buckling value as the weighting function. (M.C.K.) [pt]

  14. Efficient Monte Carlo Simulations of Gas Molecules Inside Porous Materials.

    Science.gov (United States)

    Kim, Jihan; Smit, Berend

    2012-07-10

    Monte Carlo (MC) simulations are commonly used to obtain adsorption properties of gas molecules inside porous materials. In this work, we discuss various optimization strategies that lead to faster MC simulations with CO2 gas molecules inside host zeolite structures used as a test system. The reciprocal space contribution of the gas-gas Ewald summation and both the direct and the reciprocal gas-host potential energy interactions are stored inside energy grids to reduce the wall time in the MC simulations. Additional speedup can be obtained by selectively calling the routine that computes the gas-gas Ewald summation, which does not impact the accuracy of the zeolite's adsorption characteristics. We utilize two-level density-biased sampling technique in the grand canonical Monte Carlo (GCMC) algorithm to restrict CO2 insertion moves into low-energy regions within the zeolite materials to accelerate convergence. Finally, we make use of the graphics processing units (GPUs) hardware to conduct multiple MC simulations in parallel via judiciously mapping the GPU threads to available workload. As a result, we can obtain a CO2 adsorption isotherm curve with 14 pressure values (up to 10 atm) for a zeolite structure within a minute of total compute wall time.
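
    One of the optimizations described above, the precomputed energy grid, can be sketched as follows: the rigid-host potential is tabulated once on a 3-D grid and looked up by trilinear interpolation during MC moves instead of summing over framework atoms. The "framework" here is a random set of points, not a real zeolite, and the soft-cored potential is an illustrative placeholder.

```python
# Precomputed host-energy grid with trilinear-interpolation lookup.
import numpy as np

box, n = 10.0, 32                                   # cubic cell, grid points/axis
host = np.random.default_rng(2).uniform(0, box, size=(30, 3))  # fake framework

def direct_energy(p):
    d = host - p
    d -= box * np.round(d / box)                    # periodic minimum image
    r2 = np.maximum((d * d).sum(axis=1), 0.5)       # soft core avoids blow-ups
    s6 = 1.0 / r2 ** 3
    return float(np.sum(4.0 * (s6 * s6 - s6)))

axis = np.linspace(0.0, box, n, endpoint=False)
grid = np.array([[[direct_energy(np.array([x, y, z]))
                   for z in axis] for y in axis] for x in axis])

def grid_energy(p):
    """Trilinear interpolation of the precomputed host grid at point p."""
    f = (np.asarray(p) / box * n) % n
    i0 = f.astype(int); t = f - i0; i1 = (i0 + 1) % n
    e = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((t[0] if dx else 1 - t[0]) * (t[1] if dy else 1 - t[1])
                     * (t[2] if dz else 1 - t[2]))
                e += w * grid[(i0[0], i1[0])[dx], (i0[1], i1[1])[dy],
                              (i0[2], i1[2])[dz]]
    return e

p = np.array([3.3, 7.1, 2.2])
print(direct_energy(p), grid_energy(p))   # direct sum vs. cheap grid lookup
```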

  15. A systematic framework for Monte Carlo simulation of remote sensing errors map in carbon assessments

    Science.gov (United States)

    S. Healey; P. Patterson; S. Urbanski

    2014-01-01

    Remotely sensed observations can provide a unique perspective on how management and natural disturbance affect carbon stocks in forests. However, integration of these observations into formal decision support will rely upon improved uncertainty accounting. Monte Carlo (MC) simulations offer a practical, empirical method of accounting for potential remote sensing errors...

  16. Characteristic Experimentations of Degrader and Scatterer at MC-50 Cyclotron

    CERN Document Server

    Lee Seok Ki; Kim, Kye-Ryung; Lee, Hwa-Ryun; Park, Bum-Sik

    2005-01-01

    Building proton beam user facilities, especially deciding the beam energy level, depends on the attached proton accelerator and the users' needs. Two methods are generally used to adjust the beam energy level: one is to adjust the beam directly in the accelerator; the other is to adjust the beam energy after extraction from the accelerator. A degrader/scatterer system has been installed in the MC-50 Cyclotron to adjust the energy level of the beam used for various application fields. Its degraders and scatterers are made of Al foils and Au foils, respectively. The Al thicknesses are 2, 1, 0.5, 0.3, 0.2, 0.1, 0.05, 0.03, 0.02 and 0.01 mm, and the Au thicknesses are 0.2, 0.1, 0.05, 0.03, 0.02 and 0.01 mm. In this study, a suitable beam condition was established by overlapping Al/Au foils of various thicknesses, guided by simulation results. The LET (linear energy transfer) value was then acquired indirectly by measuring the Bragg peak of the external beam through a PMMA plastic phantom, and the profile was measured by film dosimetry.

  17. Age McCanni büroo = Offices of Age McCann

    Index Scriptorium Estoniae

    2010-01-01

    On the interior design of the Age McCann office at Rotermanni 8 in Tallinn. The interior design is by interior architect Kerli Valk (Kukuhaus OÜ) and architect Tomomi Hayashi (HG Arhitektuur OÜ); a list of their most important works is included.

  18. Sulfur-induced offsets in MC-ICP-MS silicon-isotope measurements

    NARCIS (Netherlands)

    van den Boorn, S.; Vroon, P.Z.; van Bergen, M.J.

    2009-01-01

    Sample preparation methods for MC-ICP-MS silicon-isotope measurements often involve a cation-exchange purification step. A previous study has argued that this would suffice for geological materials, as the occasional enrichment of anionic species would not compromise silicon-isotope analysis. Here

  19. Sulphur-induced offsets in MC-ICP-MS silicon-isotope measurements

    NARCIS (Netherlands)

    van den Boorn, S.; Vroon, P.Z.; van Bergen, M.J.

    2010-01-01

    Sample preparation methods for MC-ICP-MS silicon-isotope measurements often involve a cation-exchange purification step. A previous study has argued that this would suffice for geological materials, as the occasional enrichment of anionic species would not compromise silicon-isotope analysis. Here

  20. A simple MC-based algorithm for evaluating reliability of stochastic-flow network with unreliable nodes

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2004-01-01

    An MP/minimal cutset (MC) is a path/cut set such that if any edge is removed from this path/cut set, then the remaining set is no longer a path/cut set. An intuitive method is proposed to evaluate the reliability in terms of MCs in a stochastic-flow network subject to both edge and node failures, under the condition that all of the MCs are given in advance. This is an extension of the best known algorithms for solving the d-MC (a special MC formatted as a system-state vector, where d is the lower bound point of the system capacity level) problem from stochastic-flow networks without unreliable nodes to those with unreliable nodes, achieved by introducing some simple concepts. These concepts, first developed in the literature, are used to implement the proposed algorithm and to reduce the number of d-MC candidates. This method is more efficient than the best known existing algorithms regardless of whether the network has unreliable nodes. Two examples illustrate how the reliability is determined using the proposed algorithm in networks with and without unreliable nodes. The computational complexity of the proposed algorithm is analyzed and compared with the existing methods.
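
    For intuition, here is a simplified binary-state analogue of the evaluation described above: given minimal path sets whose components include both edges and nodes (so node failures are treated uniformly), system reliability is estimated by sampling component states. The bridge network and per-component reliabilities are hypothetical, and the multi-state d-MC machinery of the paper is not reproduced.

```python
# MC reliability estimate for a binary-state network with unreliable nodes,
# given its minimal path sets in advance (hypothetical bridge network).
import numpy as np

# s -e1- n1 -e2- t, s -e4- n2 -e5- t, bridge e3 between n1 and n2.
min_paths = [{"e1", "e2", "n1"}, {"e4", "e5", "n2"},
             {"e1", "e3", "e5", "n1", "n2"}, {"e4", "e3", "e2", "n1", "n2"}]
p_up = {c: 0.9 for path in min_paths for c in path}   # per-component reliability

def mc_reliability(n_trials=200_000, seed=0):
    rng = np.random.default_rng(seed)
    comps = sorted(p_up)
    probs = np.array([p_up[c] for c in comps])
    states = rng.random((n_trials, len(comps))) < probs   # True = component up
    up = {c: states[:, i] for i, c in enumerate(comps)}
    # The system works if every component of at least one minimal path is up.
    works = np.zeros(n_trials, dtype=bool)
    for path in min_paths:
        path_ok = np.ones(n_trials, dtype=bool)
        for c in path:
            path_ok &= up[c]
        works |= path_ok
    return works.mean()

print("estimated reliability:", mc_reliability())
```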

  1. Uncertainty propagation analysis for Yonggwang nuclear unit 4 by McCARD/MASTER core analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ho Jin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Dong Hyuk; Shim, Hyung Jin; Kim, Chang Hyo [Seoul National University, Seoul (Korea, Republic of)

    2014-06-15

    This paper concerns estimating uncertainties of the core neutronics design parameters of power reactors by direct sampling method (DSM) calculations based on the two-step McCARD/MASTER design system, in which McCARD is used to generate the fuel assembly (FA) homogenized few group constants (FGCs) while MASTER is used to conduct the core neutronics design computation. It presents an extended application of the uncertainty propagation analysis method, originally designed for uncertainty quantification of the FA FGCs, as a way to produce the covariances between the FGCs of any pair of FAs comprising the core, i.e., the covariance matrix of the FA FGCs required for random sampling of the FA FGC input sets in direct sampling core calculations by MASTER. For illustrative purposes, the uncertainties of core design parameters such as the effective multiplication factor (k_eff), normalized FA power densities, power peaking factors, etc. for the beginning of life (BOL) core of Yonggwang nuclear unit 4 (YGN4) at hot zero power and all rods out are estimated by the McCARD/MASTER-based DSM computations. The results are compared with those from the uncertainty propagation analysis method based on the McCARD-predicted sensitivity coefficients of nuclear design parameters and the cross section covariance data.
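
    A minimal sketch of the direct sampling idea, with placeholder numbers: correlated few-group constants are drawn from a multivariate normal built from an assumed covariance matrix, pushed through a toy two-group k-infinity formula standing in for the MASTER core calculation, and the spread of the outputs gives the propagated uncertainty. The means, uncertainties and correlation below are illustrative, not McCARD output.

```python
# Direct sampling of correlated few-group constants through a toy k-inf model.
import numpy as np

# Order: nu*Sigma_f1, nu*Sigma_f2, Sigma_a1, Sigma_a2, Sigma_s(1->2)  [1/cm]
mean = np.array([0.008, 0.140, 0.010, 0.100, 0.016])
rel_sd = np.array([0.010, 0.010, 0.008, 0.008, 0.015])   # relative std devs
corr = np.eye(5); corr[0, 1] = corr[1, 0] = 0.5          # assumed correlation
cov = corr * np.outer(mean * rel_sd, mean * rel_sd)

rng = np.random.default_rng(11)
samples = rng.multivariate_normal(mean, cov, size=50_000)
nusf1, nusf2, sa1, sa2, ss12 = samples.T

# Two-group infinite-medium multiplication factor.
k = (nusf1 + nusf2 * ss12 / sa2) / (sa1 + ss12)
print(f"k_inf = {k.mean():.5f} +/- {k.std(ddof=1):.5f}")
```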

  2. Monte Carlo dose calculation improvements for low energy electron beams using eMC

    International Nuclear Information System (INIS)

    Fix, Michael K; Frei, Daniel; Volken, Werner; Born, Ernst J; Manser, Peter; Neuenschwander, Hans

    2010-01-01

    The electron Monte Carlo (eMC) dose calculation algorithm in Eclipse (Varian Medical Systems) is based on the macro MC method and is able to predict dose distributions for high energy electron beams with high accuracy. However, there are limitations for low energy electron beams. This work aims to improve the accuracy of the dose calculation using eMC for 4 and 6 MeV electron beams of Varian linear accelerators. Improvements implemented into the eMC include (1) improved determination of the initial electron energy spectrum by increased resolution of mono-energetic depth dose curves used during beam configuration; (2) inclusion of all the scrapers of the applicator in the beam model; (3) reduction of the maximum size of the sphere to be selected within the macro MC transport when the energy of the incident electron is below certain thresholds. The impact of these changes in eMC is investigated by comparing calculated dose distributions for 4 and 6 MeV electron beams at source to surface distance (SSD) of 100 and 110 cm with applicators ranging from 6 x 6 to 25 x 25 cm^2 of a Varian Clinac 2300C/D with the corresponding measurements. Dose differences between calculated and measured absolute depth dose curves are reduced from 6% to less than 1.5% for both energies and all applicators considered at SSD of 100 cm. Using the original eMC implementation, absolute dose profiles at depths of 1 cm, d_max and R50 in water lead to dose differences of up to 8% for applicators larger than 15 x 15 cm^2 at SSD 100 cm. Those differences are now reduced to less than 2% for all dose profiles investigated when the improved version of eMC is used. At SSD of 110 cm the dose difference for the original eMC version is even more pronounced and can be larger than 10%. Those differences are reduced to within 2% or 2 mm with the improved version of eMC. In this work several enhancements were made in the eMC algorithm leading to significant improvements in the accuracy of the dose calculation

  3. Monte Carlo dose calculation improvements for low energy electron beams using eMC.

    Science.gov (United States)

    Fix, Michael K; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter

    2010-08-21

    The electron Monte Carlo (eMC) dose calculation algorithm in Eclipse (Varian Medical Systems) is based on the macro MC method and is able to predict dose distributions for high energy electron beams with high accuracy. However, there are limitations for low energy electron beams. This work aims to improve the accuracy of the dose calculation using eMC for 4 and 6 MeV electron beams of Varian linear accelerators. Improvements implemented into the eMC include (1) improved determination of the initial electron energy spectrum by increased resolution of mono-energetic depth dose curves used during beam configuration; (2) inclusion of all the scrapers of the applicator in the beam model; (3) reduction of the maximum size of the sphere to be selected within the macro MC transport when the energy of the incident electron is below certain thresholds. The impact of these changes in eMC is investigated by comparing calculated dose distributions for 4 and 6 MeV electron beams at source to surface distance (SSD) of 100 and 110 cm with applicators ranging from 6 x 6 to 25 x 25 cm(2) of a Varian Clinac 2300C/D with the corresponding measurements. Dose differences between calculated and measured absolute depth dose curves are reduced from 6% to less than 1.5% for both energies and all applicators considered at SSD of 100 cm. Using the original eMC implementation, absolute dose profiles at depths of 1 cm, d(max) and R50 in water lead to dose differences of up to 8% for applicators larger than 15 x 15 cm(2) at SSD 100 cm. Those differences are now reduced to less than 2% for all dose profiles investigated when the improved version of eMC is used. At SSD of 110 cm the dose difference for the original eMC version is even more pronounced and can be larger than 10%. Those differences are reduced to within 2% or 2 mm with the improved version of eMC. In this work several enhancements were made in the eMC algorithm leading to significant improvements in the accuracy of the dose

  4. Development on hybrid evaluated nuclear data library HENDL1.0/MG/MC

    International Nuclear Information System (INIS)

    Xu Dezheng; Gao Chunjing; Zheng Shanliang; Liu Haibo; Zhu Xiaoxiang; Li Jingjing; Wu Yican

    2004-01-01

    A Hybrid Evaluated Nuclear Data Library (HENDL), named HENDL1.0, has been developed by the Fusion Design Study (FDS) team of the Institute of Plasma Physics, Academia Sinica (ASIPP) to take into account the requirements of design and research relevant to fusion, fission and fusion-fission sub-critical hybrid reactors. HENDL1.0 contains one basic evaluated sub-library, named HENDL1.0/E, and two processed working sub-libraries, named HENDL1.0/MG and HENDL1.0/MC, respectively. Through careful comparison, screening and selection, HENDL1.0/E integrated basic evaluated neutron data files of 213 nuclides from the main libraries of evaluated neutron reaction cross sections, including ENDF/B-VI (USA), JEF-2.2 (OECD/NEA, Europe), JENDL-3.2 (Japan), CENDL-2 (China), BROND-2 (Russia) and FENDL-2 (IAEA/NDS, ITER program). Based on this, a 175-neutron-group and 42-photon-group coupled multigroup working library, HENDL1.0/MG, for discrete-ordinates (Sn) transport calculations (e.g., with the ANISN code), and a continuous-energy (pointwise) neutron cross section library in the compact ENDF (ACE) format, HENDL1.0/MC, for Monte Carlo transport simulation (e.g., with the MCMP code), were produced with the group constants processing system NJOY and the transport cross section preparation code TRANSX, referring to the Vitamin-J energy group structure. In addition, two special databases, i.e., the transmutation (burnup) library BURNUP.DAT and the response function library RESPONSE.DAT, have also been prepared for fuel cycle calculations and reactivity analyses of nuclear reactors. The relevant sample testing, benchmark checking and primary confirmation have also been carried out to assess the validity of the multi-purpose data library HENDL1.0. (authors)

  5. MC and A system design workshop

    International Nuclear Information System (INIS)

    Schneider, R.A.; Harms, N.L.

    1984-01-01

    The workshop had as its goal the development of a Material Control and Accounting (MC and A) system for a low enriched uranium fuel fabrication plant. The factors to be considered for each of the ten key elements of the safeguards (MC and A) system are presented in the text for the session.

  6. Multiple time-scale methods in particle simulations of plasmas

    International Nuclear Information System (INIS)

    Cohen, B.I.

    1985-01-01

    This paper surveys recent advances in the application of multiple time-scale methods to particle simulation of collective phenomena in plasmas. These methods dramatically improve the efficiency of simulating low-frequency kinetic behavior by allowing the use of a large timestep, while retaining accuracy. The numerical schemes surveyed provide selective damping of unwanted high-frequency waves and preserve numerical stability in a variety of physics models: electrostatic, magneto-inductive, Darwin and fully electromagnetic. The paper reviews hybrid simulation models, the implicit moment-equation method, the direct implicit method, orbit averaging, and subcycling
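
    The subcycling idea surveyed above can be illustrated with a toy slow/fast system: the fast oscillator is advanced with many small substeps inside each large step taken for the slow variable. The equations and constants are illustrative, not a plasma model; the point is that omega*dt exceeds the explicit stability limit while omega*(dt/n_sub) does not.

```python
# Subcycling: many small fast-physics steps inside each large slow-physics step.
def advance(t_end=2.0, dt=0.05, n_sub=25, omega=60.0):
    slow, x, v = 1.0, 1.0, 0.0
    hist, t = [], 0.0
    while t < t_end:
        # Subcycle the fast oscillator with dt/n_sub (stable: omega*dt_sub << 1,
        # even though omega*dt = 3 would blow up an unsubcycled explicit step).
        dts = dt / n_sub
        x_avg = 0.0
        for _ in range(n_sub):
            a = -omega**2 * x + slow        # fast dynamics, forced by slow var
            v += a * dts                    # semi-implicit (symplectic) Euler
            x += v * dts
            x_avg += x / n_sub
        # One large step for the slow variable, driven by the averaged fast one.
        slow += dt * (-0.5 * slow + 0.1 * x_avg)
        t += dt
        hist.append((t, slow, x))
    return hist

for t, s, x in advance()[::8]:
    print(f"t={t:4.2f}  slow={s:7.4f}  fast={x:7.4f}")
```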

  7. Application of the direct simulation Monte Carlo method to nanoscale heat transfer between a soot particle and the surrounding gas

    International Nuclear Information System (INIS)

    Yang, M.; Liu, F.; Smallwood, G.J.

    2004-01-01

    The laser-induced incandescence (LII) technique has been widely used to measure soot volume fraction and primary particle size in flames and engine exhaust. Currently there is a lack of quantitative understanding of the shielding effect of aggregated soot particles on the conduction heat loss rate to the surrounding gas. The conventional approach for this problem would be the application of the Monte Carlo (MC) method, which is based on simulating the trajectories of individual molecules and calculating the heat transfer at each of the molecule/molecule collisions and the molecule/particle collisions. As the first step toward calculating the heat transfer between a soot aggregate and the surrounding gas, the direct simulation Monte Carlo (DSMC) method was used in this study to calculate the heat transfer rate between a single spherical aerosol particle and its cooler surrounding gas under different conditions of temperature, pressure, and accommodation coefficient. A well-defined and simple hard-sphere model was adopted to describe molecule/molecule elastic collisions. A combination of the specular reflection and completely diffuse reflection models was used to treat molecule/particle collisions. The results obtained by DSMC are in good agreement with the known analytical solution for the heat transfer rate of an isolated, motionless sphere in the free-molecular regime. Further, the DSMC method was applied to calculate the heat transfer in the transition regime. Our present DSMC results agree very well with published DSMC data. (author)

  8. A track length estimator method for dose calculations in low-energy X-ray irradiations. Implementation, properties and performance

    Energy Technology Data Exchange (ETDEWEB)

    Baldacci, F.; Delaire, F.; Letang, J.M.; Sarrut, D.; Smekens, F.; Freud, N. [Lyon-1 Univ. - CREATIS, CNRS UMR5220, Inserm U1044, INSA-Lyon, Centre Leon Berard (France); Mittone, A.; Coan, P. [LMU Munich (Germany). Dept. of Physics; LMU Munich (Germany). Faculty of Medicine; Bravin, A.; Ferrero, C. [European Synchrotron Radiation Facility, Grenoble (France); Gasilov, S. [LMU Munich (Germany). Dept. of Physics

    2015-05-01

    The track length estimator (TLE) method, an 'on-the-fly' fluence tally in Monte Carlo (MC) simulations, recently implemented in GATE 6.2, is known as a powerful tool to accelerate dose calculations in the domain of low-energy X-ray irradiations using the kerma approximation. Overall efficiency gains of the TLE with respect to analogous MC were reported in the literature for regions of interest in various applications (photon beam radiation therapy, X-ray imaging). The behaviour of the TLE method in terms of statistical properties, dose deposition patterns, and computational efficiency compared to analogous MC simulations was investigated. The statistical properties of the dose deposition were first assessed. Derivations of the variance reduction factor of TLE versus analogous MC were carried out, starting from the expression of the dose estimate variance in the TLE and analogous MC schemes. Two test cases were chosen to benchmark the TLE performance in comparison with analogous MC: (i) a small animal irradiation under stereotactic synchrotron radiation therapy conditions and (ii) the irradiation of a human pelvis during a cone beam computed tomography acquisition. Dose distribution patterns and efficiency gain maps were analysed. The efficiency gain exhibits strong variations within a given irradiation case, depending on the geometrical (voxel size, ballistics) and physical (material and beam properties) parameters on the voxel scale. Typical values lie between 10 and 10^3, with lower levels in dense regions (bone) outside the irradiated channels (scattered dose only), and higher levels in soft tissues directly exposed to the beams.
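
    A 1-D toy version of the estimator contrast discussed above, under the kerma approximation with a purely absorbing homogeneous slab and illustrative coefficients: the analog tally scores energy only at the absorption point, while the track-length estimator scores every voxel a track crosses, so both converge to the same dose profile but the TLE does so with far less variance per history.

```python
# Analog absorption tally vs. track-length estimator (TLE) in a 1-D slab.
import numpy as np

mu = 0.2        # total attenuation [1/cm]; pure absorption in this toy model
mu_en = 0.2     # energy-absorption coefficient [1/cm] (equal to mu here)
L, nvox = 10.0, 20
edges = np.linspace(0.0, L, nvox + 1)

def run(n_photons=20_000, seed=0):
    rng = np.random.default_rng(seed)
    analog, tle = np.zeros(nvox), np.zeros(nvox)
    for _ in range(n_photons):
        s = rng.exponential(1.0 / mu)          # free path from the slab face
        # Analog: unit energy deposited at the absorption point (if inside).
        if s < L:
            analog[np.searchsorted(edges, s) - 1] += 1.0
        # TLE: every voxel scores (track length inside voxel) * mu_en.
        seg = np.clip(s, edges[:-1], edges[1:]) - edges[:-1]
        tle += seg * mu_en
    return analog / n_photons, tle / n_photons

a, t = run()
print("analog:", np.round(a[:5], 4))
print("TLE   :", np.round(t[:5], 4))   # same mean profile, smoother estimate
```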

  9. Stochastic Rotation Dynamics simulations of wetting multi-phase flows

    Science.gov (United States)

    Hiller, Thomas; Sanchez de La Lama, Marta; Brinkmann, Martin

    2016-06-01

    Multi-color Stochastic Rotation Dynamics (SRDmc) has been introduced by Inoue et al. [1,2] as a particle based simulation method to study the flow of emulsion droplets in non-wetting microchannels. In this work, we extend the multi-color method to also account for different wetting conditions. This is achieved by assigning the color information not only to fluid particles but also to virtual wall particles that are required to enforce proper no-slip boundary conditions. To extend the scope of the original SRDmc algorithm to e.g. immiscible two-phase flow with viscosity contrast we implement an angular momentum conserving scheme (SRD+mc). We perform extensive benchmark simulations to show that a mono-phase SRDmc fluid exhibits bulk properties identical to a standard SRD fluid and that SRDmc fluids are applicable to a wide range of immiscible two-phase flows. To quantify the adhesion of a SRD+mc fluid in contact with the walls we measure the apparent contact angle from sessile droplets in mechanical equilibrium. For a further verification of our wettability implementation we compare the dewetting of a liquid film from a wetting stripe to experimental and numerical studies of interfacial morphologies on chemically structured surfaces.

  10. Simulated Vegetation Response to Climate Change in California: The Importance of Seasonal Production Patterns

    Science.gov (United States)

    Kim, J. B.; Pitts, B.

    2013-12-01

    The MC1 dynamic global vegetation model simulates vegetation response to climate change by simulating vegetation production, soil biogeochemistry, plant biogeography and fire. It has been applied at a wide range of spatial scales, yet the spatio-temporal patterns of simulated vegetation production, which drive the model's response to climate change, have not been examined in detail. We ran MC1 for California at a relatively fine scale, 30 arc-seconds, for the historical period (1895-2006) and for the future (2007-2100), using downscaled data from four CMIP3-based climate projections: A2 and B1 GHG emissions scenarios simulated by the PCM and GFDL GCMs. The use of these four climate projections aligns our work with a body of climate change research work commissioned by the California Public Interest Energy Research (PIER) Program. The four climate projections vary not only in the changes of their annual means, but in the seasonality of projected climate change. We calibrated MC1 using MODIS NPP data for 2000-2011 as a guide, adapting a published technique for adjusting simulated vegetation production by increasing the simulated plant rooting depths. We evaluated the simulation results by comparing the model output for the historical period with several benchmark datasets, summarizing by EPA Level 3 Ecoregions. Multi-year summary statistics of model predictions compare moderately well with Kuchler's potential natural vegetation map, the National Biomass and Carbon Dataset, Leenhouts' compilation of fire return intervals, and, of course, the MODIS NPP data for 2000-2011. When we compared MC1's monthly NPP values with MODIS monthly GPP data (2000-2011), however, the seasonal patterns compared very poorly, with the NPP/GPP ratio for spring (Mar-Apr-May) often exceeding 1 and the NPP/GPP ratio for summer (Jun-Jul-Aug) often flattening to zero. This suggests MC1's vegetation production algorithms are overly biased toward spring production at the cost of summer production. We

  11. Sampling Enrichment toward Target Structures Using Hybrid Molecular Dynamics-Monte Carlo Simulations.

    Directory of Open Access Journals (Sweden)

    Kecheng Yang

    Full Text Available Sampling enrichment toward a target state, an analogue of the improvement of sampling efficiency (SE), is critical in both the refinement of protein structures and the generation of near-native structure ensembles for the exploration of structure-function relationships. We developed a hybrid molecular dynamics (MD)-Monte Carlo (MC) approach to enrich the sampling toward the target structures. In this approach, the higher SE is achieved by perturbing the conventional MD simulations with a MC structure-acceptance judgment, which is based on the coincidence degree of small angle X-ray scattering (SAXS) intensity profiles between the simulation structures and the target structure. We found that the hybrid simulations could significantly improve SE by making the top-ranked models much closer to the target structures in both the secondary and tertiary structures. Specifically, for the 20 mono-residue peptides, when the initial structures had a root-mean-squared deviation (RMSD) from the target structure smaller than 7 Å, the hybrid MD-MC simulations afforded, on average, 0.83 Å and 1.73 Å in RMSD closer to the target than the parallel MD simulations at 310K and 370K, respectively. Meanwhile, the average SE values are also increased by 13.2% and 15.7%. The enrichment of sampling becomes more significant when the target states are gradually detectable in the MD-MC simulations in comparison with the parallel MD simulations, providing >200% improvement in SE. We also performed a test of the hybrid MD-MC approach on real protein systems; the results showed that the SE for 3 out of 5 real proteins is improved. Overall, this work presents an efficient way of utilizing solution SAXS to improve protein structure prediction and refinement, as well as the generation of near native structures for function annotation.

  12. Particle-transport simulation with the Monte Carlo method

    International Nuclear Information System (INIS)

    Carter, L.L.; Cashwell, E.D.

    1975-01-01

    Attention is focused on the application of the Monte Carlo method to particle transport problems, with emphasis on neutron and photon transport. Topics covered include sampling methods, mathematical prescriptions for simulating particle transport, mechanics of simulating particle transport, neutron transport, and photon transport. A literature survey of 204 references is included. (GMT)
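
    The sampling prescription at the heart of such transport simulations is the free-flight distance: in a homogeneous medium the path length to the next interaction is exponentially distributed and is sampled by inverting its CDF. A minimal sketch (the cross-section value is arbitrary):

```python
import math
import random

def sample_free_path(sigma_t):
    """Distance to the next collision in a homogeneous medium with total
    macroscopic cross section sigma_t (cm^-1): s = -ln(xi)/sigma_t,
    with xi uniform on (0, 1]."""
    return -math.log(1.0 - random.random()) / sigma_t

sigma_t = 0.5  # illustrative value, cm^-1
samples = [sample_free_path(sigma_t) for _ in range(100_000)]
print(sum(samples) / len(samples))  # approaches the mean free path 1/sigma_t = 2.0 cm
```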

  13. Genetics Home Reference: McLeod neuroacanthocytosis syndrome

    Science.gov (United States)

    ... Castiglioni C, Oechsner M, Goebel HH, Heppner FL, Jung HH. McLeod myopathy revisited: more neurogenic and less ... 130(Pt 12):3285-96. Citation on PubMed Jung HH, Danek A, Frey BM. McLeod syndrome: a ...

  14. Natural tracer test simulation by stochastic particle tracking method

    International Nuclear Information System (INIS)

    Ackerer, P.; Mose, R.; Semra, K.

    1990-01-01

    Stochastic particle tracking methods are well adapted to 3D transport simulations, where the discretization requirements of other methods usually cannot be satisfied. They do, however, need a very accurate approximation of the velocity field. The described code is based on the mixed hybrid finite element method (MHFEM) to calculate the piezometric head and velocity fields. The random-walk method is used to simulate mass transport. The main advantages of the MHFEM over finite differences (FD) or finite elements (FE) are the simultaneous calculation of pressure and velocity, which are both treated as unknowns; the possibility of interpolating velocities everywhere; and the continuity of the normal component of the velocity vector from one element to another. For these reasons, the MHFEM is well adapted to particle tracking methods. After a general description of the numerical methods, the model is used to simulate the observations made during the Twin Lake tracer test in 1983. A good match is found between observed and simulated heads and concentrations. (Author) (12 refs., 4 figs.)
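
    The random-walk transport step referred to above advects each particle with the locally interpolated velocity and adds a Gaussian displacement representing dispersion. A minimal 1D sketch (velocity, dispersion coefficient and time step are illustrative values, not the Twin Lake parameters):

```python
import numpy as np

rng = np.random.default_rng(42)

v, D, dt = 1.0, 0.1, 0.01          # velocity (m/d), dispersion (m^2/d), step (d)
n_steps, n_particles = 1000, 10_000

x = np.zeros(n_particles)          # all particles start at the injection point
for _ in range(n_steps):
    # advection + random-walk dispersion: dx = v*dt + sqrt(2*D*dt)*xi
    x += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_particles)

# After t = 10 d: mean ~ v*t = 10 m, spread ~ sqrt(2*D*t) ~ 1.4 m
print(x.mean(), x.std())
```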

  15. Simulation of the space debris environment in LEO using a simplified approach

    Science.gov (United States)

    Kebschull, Christopher; Scheidemann, Philipp; Hesselbach, Sebastian; Radtke, Jonas; Braun, Vitali; Krag, H.; Stoll, Enrico

    2017-01-01

    Several numerical approaches exist to simulate the evolution of the space debris environment. These simulations usually rely on the propagation of a large population of objects in order to determine the collision probability for each object. Explosion and collision events are triggered randomly using a Monte-Carlo (MC) approach, so in different scenarios different objects are fragmented, each contributing to a different version of the space debris environment. The results of the single MC runs therefore represent the whole spectrum of possible evolutions of the space debris environment. For the comparison of different scenarios, in general the average of all MC runs together with its standard deviation is used. This method is computationally very expensive due to the propagation of thousands of objects over long timeframes and the application of the MC method. At the Institute of Space Systems (IRAS) a model capable of describing the evolution of the space debris environment has been developed and implemented. The model is based on source and sink mechanisms, where yearly launches as well as collisions and explosions are considered as sources. The natural decay and post-mission disposal measures are the only sink mechanisms. This approach reduces the computational costs tremendously. To achieve this benefit, a few simplifications have been applied. The model partitions the Low Earth Orbit (LEO) region into altitude shells. Only two kinds of objects are considered, intact bodies and fragments, which are also divided into diameter bins. As an extension to a previously presented model, the eccentricity has additionally been taken into account with 67 eccentricity bins. While a set of differential equations has been implemented in a generic manner, the Euler method was chosen to integrate the equations for a given time span. For this paper, parameters have been derived so that the model is able to reflect the results of the numerical MC simulations.
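
    A single-shell toy version of such a source/sink model makes the structure clear; the rate constants below are invented for illustration and are not taken from the paper:

```python
# dN/dt = launches - decay_rate * N   (one altitude shell, intact bodies only)
launches = 80.0      # objects added per year (assumed)
decay_rate = 0.02    # fraction of the population decaying per year (assumed)

n, dt, years = 2000.0, 0.1, 100      # initial population, Euler step (yr), horizon
for _ in range(int(years / dt)):
    n += dt * (launches - decay_rate * n)   # explicit Euler update

print(f"population after {years} yr: {n:.0f}")  # tends toward launches/decay_rate = 4000
```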

  16. Radiation in Particle Simulations

    International Nuclear Information System (INIS)

    More, R.; Graziani, F.; Glosli, J.; Surh, M.

    2010-01-01

    Hot dense radiative (HDR) plasmas common to Inertial Confinement Fusion (ICF) and stellar interiors have high temperature (a few hundred eV to tens of keV), high density (tens to hundreds of g/cc) and high pressure (hundreds of megabars to thousands of gigabars). Typically, such plasmas undergo collisional, radiative, atomic and possibly thermonuclear processes. In order to describe HDR plasmas, computational physicists in ICF and astrophysics use atomic-scale microphysical models implemented in various simulation codes. Experimental validation of the models used to describe HDR plasmas is difficult to perform. Direct Numerical Simulation (DNS) of the many-body interactions of plasmas is a promising approach to model validation, but previous work either relies on the collisionless approximation or ignores radiation. We present four methods that attempt a new numerical simulation technique to address a currently unsolved problem: the extension of molecular dynamics to collisional plasmas including emission and absorption of radiation. The first method applies the Liénard-Wiechert solution of Maxwell's equations for a classical particle whose motion is assumed to be known. The second method expands the electromagnetic field in normal modes (plane waves in a box with periodic boundary conditions) and solves the equation for wave amplitudes coupled to the particle motion. The third method is a hybrid molecular dynamics/Monte Carlo (MD/MC) method which calculates radiation emitted or absorbed by electron-ion pairs during close collisions. The fourth method is a generalization of the third method to include small clusters of particles emitting radiation during close encounters: one electron simultaneously hitting two ions, two electrons simultaneously hitting one ion, etc. This approach is inspired by the virial expansion method of equilibrium statistical mechanics. Using a combination of these methods we believe it is possible to do atomic-scale particle simulations of HDR plasmas.

  17. Simulation teaching method in Engineering Optics

    Science.gov (United States)

    Lu, Qieni; Wang, Yi; Li, Hongbin

    2017-08-01

    We here introduce a pedagogical method of theoretical simulation as one major means of the teaching process of "Engineering Optics" in the course quality improvement action plan (Qc) at our school. Students, in groups of three to five, complete simulations of interference, diffraction, electromagnetism and polarization of light; each student is evaluated and scored in light of his performance in interviews between the teacher and the student, and each student can opt to be interviewed many times until he is satisfied with his score and learning. After three years of Qc practice, a remarkable teaching and learning effect has been obtained. Such theoretical simulation experiments are a very valuable teaching method for physical optics, which is highly theoretical and abstruse. This teaching methodology works well in training students in how to ask questions and how to solve problems, and it can also stimulate their interest in research learning and their initiative, developing their self-confidence and sense of innovation.

  18. SU-G-BRC-10: Feasibility of a Web-Based Monte Carlo Simulation Tool for Dynamic Electron Arc Radiotherapy (DEAR)

    International Nuclear Information System (INIS)

    Rodrigues, A; Wu, Q; Sawkey, D

    2016-01-01

    Purpose: DEAR is a radiation therapy technique utilizing synchronized motion of gantry and couch during delivery to optimize dose distribution homogeneity and penumbra for treatment of superficial disease. Dose calculation for DEAR is not yet supported by commercial TPSs. The purpose of this study is to demonstrate the feasibility of using a web-based Monte Carlo (MC) simulation tool (VirtuaLinac) to calculate dose distributions for a DEAR delivery. Methods: MC simulations were run through VirtuaLinac, which is based on the GEANT4 platform. VirtuaLinac utilizes detailed linac head geometry and material models, validated phase space files, and a voxelized phantom. The input was expanded to include an XML file for simulation of varying mechanical axes as a function of MU. A DEAR XML plan was generated, used in the MC simulation, and delivered on a TrueBeam in Developer Mode. Radiographic film wrapped around a cylindrical phantom (12.5 cm radius) measured the dose at a depth of 1.5 cm, which was compared to the simulation results. Results: A DEAR plan was simulated using an energy of 6 MeV and a 3×10 cm² cut-out in a 15×15 cm² applicator for delivery of a 90° arc. The resulting data provide qualitative and quantitative evidence that the simulation platform could be used as the basis for DEAR dose calculations. The resulting unwrapped 2D dose distributions agreed well in the cross-plane direction along the arc, with field sizes of 18.4 and 18.2 cm and penumbrae of 1.9 and 2.0 cm for measurements and simulations, respectively. Conclusion: Preliminary feasibility of a DEAR delivery using a web-based MC simulation platform has been demonstrated. This tool will benefit treatment planning for DEAR as a benchmark for developing other model-based algorithms, allowing efficient optimization of trajectories and quality assurance of plans without the need for extensive measurements.

  20. McMAC: Towards a MAC Protocol with Multi-Constrained QoS Provisioning for Diverse Traffic in Wireless Body Area Networks

    Directory of Open Access Journals (Sweden)

    Muhammad Mostafa Monowar

    2012-11-01

    The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the "transmit-whenever-appropriate" principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency.

  1. McMAC: towards a MAC protocol with multi-constrained QoS provisioning for diverse traffic in Wireless Body Area Networks.

    Science.gov (United States)

    Monowar, Muhammad Mostafa; Hassan, Mohammad Mehedi; Bajaber, Fuad; Al-Hussein, Musaed; Alamri, Atif

    2012-11-12

    The emergence of heterogeneous applications with diverse requirements for resource-constrained Wireless Body Area Networks (WBANs) poses significant challenges for provisioning Quality of Service (QoS) with multi-constraints (delay and reliability) while preserving energy efficiency. To address such challenges, this paper proposes McMAC, a MAC protocol with multi-constrained QoS provisioning for diverse traffic classes in WBANs. McMAC classifies traffic based on their multi-constrained QoS demands and introduces a novel superframe structure based on the "transmit-whenever-appropriate" principle, which allows diverse periods for diverse traffic classes according to their respective QoS requirements. Furthermore, a novel emergency packet handling mechanism is proposed to ensure packet delivery with the least possible delay and the highest reliability. McMAC is also modeled analytically, and extensive simulations were performed to evaluate its performance. The results reveal that McMAC achieves the desired delay and reliability guarantee according to the requirements of a particular traffic class while achieving energy efficiency.

  2. A reverse Monte Carlo method for deriving optical constants of solids from reflection electron energy-loss spectroscopy spectra

    International Nuclear Information System (INIS)

    Da, B.; Sun, Y.; Ding, Z. J.; Mao, S. F.; Zhang, Z. M.; Jin, H.; Yoshikawa, H.; Tanuma, S.

    2013-01-01

    A reverse Monte Carlo (RMC) method is developed to obtain the energy loss function (ELF) and optical constants from a measured reflection electron energy-loss spectroscopy (REELS) spectrum by an iterative Monte Carlo (MC) simulation procedure. The method combines the simulated annealing method, i.e., a Markov chain Monte Carlo (MCMC) sampling of oscillator parameters, surface and bulk excitation weighting factors, and band gap energy, with a conventional MC simulation of electron interaction with solids, which acts as a single step of MCMC sampling in this RMC method. To examine the reliability of this method, we have verified that the output data of the dielectric function are essentially independent of the initial values of the trial parameters, which is a basic property of a MCMC method. The optical constants derived for SiO2 in the energy loss range of 8-90 eV are in good agreement with other available data, and the relevant bulk ELFs are checked by the oscillator strength-sum and perfect-screening-sum rules. Our results show that the dielectric function can be obtained by the RMC method even with a wide range of initial trial parameters. The RMC method is thus a general and effective method for determining the optical properties of solids from REELS measurements.
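
    The RMC iteration itself has a simple shape: perturb the oscillator parameters, re-run the forward simulation, and accept or reject against the measured spectrum under a slowly cooled temperature. A schematic sketch (the forward model is a stand-in Lorentzian, not a REELS simulation):

```python
import numpy as np

rng = np.random.default_rng(1)

def forward_model(params, x):
    # Stand-in for the MC REELS simulation: one Lorentzian "oscillator".
    a, w, g = params
    return a * g**2 / ((x - w) ** 2 + g**2)

x = np.linspace(0.0, 20.0, 200)
measured = forward_model((1.0, 8.0, 2.0), x) + rng.normal(0.0, 0.01, x.size)

params = np.array([0.5, 5.0, 1.0])            # initial trial parameters
cost = np.sum((forward_model(params, x) - measured) ** 2)
T = 1.0                                       # annealing temperature
for _ in range(5000):
    trial = params + rng.normal(0.0, 0.05, 3) # Markov-chain move in parameter space
    c = np.sum((forward_model(trial, x) - measured) ** 2)
    if c < cost or rng.random() < np.exp(-(c - cost) / T):
        params, cost = trial, c               # accept the move
    T *= 0.999                                # cooling schedule

print(params)  # should approach the "true" values (1.0, 8.0, 2.0)
```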

  3. Improving the Cost Efficiency and Readiness of MC-130 Aircrew Training: A Case Study

    Science.gov (United States)

    2015-01-01

    AFSOC/A3T, "MC-130 Aircrew Training," Air Force... aircrew members have access to a co-located flight simulator, the proportion of training that is accomplished at a temporary duty... coordinate to have access to the aerial refueling track, which is basically the airspace used to conduct aerial refueling. Crewmembers must also be

  4. McDonaldization and Job Insecurity

    Directory of Open Access Journals (Sweden)

    Emeka W. Dumbili

    2013-06-01

    The article examines how and why the McDonaldization of the banking system in Nigeria engenders job insecurity. This is imperative because it provides an explicit revelation of the root causes of job insecurity in the sector that other scholars have omitted entirely. No Nigerian scholar has applied the thesis in relation to job insecurity, which is the major problem in Nigeria's banking industry. The article, based on the analysis of secondary data and observations, therefore draws on the McDonaldization thesis to examine the upsurge of rationalization in the sector since the consolidation exercise began in 2005. The article argues that the sector's rising rationalization and the ensuing efficiency, calculability, predictability, and control are necessary. However, these have inevitably engendered job insecurity and its adverse consequences. Based on critical analyses of the available evidence, the article concludes that the best option is to begin resisting the McDonaldization processes, especially those that replace humans with nonhuman technology or make customers unpaid workers.

  5. Tracheal intubation with a flexible fibreoptic scope or the McGrath videolaryngoscope in simulated difficult airway scenarios

    DEFF Research Database (Denmark)

    Jepsen, Cecilie H; Gätke, Mona R; Thøgersen, Bente

    2014-01-01

    McGrath videolaryngoscope and FFE. The participants then performed tracheal intubation on a SimMan manikin once with the McGrath videolaryngoscope and once with the FFE in three difficult airway scenarios: (1) pharyngeal obstruction; (2) pharyngeal obstruction and cervical rigidity; (3) tongue oedema. MAIN OUTCOME MEASURES...

  6. High viscosity fluid simulation using particle-based method

    KAUST Repository

    Chang, Yuanzhang

    2011-03-01

    We present a new particle-based method for high-viscosity fluid simulation. In the method, a new elastic stress term, derived from a modified form of Hooke's law, is included in the traditional Navier-Stokes equation to simulate the movements of highly viscous fluids. Benefiting from the Lagrangian nature of the Smoothed Particle Hydrodynamics (SPH) method, large flow deformations can be handled easily and naturally. In addition, in order to eliminate the particle deficiency problem near the boundary, ghost particles are employed to enforce the solid boundary condition. Compared with finite element methods, with their complicated and time-consuming remeshing operations, our method is much more straightforward to implement. Moreover, our method doesn't need to store, or compare against, an initial rest state. The experimental results show that the proposed method is effective and efficient at handling the movements of highly viscous flows, and a large variety of fluid behaviors can be simulated by adjusting just one parameter. © 2011 IEEE.

  7. Simulation of tunneling construction methods of the Cisumdawu toll road

    Science.gov (United States)

    Abduh, Muhamad; Sukardi, Sapto Nugroho; Ola, Muhammad Rusdian La; Ariesty, Anita; Wirahadikusumah, Reini D.

    2017-11-01

    Simulation can be used as a tool for the planning and analysis of a construction method. Using simulation techniques, a contractor can optimally design the resources associated with a construction method and compare it to other methods based on several criteria, such as productivity, waste, and cost. This paper discusses the use of simulation of the Norwegian Method of Tunneling (NMT) for a 472-meter tunneling work in the Cisumdawu Toll Road project. Primary and secondary data were collected to provide useful information for the simulation, as well as the problems that may be faced by the contractor. The method was modelled using CYCLONE and then simulated using WebCYCLONE. The simulation shows the duration of the project from the duration model of each work task, which was based on a literature review, machine productivity, and several assumptions. The results of the simulation also show the total cost of the project, which was modeled based on journal construction & building unit costs and the online websites of local and international suppliers. An analysis of the advantages and disadvantages of the method was conducted based on its productivity, waste, and cost. The simulation concluded that the total cost of this operation is about Rp. 900,437,004,599 and that the total duration of the tunneling operation is 653 days. The results of the simulation will be used as a recommendation to the contractor before the implementation of the already-selected tunneling operation.

  8. Henry P. McKean Jr. selecta

    CERN Document Server

    Moerbeke, Pierre; Moll, Victor

    2015-01-01

    This volume presents a selection of papers by Henry P. McKean, which illustrate the various areas in mathematics in which he has made seminal contributions. Topics covered include probability theory, integrable systems, geometry and financial mathematics. Each paper represents a contribution by Prof. McKean, either alone or together with other researchers, that has had a profound influence in the respective area.

  9. Monte Carlo simulation of a statistical mechanical model of multiple protein sequence alignment.

    Science.gov (United States)

    Kinjo, Akira R

    2017-01-01

    A grand canonical Monte Carlo (MC) algorithm is presented for studying the lattice gas model (LGM) of multiple protein sequence alignment, which coherently combines long-range interactions and variable-length insertions. MC simulations are used both for parameter optimization of the model and for production runs exploring the sequence subspace around a given protein family. In this Note, I describe the details of the MC algorithm as well as some preliminary results of MC simulations at various temperatures and chemical potentials, and compare them with the mean-field approximation. The existence of a two-state transition in the sequence space is suggested for the SH3 domain family, and the inappropriateness of the mean-field approximation for the LGM is demonstrated.
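
    The grand canonical ingredient is that the particle number fluctuates, with the chemical potential entering the acceptance ratio. A generic insertion/deletion sketch for a 1D lattice gas (the nearest-neighbor energy and all parameters are invented and bear no relation to the paper's alignment Hamiltonian):

```python
import math
import random

random.seed(0)

L, beta, mu = 50, 1.0, -0.5   # lattice size, inverse temperature, chemical potential
eps = -1.0                    # nearest-neighbor attraction (assumed)
occ = [0] * L                 # lattice-gas occupation numbers

def local_energy(s):
    # interaction of site s with its two neighbors (periodic boundary)
    return eps * occ[s] * (occ[s - 1] + occ[(s + 1) % L])

for _ in range(20_000):
    s = random.randrange(L)
    old_e = local_energy(s)
    occ[s] ^= 1               # propose insertion (0->1) or deletion (1->0)
    d_e = local_energy(s) - old_e
    d_n = 1 if occ[s] else -1 # change in particle number
    # grand canonical Metropolis acceptance: min(1, exp(-beta*(dE - mu*dN)))
    if random.random() >= math.exp(min(0.0, -beta * (d_e - mu * d_n))):
        occ[s] ^= 1           # reject: restore the previous occupancy

print(sum(occ) / L)           # average density at this (beta, mu)
```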

  10. Use of the McQuarrie equation for the computation of shear viscosity via equilibrium molecular dynamics

    International Nuclear Information System (INIS)

    Chialvo, A.A.; Debenedetti, P.G.

    1991-01-01

    To date, the calculation of shear viscosity for soft-core fluids via equilibrium molecular dynamics has been done almost exclusively using the Green-Kubo formalism. The alternative mean-squared displacement approach has not been used, except for hard-sphere fluids, in which case the expression proposed by Helfand [Phys. Rev. 119, 1 (1960)] has invariably been selected. When written in the form given by McQuarrie [Statistical Mechanics (Harper & Row, New York, 1976), Chap. 21], however, the mean-squared displacement approach offers significant computational advantages over both its Green-Kubo and Helfand counterparts. In order to achieve comparable statistical significance, the number of experiments needed when using the Green-Kubo or Helfand formalisms is more than an order of magnitude higher than for the McQuarrie expression. For pairwise-additive systems with zero linear momentum, the McQuarrie method yields frame-independent shear viscosities. The hitherto unexplored McQuarrie implementation of the mean-squared displacement approach to shear-viscosity calculation thus appears superior to the alternative methods currently in use.
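
    In this mean-squared displacement picture the viscosity follows from the long-time growth of the Helfand moment A(t) = Σᵢ mᵢ v_{x,i} yᵢ, roughly η ≈ ⟨[A(t) − A(0)]²⟩ / (2 t V k_B T) for one Cartesian pair. A schematic estimator over stored trajectory frames (synthetic arrays stand in for real MD output, and the prefactor is written for a single pair only):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic trajectory: n_frames snapshots of x-velocities and y-positions
# for n particles of mass m (stand-ins for real MD output).
n_frames, n, m = 2000, 100, 1.0
vx = rng.standard_normal((n_frames, n))
y = np.cumsum(0.01 * rng.standard_normal((n_frames, n)), axis=0)

V, kB_T, dt = 1000.0, 1.0, 0.005     # volume, thermal energy, frame spacing (assumed units)

A = (m * vx * y).sum(axis=1)         # Helfand moment A(t) = sum_i m*v_x,i*y_i

lag = n_frames // 2                  # one long-time lag, averaged over time origins
msd_A = np.mean((A[lag:] - A[:-lag]) ** 2)
eta = msd_A / (2.0 * lag * dt * V * kB_T)
print(eta)
```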

  11. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    International Nuclear Information System (INIS)

    Liu, Yubin; Yuan, Zhen; Jiang, Huabei

    2016-01-01

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors' two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, and ex vivo and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed from the property profiles, where the authors found that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the errors ranged from 0 for relatively small targets to 26% for relatively large targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for the different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular, their

  12. Factorization method for simulating QCD at finite density

    International Nuclear Information System (INIS)

    Nishimura, Jun

    2003-01-01

    We propose a new method for simulating QCD at finite density. The method is based on a general factorization property of distribution functions of observables, and it is therefore applicable to any system with a complex action. The so-called overlap problem is completely eliminated by the use of constrained simulations. We test this method in a Random Matrix Theory for finite density QCD, where we are able to reproduce the exact results for the quark number density. (author)

  13. Monte Carlo simulations to replace film dosimetry in IMRT verification

    International Nuclear Information System (INIS)

    Goetzfried, Thomas; Trautwein, Marius; Koelbi, Oliver; Bogner, Ludwig; Rickhey, Mark

    2011-01-01

    Patient-specific verification of intensity-modulated radiation therapy (IMRT) plans can be done by dosimetric measurements or by independent dose or monitor unit calculations. The aim of this study was the clinical evaluation of IMRT verification based on a fast Monte Carlo (MC) program, with regard to possible benefits compared to commonly used film dosimetry. 25 head-and-neck IMRT plans were recalculated by a pencil-beam based treatment planning system (TPS) using an appropriate quality assurance (QA) phantom. All plans were verified both by film and diode dosimetry and compared to MC simulations. The irradiated films, the results of the diode measurements and the computed dose distributions were evaluated, and the data were compared on the basis of gamma maps and dose-difference histograms. Average deviations in the high-dose region between diode measurements and point dose calculations performed with the TPS and the MC program were 0.7 ± 2.7% and 1.2 ± 3.1%, respectively. For film measurements, the mean gamma values with 3% dose difference and 3 mm distance-to-agreement were 0.74 ± 0.28 (TPS as reference), with dose deviations up to 10%. The corresponding values were significantly reduced to 0.34 ± 0.09 for the MC dose calculation. The total time needed for both verification procedures is comparable; however, the MC-based procedure is by far less labor intensive. The presented study showed that independent dose calculation verification of IMRT plans with a fast MC program has the potential to eclipse film dosimetry more and more in the near future. Thus, the linac-specific QA part will necessarily become more important. In combination with MC simulations, and due to the simple set-up, point-dose measurements for dosimetric plausibility checks are recommended at least in the IMRT introduction phase. (orig.)
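
    The gamma evaluation mentioned above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D implementation (3%/3 mm, global normalization, brute-force search; the Gaussian test profiles are fabricated) for illustration:

```python
import numpy as np

def gamma_1d(ref, ref_x, ev, ev_x, dd=0.03, dta=3.0):
    """Global 1D gamma index: for each reference point, minimize the combined
    dose-difference / distance-to-agreement metric over all evaluated points."""
    d_max = ref.max()
    out = np.empty_like(ref)
    for i, (x0, d0) in enumerate(zip(ref_x, ref)):
        dose_term = ((ev - d0) / (dd * d_max)) ** 2
        dist_term = ((ev_x - x0) / dta) ** 2
        out[i] = np.sqrt(np.min(dose_term + dist_term))
    return out

x = np.linspace(-20.0, 20.0, 401)              # position (mm)
ref = 100.0 * np.exp(-x**2 / 200.0)            # reference profile (e.g. TPS)
ev = 101.0 * np.exp(-(x - 0.5) ** 2 / 200.0)   # slightly shifted/scaled test profile
g = gamma_1d(ref, x, ev, x)
print(f"mean gamma = {g.mean():.2f}, pass rate (gamma<=1) = {(g <= 1).mean():.1%}")
```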

  14. A new method of Debye-Scherrer pattern integration on two-dimensional detectors, demonstrated for the new structure powder diffractometer (SPODI) at the FRM-II in Garching

    CERN Document Server

    Elf, F; Artus, G R J; Roth, S

    2002-01-01

    The expected diffraction patterns of the new powder diffractometer SPODI, currently under construction at the FRM-II in Garching, will be smeared Debye-Scherrer rings, as depicted by Monte Carlo (MC) simulations. To overcome this disadvantage, a concept based on the combination of MC simulations and empirical approximation methods is developed to reverse the smearing by deconvolution and then sum up along the rings, including corrections for the different arc lengths, resulting in conventional one-dimensional diffraction patterns suitable for Rietveld-refinement programs without further processing. (orig.)

  15. Spectral Methods in Numerical Plasma Simulation

    DEFF Research Database (Denmark)

    Coutsias, E.A.; Hansen, F.R.; Huld, T.

    1989-01-01

    An introduction is given to the use of spectral methods in numerical plasma simulation. As examples of the use of spectral methods, solutions to the two-dimensional Euler equations both in a simple, doubly periodic region and on an annulus will be shown. In the first case, the solution is expanded...

  16. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive, numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction, the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvement in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
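
    The simplest two-model instance of this idea uses the low-fidelity model as a control variate for the high-fidelity mean: a few paired evaluations estimate the correlation, and many cheap evaluations pin down the low-fidelity mean. A sketch with fabricated stand-in models:

```python
import numpy as np

rng = np.random.default_rng(7)

def high_fidelity(x):   # expensive model (stand-in)
    return np.sin(x) + 0.1 * x**2

def low_fidelity(x):    # cheap, correlated surrogate (stand-in)
    return np.sin(x)

n_hf, n_lf = 100, 100_000                 # few HF samples, many LF samples
x_hf = rng.normal(size=n_hf)
x_lf = rng.normal(size=n_lf)

qh, ql = high_fidelity(x_hf), low_fidelity(x_hf)   # paired evaluations
alpha = np.cov(qh, ql)[0, 1] / ql.var(ddof=1)      # control-variate weight

mu_l = low_fidelity(x_lf).mean()                   # accurate LF mean from cheap samples
estimate = qh.mean() + alpha * (mu_l - ql.mean())
print(f"multi-fidelity: {estimate:.4f}   plain MC on HF samples: {qh.mean():.4f}")
```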

  17. Mean transverse momenta correlations in hadron-hadron collisions in MC toy model with repulsing strings

    International Nuclear Information System (INIS)

    Altsybeev, Igor

    2016-01-01

    In the present work, a Monte-Carlo toy model with repulsing quark-gluon strings in hadron-hadron collisions is described. String repulsion creates transverse boosts of the string decay products, giving modifications of the observables. As an example, long-range correlations between the mean transverse momenta of particles in two observation windows are studied in MC toy simulations of heavy-ion collisions.

  18. NTS MC and A History

    International Nuclear Information System (INIS)

    Mary Alice Price; Kim Young

    2008-01-01

    Within the past three and a half years, the Nevada Test Site (NTS) has progressed from a Category IV to a Category I nuclear material facility. In accordance with direction from the U.S. Department of Energy (DOE) Secretary and the National Nuclear Security Administration (NNSA) Administrator, the NTS received shipments of large quantities of special nuclear material from Los Alamos National Laboratory (LANL) and other sites in the DOE complex. December 2004 marked the first presence of Category I material at the NTS since 1992, with the exception of two weeks of sub-critical underground testing in 2001. The Material Control and Accountability (MC and A) program was originally a joint-lab effort by LANL, Lawrence Livermore National Laboratory, and Bechtel Nevada, but in March 2006 the NNSA Nevada Site Office gave the NTS Management and Operations contractor sole responsibility. This paper will discuss the process and steps taken to transition the NTS MC and A program from multiple organizations to a single entity and from a Category IV to a Category I program. This transition flourished as MC and A progressed from the 2004 Office of Assessment (OA) rating of 'Significant Weakness' to the 2007 OA assessment rating of 'Effective Performance'. The paper will provide timelines, funding and staffing issues, OA assessment findings and corrective actions, and future expectations. The process has been challenging, but MC and A's innovative responses to the challenges have been very successful.

  19. Method of simulating dose reduction for digital radiographic systems

    International Nuclear Information System (INIS)

    Baath, M.; Haakansson, M.; Tingberg, A.; Maansson, L. G.

    2005-01-01

    The optimisation of image quality versus radiation dose is an important task in medical imaging. To obtain maximum validity, the optimisation must be based on clinical images. Images at different dose levels can then be obtained either by collecting patient images at the different dose levels under investigation - requiring additional exposures and permission from an ethics committee - or by manipulating images to simulate different dose levels. The aim of the present work was to develop a method of simulating dose reduction for digital radiographic systems. The method uses information about the detective quantum efficiency and noise power spectrum at the original and simulated dose levels to create an image containing filtered noise. When added to the original image, this results in an image whose noise, in terms of frequency content, agrees with the noise present in an image collected at the simulated dose level. To increase validity, the method takes local dose variations in the original image into account. The method was tested on a computed radiography system and was shown to produce images with noise behaviour similar to that of images actually collected at the simulated dose levels. The method can, therefore, be used to modify an image collected at one dose level so that it simulates an image of the same object collected at any lower dose level. (authors)
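
    The core operation, adding frequency-shaped noise so that the sum mimics the noise expected at the lower dose, can be sketched as follows. The flat spectra and the per-pixel NPS normalization are simplifying assumptions; a real implementation would use measured, dose- and position-dependent NPS and DQE data:

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_dose_reduction(image, nps_orig, nps_target):
    """Add filtered noise so the result mimics an image at a lower dose.
    nps_orig/nps_target: 2D noise power spectra, same shape as image,
    normalized so that the mean of the NPS equals the pixel variance."""
    nps_add = np.maximum(nps_target - nps_orig, 0.0)  # power still missing
    white = rng.standard_normal(image.shape)
    shaped = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(nps_add)).real
    return image + shaped

img = rng.poisson(1000.0, (256, 256)).astype(float)   # stand-in radiograph
nps_full = np.full(img.shape, 1000.0)                  # white NPS at full dose
nps_half = np.full(img.shape, 2000.0)                  # assumed NPS at half dose
low = simulate_dose_reduction(img, nps_full, nps_half)
print(img.var(), low.var())                            # variance roughly doubles
```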

  20. Kinetic Monte Carlo simulation of the efficiency roll-off, emission color, and degradation of organic light-emitting diodes

    NARCIS (Netherlands)

    Coehoorn, R.; van Eersel, H.; Bobbert, P.A.; Janssen, R.A.J.

    2015-01-01

    The performance of Organic Light Emitting Diodes (OLEDs) is determined by a complex interplay of the charge transport and excitonic processes in the active layer stack. We have developed a three-dimensional kinetic Monte Carlo (kMC) OLED simulation method which includes all of these processes in an integrated manner.
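
    At the heart of any kMC scheme is rate-weighted event selection with exponentially distributed waiting times. A generic Gillespie-style sketch (the event list and rates are placeholders, not the OLED processes of the paper):

```python
import math
import random

random.seed(0)

# Placeholder event rates (1/s), e.g. a hop, radiative decay, non-radiative decay
rates = {"hop": 1e9, "radiative": 1e6, "non_radiative": 2e5}

t, counts = 0.0, {k: 0 for k in rates}
total = sum(rates.values())
for _ in range(100_000):
    t += -math.log(1.0 - random.random()) / total   # waiting time ~ Exp(total rate)
    r, acc = random.random() * total, 0.0
    for event, rate in rates.items():               # choose event with prob rate/total
        acc += rate
        if r < acc:
            counts[event] += 1
            break

print(t, counts)   # event frequencies come out proportional to their rates
```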

  1. Quality control of MC and A system and integrated safeguards

    International Nuclear Information System (INIS)

    Osabe, Takeshi

    2000-01-01

    In the integrated safeguards regime, co-operation with the SSAC is a vital element in achieving efficiency of safeguards implementation while maintaining effectiveness. However, the degree of co-operation depends fully upon the credibility, technical capability and transparency of the SSAC. Since the credibility of an SSAC (State's System of Accounting for and Control of Nuclear Material) depends heavily on the effectiveness of facility operators' Material Control and Accounting (MC and A) practices, measures providing continuous assurance of the function and effectiveness of the system, such as a quality assurance program including a periodical system audit (diagnostic) function, ought to be established. This paper discusses a quality assurance program for facility-level MC and A, including an audit (diagnostic) method to maintain continuous assurance of its effectiveness. (author)

  2. Manual del McVCO 1999

    Science.gov (United States)

    McChesney, P.J.

    1999-01-01

    The McVCO is a microcontroller-based frequency generator that replaces the voltage-controlled oscillator (VCO) used in analog telemetry of seismic data. It accepts low-power signals from a seismometer and produces a frequency-modulated subcarrier signal suitable for telephone or radio links to a remote data-collection site. The subcarrier frequency and the gain can be selected with a switch. It offers optional two-channel operation for high- and low-gain observation. The McVCO was designed with the purpose of improving the analog telemetry of signals within the Pacific Northwest Seismograph Network (PNSN). Its development was supported by the Geophysics Program of the University of Washington and by the Volcano Hazards and Earthquake Hazards programs of the United States Geological Survey (USGS). Hundreds of instruments have been built and installed. Besides its use by the PNSN, the McVCO is used by the Alaska Volcano Observatory to monitor Aleutian volcanoes and by the USGS Volcano Disaster Assistance Program to respond to volcanic crises in other countries. This manual covers the operation of the McVCO, serves as a technical reference for those who need to know in more detail how the McVCO works, and covers a number of topics that require explicit treatment or that arise from the deployment of the instrument.

  3. J.B. McLachlan: a biography

    Energy Technology Data Exchange (ETDEWEB)

    Frank, D.

    1999-07-01

    This social history and biography of James Bryson McLachlan (1869-1937) describes McLachlan's leadership as an educator and instigator in organizing Nova Scotia's coal miners during the labour wars of the 1920s. McLachlan's background and childhood, education, reputation, religion, family life, health, and death are described. Included are descriptions of the life of coal miners in Cape Breton, radical left politics in Canada and the organizers involved, the political economy of the coal industry, child labour, churches, coal markets and prices, company towns and housing, mining disasters and fatalities, elections, First World War efforts, the depression, immigrants, and strikes. The labour organizations, companies, churches, and politicians involved in the struggles for union acceptance are discussed. 872 refs., 7 figs., 24 photos.

  4. Influence of radioactive sources discretization in the Monte Carlo computational simulations of brachytherapy procedures: a case study on the procedures for treatment of prostate cancer

    International Nuclear Information System (INIS)

    Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade

    2011-01-01

    Radiotherapy computational simulation procedures using Monte Carlo (MC) methods have shown themselves to be increasingly important to the improvement of cancer-fighting strategies. One of the biases in this practice is the discretization of the radioactive source in brachytherapy simulations, which often does not match the real situation. This study aimed to identify and measure the influence of radioactive source discretization in brachytherapy MC simulations, compared with simulations without discretization, using prostate brachytherapy with the Iodine-125 radionuclide as a model. Simulations were carried out with 10⁸ events for both types of sources, using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions with some anatomic adaptations. Significant alterations were found, especially regarding the bladder, the rectum and the prostate itself. It can be concluded that sources need to be discretized in brachytherapy simulations to ensure their representativeness. (author)

  5. Adaptive implicit method for thermal compositional reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, A.; Tchelepi, H.A. [Society of Petroleum Engineers, Richardson, TX (United States)]|[Stanford Univ., Palo Alto (United States)

    2008-10-15

    As the global demand for oil increases, thermal enhanced oil recovery techniques are becoming increasingly important. Numerical reservoir simulation of thermal methods such as steam-assisted gravity drainage (SAGD) is complex and requires the solution of nonlinear mass and energy conservation equations on a fine reservoir grid. The technique currently used most often for solving these equations is the fully IMplicit (FIM) method, which is unconditionally stable, allowing large timesteps in simulation. However, it is computationally expensive. On the other hand, the method known as IMplicit pressure, explicit saturations, temperature and compositions (IMPEST) is computationally inexpensive, but it is only conditionally stable and restricts the timestep size. To improve the balance between timestep size and computational cost, the thermal adaptive IMplicit (TAIM) method uses stability criteria and a switching algorithm, whereby some simulation variables, such as pressure, saturations, temperature and compositions, are treated implicitly while others are treated with explicit schemes. This presentation described ongoing research on TAIM with particular reference to thermal displacement processes, covering the stability criteria that dictate the maximum allowed timestep size for simulation, based on the von Neumann linear stability analysis method; the switching algorithm that adapts the labeling of reservoir variables as implicit or explicit as a function of space and time; and complex physical behaviors such as heat and fluid convection, thermal conduction and compressibility. Key numerical results obtained by enhancing Stanford's General Purpose Research Simulator (GPRS) were also presented, along with a list of research challenges. 14 refs., 2 tabs., 11 figs., 1 appendix.

  6. Amorphous silicon EPID calibration for dosimetric applications: comparison of a method based on Monte Carlo prediction of response with existing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Parent, L [Joint Department of Physics, Institute of Cancer Research and Royal Marsden NHS Foundation Trust, Sutton (United Kingdom); Fielding, A L [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane (Australia); Dance, D R [Joint Department of Physics, Institute of Cancer Research and Royal Marsden NHS Foundation Trust, London (United Kingdom); Seco, J [Department of Radiation Oncology, Francis Burr Proton Therapy Center, Massachusetts General Hospital, Harvard Medical School, Boston (United States); Evans, P M [Joint Department of Physics, Institute of Cancer Research and Royal Marsden NHS Foundation Trust, Sutton (United Kingdom)

    2007-07-21

    For EPID dosimetry, the calibration should ensure that all pixels have a similar response to a given irradiation. A calibration method (MC), using an analytical fit of a Monte Carlo simulated flood-field EPID image to correct for the flood-field image pixel intensity shape, was proposed. It was compared with the standard flood-field calibration (FF), with the use of a water slab placed in the beam to flatten the flood field (WS), and with a multiple-field calibration in which the EPID was irradiated with a fixed 10 x 10 field at 16 different positions (MF). The EPID was used in its normal configuration (clinical setup) and with an additional 3 mm copper slab (modified setup). Beam asymmetry measured with a diode array was taken into account in the MC and WS methods. For both setups, the MC method provided pixel sensitivity values within 3% of those obtained with the MF and WS methods (mean difference <1%, standard deviation <2%). The difference in pixel sensitivity between the MC and FF methods was up to 12.2% (clinical setup) and 11.8% (modified setup). MC calibration provided images of open fields (5 x 5 to 20 x 20 cm²) and IMRT fields to within 3% of those obtained with the WS and MF calibrations, while differences from images calibrated with the FF method were up to 8% for fields larger than 10 x 10 cm². The MC, WS and MF methods all provided a major improvement on the FF method. The advantages and drawbacks of each method are reviewed.
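
    The shared idea of the MC, WS and MF schemes is to separate true pixel-to-pixel sensitivity from the beam-shape component baked into a flood image. A toy 1D version of the MC variant (synthetic beam shape and sensitivities; a simple polynomial fit stands in for the analytical fit of the simulated flood field):

```python
import numpy as np

rng = np.random.default_rng(5)

n = 512
x = np.linspace(-1.0, 1.0, n)
beam_shape = 1.0 - 0.15 * x**2                     # domed beam profile (assumed)
sens_true = 1.0 + 0.03 * rng.standard_normal(n)    # true pixel sensitivities

flood = sens_true * beam_shape                     # what a flood exposure records

# FF calibration wrongly treats the whole flood image as pixel sensitivity;
# an MC-style calibration divides out a fitted model of the beam shape first.
coeffs = np.polynomial.polynomial.polyfit(x, flood, 4)
beam_model = np.polynomial.polynomial.polyval(x, coeffs)
sens_mc = flood / beam_model

print(np.std(flood / flood.mean() - sens_true))    # FF-style error: inherits beam shape
print(np.std(sens_mc - sens_true))                 # MC-style error: much smaller
```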

  7. SU-E-T-558: Monte Carlo Photon Transport Simulations On GPU with Quadric Geometry

    International Nuclear Information System (INIS)

    Chi, Y; Tian, Z; Jiang, S; Jia, X

    2015-01-01

    Purpose: Monte Carlo simulation on GPU has experienced rapid advancements over the past few years and tremendous accelerations have been achieved. Yet existing packages were developed only in voxelized geometry. In some applications, e.g. radioactive seed modeling, simulations in more complicated geometry are needed. This abstract reports our initial efforts towards developing a quadric geometry module aimed at expanding the application scope of GPU-based MC simulations. Methods: We defined the simulation geometry as consisting of a number of homogeneous bodies, each specified by its material composition and limiting surfaces characterized by quadric functions. A tree data structure was utilized to define the geometric relationship between different bodies. We modified our GPU-based photon MC transport package to incorporate this geometry. Specifically, geometry parameters were loaded into the GPU's shared memory for fast access. Geometry functions were rewritten to enable the identification of the body that contains the current particle location via a fast searching algorithm based on the tree data structure. Results: We tested our package on an example problem of HDR-brachytherapy dose calculation for a shielded cylinder. The dose under the quadric geometry and that under the voxelized geometry agreed in 94.2% of total voxels within the 20% isodose line based on a statistical t-test (95% confidence level), where the reference dose was defined to be the one at 0.5 cm away from the cylinder surface. It took 243 s to transport 100 million source photons under this quadric geometry on an NVidia Titan GPU card. Compared with the simulation time of 99.6 s in the voxelized geometry, including quadric geometry reduced efficiency due to the complicated geometry-related computations. Conclusion: Our GPU-based MC package has been extended to support photon transport simulation in quadric geometry. Satisfactory accuracy was observed with a reduced efficiency. Developments for charged particle transport are ongoing.
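
    A quadric body test reduces to evaluating the sign of f(p) = pᵀAp + b·p + c for each limiting surface. A small sketch of such a membership test (the sphere-cut-by-plane body is an invented example, not the shielded-cylinder case):

```python
import numpy as np

class Quadric:
    """Surface f(p) = p^T A p + b.p + c = 0; the convention here is that
    'inside' means f(p) <= 0."""
    def __init__(self, A, b, c):
        self.A = np.asarray(A, float)
        self.b = np.asarray(b, float)
        self.c = float(c)

    def f(self, p):
        p = np.asarray(p, float)
        return p @ self.A @ p + self.b @ p + self.c

# Body = unit sphere cut by the half-space z <= 0.2, both written as quadrics.
sphere = Quadric(np.eye(3), np.zeros(3), -1.0)             # x^2+y^2+z^2 - 1 <= 0
plane = Quadric(np.zeros((3, 3)), [0.0, 0.0, 1.0], -0.2)   # z - 0.2 <= 0

def inside_body(p, surfaces=(sphere, plane)):
    return all(s.f(p) <= 0.0 for s in surfaces)

print(inside_body([0.0, 0.0, 0.0]))   # True
print(inside_body([0.0, 0.0, 0.5]))   # False: above the cutting plane
```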

  9. Constraint methods that accelerate free-energy simulations of biomolecules.

    Science.gov (United States)

    Perez, Alberto; MacCallum, Justin L; Coutsias, Evangelos A; Dill, Ken A

    2015-12-28

    Atomistic molecular dynamics simulations of biomolecules are critical for generating narratives about biological mechanisms. The power of atomistic simulations is that these are physics-based methods that satisfy Boltzmann's law, so they can be used to compute populations, dynamics, and mechanisms. But physical simulations are computationally intensive and do not scale well to the sizes of many important biomolecules. One way to speed up physical simulations is by coarse-graining the potential function. Another way is to harness structural knowledge, often by imposing spring-like restraints. But harnessing external knowledge in physical simulations is problematic because knowledge, data, or hunches have errors, noise, and combinatoric uncertainties. Here, we review recent principled methods for imposing restraints to speed up physics-based molecular simulations that promise to scale to larger biomolecules and motions.
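
    The spring-like restraints mentioned above amount to a harmonic penalty on a distance plus the corresponding force. A minimal sketch (force constant, target distance and units are arbitrary):

```python
import numpy as np

def harmonic_restraint(r_i, r_j, r0=5.0, k=10.0):
    """Energy and forces of a harmonic distance restraint
    U = k * (|r_i - r_j| - r0)**2 between two atoms."""
    d = r_i - r_j
    dist = np.linalg.norm(d)
    u = k * (dist - r0) ** 2
    f_i = -2.0 * k * (dist - r0) * d / dist   # -dU/dr_i via the chain rule
    return u, f_i, -f_i                       # Newton's third law for atom j

u, fi, fj = harmonic_restraint(np.array([0.0, 0.0, 0.0]), np.array([7.0, 0.0, 0.0]))
print(u, fi, fj)  # atoms 7.0 apart, target 5.0: U = 40, forces pull them together
```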

  10. Toward high-efficiency and detailed Monte Carlo simulation study of the granular flow spallation target

    Science.gov (United States)

    Cai, Han-Jie; Zhang, Zhi-Lei; Fu, Fen; Li, Jian-Yang; Zhang, Xun-Chao; Zhang, Ya-Ling; Yan, Xue-Song; Lin, Ping; Xv, Jian-Ya; Yang, Lei

    2018-02-01

    The dense granular flow spallation target is a new target concept chosen for the Accelerator-Driven Subcritical (ADS) project in China. For the R&D of this target concept, a dedicated Monte Carlo (MC) program named GMT was developed to perform simulation studies of the beam-target interaction. Owing to the complexity of the target geometry, the computational cost of MC simulation of particle tracks is very high. Thus, improvement of computational efficiency will be essential for detailed MC simulation studies of the dense granular target. Here we present the special design of the GMT program and its high-efficiency performance. In addition, the speedup potential of the GPU-accelerated spallation models is discussed.

  11. Impact of Different Spreading Codes Using FEC on DWT Based MC-CDMA System

    OpenAIRE

    Masum, Saleh; Kabir, M. Hasnat; Islam, Md. Matiqul; Shams, Rifat Ara; Ullah, Shaikh Enayet

    2012-01-01

    The effect of different spreading codes in a DWT-based MC-CDMA wireless communication system is investigated. In this paper, we present the Bit Error Rate (BER) performance of different spreading codes (Walsh-Hadamard codes, orthogonal Gold codes and Golay complementary sequences) using Forward Error Correction (FEC) in the proposed system. The data are analyzed and compared among the different spreading codes in both coded and uncoded cases. It is found via computer simulation that the performance...
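
    Walsh-Hadamard spreading codes come from the Sylvester recursion H_{2n} = [[H_n, H_n], [H_n, -H_n]]; the rows are mutually orthogonal, which is what makes them usable as spreading sequences. A short sketch:

```python
import numpy as np

def walsh_hadamard(n):
    """Sylvester construction; n must be a power of two. Rows are the codes."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = walsh_hadamard(8)
print(H @ H.T)   # 8 * identity: every pair of distinct codes is orthogonal
```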

  12. Application of the MCNPX-McStas interface for shielding calculations and guide design at ESS

    DEFF Research Database (Denmark)

    Klinkby, Esben Bryndt; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær

    2014-01-01

    Recently, an interface between the Monte Carlo code MCNPX and the neutron ray-tracing code McStas was developed [1, 2]. Based on the expected neutronic performance and guide geometries relevant for the ESS, the combined MCNPX-McStas code is used to calculate dose rates along neutron beam guides, and, by using a newly developed event-logging capability, the neutron state parameters corresponding to un-reflected neutrons are recorded at each scattering. This information is handed back to MCNPX, where it serves as neutron source input for a second MCNPX simulation. This simulation enables calculation of dose rates.

  13. Osteogenic gene expression of murine osteoblastic (MC3T3-E1) cells under cyclic tension

    International Nuclear Information System (INIS)

    Kao, C T; Chen, C C; Cheong, U-I; Liu, S L; Huang, T H

    2014-01-01

    Low-level laser therapy (LLLT) can promote cell proliferation. The remodeling ability of the tension side of orthodontically treated teeth affects post-orthodontic stability. The purpose of the present study was to investigate the osteogenic effects of LLLT on osteoblast-like cells treated with a simulated tension system that provides a mechanical tension regimen. Murine osteoblastic (MC3T3-E1) cells were cultured in a Flexcell strain unit with programmed loads of 12% elongation at a frequency of 0.5 Hz for 24 and 48 h. The cultured cells were treated with a low-level diode laser at energies of 5 J and 10 J. The proliferation of MC3T3-E1 cells was determined using the Alamar Blue assay. The expression of osteogenic genes (type I collagen (Col-1), osteopontin (OPN), osteocalcin (OC), osteoprotegerin (OPG), receptor activator of nuclear factor kappa B ligand (RANKL), bone morphogenetic protein 2 (BMP-2), and bone morphogenetic protein 4 (BMP-4)) in MC3T3-E1 cells was analyzed using reverse transcription polymerase chain reaction (RT-PCR). The data were analyzed using one-way analysis of variance. The proliferation rate of tension-cultured MC3T3-E1 cells under 5 J and 10 J LLLT increased compared with that of the control group (p < 0.05). Prominent mineralization of the MC3T3-E1 cells was visible with a von Kossa stain in the 5 J LLLT group. Osteogenic genes (Col-1, OC, OPG and BMP-2) were significantly expressed in the MC3T3-E1 cells treated with 5 J and 10 J LLLT (p < 0.05). LLLT in tension-cultured MC3T3-E1 cells showed synergistic osteogenic effects, including increases in cell proliferation and in Col-1, OPN, OC, OPG and BMP-2 gene expression. LLLT might be beneficial for bone remodeling on the tension side in orthodontics. (paper)

  14. CTMCONTROL: Addressing the MC/DC Objective for Safety-Critical Automotive Software

    OpenAIRE

    Mjeda , Anila; Hinchey , Mike

    2013-01-01

    We propose a method tailored to the requirements of safety-critical embedded automotive software, named CTMCONTROL. CTMCONTROL has a particular focus on the specification-based control logic of the system under test and offers improvements in testing coverage metrics over a classic method which is routinely used in industry. The proposed method targets the Modified Condition/Decision Coverage (MC/DC) objective for automotive safety-critical software. CTMCONTROL is va...

  15. Small photon beam measurements using radiochromic film and Monte Carlo simulations in a water phantom

    International Nuclear Information System (INIS)

    Garcia-Garduno, Olivia A.; Larraga-Gutierrez, Jose M.; Rodriguez-Villafuerte, Mercedes; Martinez-Davalos, Arnulfo; Celis, Miguel A.

    2010-01-01

    This work reports the use of both GafChromic EBT film immersed in a water phantom and Monte Carlo (MC) simulations for small photon beam stereotactic radiosurgery dosimetry. Circularly collimated photon beams with diameters in the 4-20 mm range of a dedicated 6 MV linear accelerator (Novalis®, BrainLAB, Germany) were used to perform off-axis ratio, tissue maximum ratio and total scatter factor measurements, and MC simulations. GafChromic EBT film data show an excellent agreement with MC results (<2.7%) for all measured quantities.

  16. Monte Carlo simulation of beam characteristics from small fields based on TrueBeam flattening-filter-free mode

    International Nuclear Information System (INIS)

    Feng, Zhongsu; Yue, Haizhen; Zhang, Yibao; Wu, Hao; Cheng, Jinsheng; Su, Xu

    2016-01-01

    Through the Monte Carlo (MC) simulation of the 6 and 10 MV flattening-filter-free (FFF) beams of a Varian TrueBeam accelerator, this study aims to find the best incident electron distribution for further studying the small-field characteristics of these beams. By incorporating the training materials of Varian on the geometry and material parameters of the TrueBeam linac head, the 6 and 10 MV FFF beams were modelled using the BEAMnrc and DOSXYZnrc codes, where the percentage depth doses (PDDs) and the off-axis ratios (OARs) of fields ranging from 4 × 4 to 40 × 40 cm² were simulated for both energies by adjusting the incident beam energy, radial intensity distribution and angular spread, respectively. The beam quality and relative output factor (ROF) were calculated. The simulations and measurements were compared using the Gamma analysis method provided by the Verisoft program (PTW, Freiburg, Germany), based on which the optimal MC model input parameters were selected and further used to investigate the beam characteristics of small fields. The Full Width at Half Maximum (FWHM), mono-energetic energy and angular spread of the resultant incident Gaussian radial intensity electron distribution were 0.75 mm, 6.1 MeV and 0.9° for the nominal 6 MV FFF beam, and 0.7 mm, 10.8 MeV and 0.3° for the nominal 10 MV FFF beam, respectively. The simulation was mostly comparable to the measurement. Gamma criteria of 1 mm/1% (local dose) can be met by all PDDs of fields larger than 1 × 1 cm², and by all OARs of fields no larger than 20 × 20 cm²; otherwise criteria of 1 mm/2% can be fulfilled. Our MC-simulated ROFs agreed well with the measured ROFs for various field sizes (the discrepancies were less than 1%), except for the 1 × 1 cm² field. The MC simulation agrees well with the measurement and the proposed model parameters can be used clinically for further dosimetric studies of 6 and 10 MV FFF beams.
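
The Gamma analysis used above combines a dose-difference criterion with a distance-to-agreement criterion. The brute-force 1D sketch below is a simplified stand-in for the Verisoft evaluation; the profiles and the 1 mm/1% local-dose criterion values are illustrative only.

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm=1.0, dd_frac=0.01):
    """Brute-force 1D gamma index with a local dose-difference criterion."""
    gammas = []
    for r, d in zip(ref_pos, ref_dose):
        dd = (eval_dose - d) / (dd_frac * d)   # local dose-difference term
        dist = (eval_pos - r) / dta_mm         # distance-to-agreement term
        gammas.append(np.sqrt(dd ** 2 + dist ** 2).min())
    return np.array(gammas)

x = np.linspace(-20.0, 20.0, 401)              # positions in mm
ref = np.exp(-(x / 12.0) ** 2)                 # toy reference profile
ev = np.exp(-((x - 0.3) / 12.0) ** 2)          # slightly shifted "measurement"
g = gamma_1d(x, ref, x, ev)
print(f"1 mm/1% pass rate: {(g <= 1.0).mean():.1%}")
```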

  17. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
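
A minimal sketch of the MC side of such a comparison: for each candidate design, the expected net present value is estimated over sampled realizations of the uncertain parameters, and the design with the best expectation is kept. The economic model, distributions and numbers below are all hypothetical, not those of the paper's gas field example.

```python
import numpy as np

rng = np.random.default_rng(1)

def npv(capacity, price, reserves):
    """Toy NPV: production limited by capacity and reserves, minus capex."""
    production = np.minimum(capacity, reserves / 10.0)  # plateau production rate
    revenue = production * price * 8.0                  # discounted-revenue proxy
    capex = 50.0 + 3.0 * capacity
    return revenue - capex

designs = np.linspace(5.0, 40.0, 36)           # candidate plateau capacities
price = rng.normal(6.0, 1.5, size=5000)        # uncertain gas price
reserves = rng.lognormal(5.5, 0.4, size=5000)  # uncertain recoverable reserves

expected = [npv(c, price, reserves).mean() for c in designs]
print(f"best capacity under uncertainty: {designs[int(np.argmax(expected))]:.1f}")
```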

  18. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  19. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods

    International Nuclear Information System (INIS)

    Berthiau, G.

    1995-10-01

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times ...). This task is equivalent to a multidimensional and/or multi-objective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can eventually be specified. A similar problem consists in fitting component models. In this case, the optimization variables are the model parameters and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is the simulated annealing method. This method, borrowed from the combinatorial optimization domain, has been adapted and compared with other global optimization methods for continuous-variable problems. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known, classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows the chaining of simulations required by the circuit optimization process. We proposed, for high-dimensional problems, a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we have adapted three other methods coming from the combinatorial optimization domain (the threshold method, a genetic algorithm and the Tabu search method). The tests have been performed on the same set of test functions and the results allow a first comparison between these methods applied to continuous optimization variables. Finally, our simulated annealing program

  20. Carbohydrate- and protein-rich diets in McArdle disease: Effects on exercise capacity

    DEFF Research Database (Denmark)

    Andersen, S.T.; Vissing, J.

    2008-01-01

    metabolism during exercise, which questions the effect of protein in McArdle disease. METHODS: In a crossover, open design, we studied 7 patients with McArdle disease, who were randomised to follow either a carbohydrate- or protein-rich diet for three days before testing. Caloric intake on each diet...... was identical, and was adjusted to the subject's weight, age and sex. After each diet, exercise tolerance and maximal work capacity were tested on a bicycle ergometer, using a constant workload for 15 minutes followed by an incremental workload to exhaustion. RESULTS: During the constant workload, heart rate...... capacity and exercise tolerance to submaximal workloads by maintaining a diet high in carbohydrate instead of protein. The carbohydrate diet not only improves tolerance to every-day activities, but will likely also help to prevent exercise-induced episodes of muscle injury in McArdle disease.

  1. The influence of room temperature on Mg isotope measurements by MC-ICP-MS.

    Science.gov (United States)

    Zhang, Xing-Chao; Zhang, An-Yu; Zhang, Zhao-Feng; Huang, Fang; Yu, Hui-Min

    2018-03-24

    We observed that the accuracy and precision of magnesium (Mg) isotope analyses can be affected if the room temperature oscillates during measurements. To achieve high quality Mg isotopic data, it is critical to evaluate how an unstable room temperature affects Mg isotope measurements by multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). We measured the Mg isotopes of the reference material DSM-3 using MC-ICP-MS under oscillating room temperatures in spring. For comparison, we also measured the Mg isotopes under stable room temperatures, which was achieved by the installation of an improved temperature control system in the laboratory. The δ26Mg values measured under oscillating room temperatures have a larger deviation (δ26Mg from -0.09 to 0.08‰, with average δ26Mg = 0.00 ± 0.08‰) than those measured under a stable room temperature (δ26Mg from -0.03 to 0.03‰, with average δ26Mg = 0.00 ± 0.02‰) using the same MC-ICP-MS system. Room temperature variation can influence the stability of the MC-ICP-MS. Therefore, it is critical to keep the room temperature stable to acquire highly precise and accurate isotopic data with MC-ICP-MS, especially when using the sample-standard bracketing (SSB) correction method.
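
For reference, the SSB correction itself reduces to a ratio calculation: each sample measurement is normalized to the mean of the two bracketing standard runs, which assumes the instrumental drift is linear between them. A minimal sketch (the ratio values are invented for illustration):

```python
def delta_ssb(r_sample: float, r_std_before: float, r_std_after: float) -> float:
    """Delta value in per mil via sample-standard bracketing; drift is assumed
    linear between the two bracketing standard measurements."""
    r_std = 0.5 * (r_std_before + r_std_after)
    return (r_sample / r_std - 1.0) * 1000.0

# e.g. 26Mg/24Mg ratios measured in a standard-sample-standard sequence
print(delta_ssb(0.139932, 0.139860, 0.139880))  # about 0.44 per mil
```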

  2. Hybrid Method Simulation of Slender Marine Structures

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye

    This present thesis consists of an extended summary and five appended papers concerning various aspects of the implementation of a hybrid method which combines classical simulation methods and artificial neural networks. The thesis covers three main topics. Common for all these topics...... only recognize patterns similar to those comprised in the data used to train the network. Fatigue life evaluation of marine structures often considers simulations of more than a hundred different sea states. Hence, in order for this method to be useful, the training data must be arranged so...... that a single neural network can cover all relevant sea states. The applicability and performance of the present hybrid method is demonstrated on a numerical model of a mooring line attached to a floating offshore platform. The second part of the thesis demonstrates how sequential neural networks can be used...

  3. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.

  4. Fast Food McDonald's China Fix

    Institute of Scientific and Technical Information of China (English)

    DAVID HENDRICKSON

    2006-01-01

    Since the opening of its first outlet 16 years ago, McDonald's China operation has on many levels proven enormously successful. Home to more than 750 locations nationwide, the Middle Kingdom today ranks as one of McDonald's ten largest markets, with returns hovering in double digits and raking in billions annually. As lucrative as it may be, however, China has nonetheless developed into a relative sore spot for the world's leading fast food giant.

  5. DRK methods for time-domain oscillator simulation

    NARCIS (Netherlands)

    Sevat, M.F.; Houben, S.H.M.J.; Maten, ter E.J.W.; Di Bucchianico, A.; Mattheij, R.M.M.; Peletier, M.A.

    2006-01-01

    This paper presents a new Runge-Kutta type integration method that is well-suited for time-domain simulation of oscillators. A unique property of the new method is that its damping characteristics can be controlled by a continuous parameter.

  6. Development of water movement model as a module of moisture content simulation in static pile composting.

    Science.gov (United States)

    Seng, Bunrith; Kaneko, Hidehiro; Hirayama, Kimiaki; Katayama-Hirayama, Keiko

    2012-01-01

    This paper presents a mathematical model of vertical water movement and a performance evaluation of the model in static pile composting operated with neither air supply nor turning. The vertical moisture content (MC) model was developed with consideration of evaporation (internal and external evaporation), diffusion (liquid and vapour diffusion) and percolation, whereas additional water from substrate decomposition and irrigation was not taken into account. The evaporation term in the model was established on the basis of reference evaporation of the materials at known temperature, MC and relative humidity of the air. Diffusion of water vapour was estimated as functions of relative humidity and temperature, whereas diffusion of liquid water was empirically obtained from experiment by adopting Fick's law. Percolation was estimated by following Darcy's law. The model was applied to a column of composting wood chips with an initial MC of 60%. The simulation program was run for four weeks with calculation span of 1 s. The simulated results were in reasonably good agreement with the experimental results. Only a top layer (less than 20 cm) had a considerable MC reduction; the deeper layers were comparable to the initial MC, and the bottom layer was higher than the initial MC. This model is a useful tool to estimate the MC profile throughout the composting period, and could be incorporated into biodegradation kinetic simulation of composting.
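
The balance described above (liquid and vapour diffusion, Darcy-type percolation, surface evaporation) can be sketched as an explicit finite-difference update of a 1D moisture profile. The version below is strongly simplified, with invented parameter values rather than the authors' calibrated model; it only illustrates the flux bookkeeping and the drying of the top layer.

```python
import numpy as np

# Hypothetical parameters: liquid diffusivity D (m^2/s), percolation velocity v
# (m/s) and a surface evaporation rate e_top (1/s), on a 1 m pile of 50 layers.
n, dz, dt = 50, 0.02, 60.0
D, v, e_top = 1e-8, 2e-9, 1e-7
mc = np.full(n, 0.60)                            # initial moisture content

for _ in range(4 * 7 * 24 * 60):                 # four weeks of 1-minute steps
    flux = -D * np.gradient(mc, dz) + v * mc     # Fickian diffusion + percolation
    mc -= dt * np.gradient(flux, dz)             # explicit conservation update
    mc[0] -= dt * e_top * mc[0]                  # evaporative loss at the surface

print(f"top {mc[0]:.2f}, middle {mc[n // 2]:.2f}, bottom {mc[-1]:.2f}")
```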

  7. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron–electron interactions, application to graphene

    International Nuclear Information System (INIS)

    Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek

    2017-01-01

    Standard computational methods used to incorporate the Pauli Exclusion Principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation. Hence, this part of the simulation algorithm is described in detail.
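
The standard recipe for Pauli blocking in an MC transport loop is a rejection step: a tentative final state k' is accepted with probability 1 - f(k'), which suppresses scattering into already-occupied states. A minimal sketch of that step follows; the occupation function and candidate states are illustrative, and this is not the authors' improved e-e algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def pauli_blocked_scatter(candidates, f_of_k):
    """Pick a final state for a scattering event, accepting state k' with
    probability 1 - f(k') so that occupied states are avoided (Pauli blocking)."""
    for k_prime in candidates:
        if rng.random() < 1.0 - f_of_k(k_prime):
            return k_prime           # scattering proceeds into k'
    return None                      # all candidates blocked: self-scattering

# toy degenerate gas: Fermi-Dirac occupation versus |k| (arbitrary units)
f = lambda k: 1.0 / (1.0 + np.exp((np.abs(k) - 1.0) / 0.05))
print(pauli_blocked_scatter(rng.uniform(0.0, 2.0, size=20), f))
```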

  8. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron-electron interactions, application to graphene

    Science.gov (United States)

    Borowik, Piotr; Thobel, Jean-Luc; Adamowicz, Leszek

    2017-07-01

    Standard computational methods used to incorporate the Pauli Exclusion Principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron-electron (e-e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e-e interactions. This required adapting the treatment of e-e scattering to the case of a linear band dispersion relation. Hence, this part of the simulation algorithm is described in detail.

  9. Modified Monte Carlo method for study of electron transport in degenerate electron gas in the presence of electron–electron interactions, application to graphene

    Energy Technology Data Exchange (ETDEWEB)

    Borowik, Piotr, E-mail: pborow@poczta.onet.pl [Warsaw University of Technology, Faculty of Physics, ul. Koszykowa 75, 00-662 Warszawa (Poland); Thobel, Jean-Luc, E-mail: jean-luc.thobel@iemn.univ-lille1.fr [Institut d' Electronique, de Microélectronique et de Nanotechnologies, UMR CNRS 8520, Université Lille 1, Avenue Poincaré, CS 60069, 59652 Villeneuve d' Ascq Cédex (France); Adamowicz, Leszek, E-mail: adamo@if.pw.edu.pl [Warsaw University of Technology, Faculty of Physics, ul. Koszykowa 75, 00-662 Warszawa (Poland)

    2017-07-15

    Standard computational methods used to incorporate the Pauli Exclusion Principle into Monte Carlo (MC) simulations of electron transport in semiconductors may give unphysical results in the low-field regime, where the obtained electron distribution function takes values exceeding unity. Modified algorithms have already been proposed that correctly account for electron scattering on phonons or impurities. The present paper extends this approach and proposes an improved simulation scheme that includes the Pauli exclusion principle for electron–electron (e–e) scattering in MC simulations. Simulations with significantly reduced computational cost recreate correct values of the electron distribution function. The proposed algorithm is applied to study transport properties of degenerate electrons in graphene with e–e interactions. This required adapting the treatment of e–e scattering to the case of a linear band dispersion relation. Hence, this part of the simulation algorithm is described in detail.

  10. Applications of Monte Carlo method to nonlinear regression of rheological data

    Science.gov (United States)

    Kim, Sangmo; Lee, Junghaeng; Kim, Sihyun; Cho, Kwang Soo

    2018-02-01

    In rheological studies, it is often necessary to determine the parameters of rheological models from experimental data. Since both the rheological data and the parameter values vary on a logarithmic scale and the number of parameters is quite large, conventional methods of nonlinear regression such as the Levenberg-Marquardt (LM) method are usually ineffective. A gradient-based method such as LM is apt to be caught in local minima, which give unphysical values of the parameters whenever the initial guess is far from the global optimum. Although this problem can be solved by simulated annealing (SA), this Monte Carlo (MC) method needs adjustable parameters which must be determined in an ad hoc manner. We suggest a simplified version of SA, a kind of MC method, which yields effective values of the parameters of most complicated rheological models such as the Carreau-Yasuda model of steady shear viscosity, the discrete relaxation spectrum and zero-shear viscosity as a function of concentration and molecular weight.
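
A bare-bones version of such an approach, assuming the Carreau-Yasuda model and synthetic data: the parameters are perturbed in logarithmic scale and the cost is a log-space residual, which addresses the scale issues mentioned above. The cooling schedule and step sizes are arbitrary choices, not the authors' simplified-SA settings.

```python
import numpy as np

rng = np.random.default_rng(2)

def carreau_yasuda(gdot, eta0, lam, n, a):
    return eta0 * (1.0 + (lam * gdot) ** a) ** ((n - 1.0) / a)

def cost(logp, gdot, eta_meas):
    eta0, lam, n, a = np.exp(logp[0]), np.exp(logp[1]), logp[2], np.exp(logp[3])
    return np.mean(np.log(carreau_yasuda(gdot, eta0, lam, n, a) / eta_meas) ** 2)

gdot = np.logspace(-2, 3, 40)                         # shear rates
eta_meas = carreau_yasuda(gdot, 1e4, 10.0, 0.4, 2.0)  # synthetic "data"
eta_meas *= np.exp(0.02 * rng.standard_normal(40))    # multiplicative noise
p = np.array([np.log(1e3), 0.0, 0.7, 0.0])            # rough initial guess
c, T = cost(p, gdot, eta_meas), 1.0
for _ in range(20000):                                # annealing loop
    trial = p + 0.05 * rng.standard_normal(4)
    ct = cost(trial, gdot, eta_meas)
    if ct < c or rng.random() < np.exp(-(ct - c) / T):
        p, c = trial, ct
    T *= 0.9997                                       # geometric cooling
print(np.exp(p[0]), np.exp(p[1]), p[2], np.exp(p[3]))  # eta0, lambda, n, a
```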

  11. Effects of Kinesio Taping versus McConnell Taping for Patellofemoral Pain Syndrome: A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Wen-Dien Chang

    2015-01-01

    Objectives. To conduct a systematic review comparing the effects of Kinesio taping with McConnell taping as a method of conservative management of patients with patellofemoral pain syndrome (PFPS). Methods. MEDLINE, PUBMED, EMBASE, AMED, and the Cochrane Central Register of Controlled Trials electronic databases were searched through July 2014. Controlled studies evaluating the effects of Kinesio or McConnell taping in PFPS patients were retrieved. Results. Ninety-one articles were retrieved from the databases, and 11 articles were included in the analysis. The methods, evaluations, and results of the articles were collected, and the outcomes of patellar tapings were analyzed. Kinesio taping can reduce pain and increase the muscular flexibility of PFPS patients, and McConnell taping also had an effect on pain relief and patellar alignment. Meta-analysis showed a small effect on pain reduction and motor function improvement and a moderate effect on muscle activity change among PFPS patients using Kinesio taping. Conclusions. The Kinesio taping technique used for muscles can relieve pain but cannot change patellar alignment, unlike McConnell taping. Both patellar tapings are used differently for PFPS patients and substantially improve muscle activity, motor function, and quality of life.

  12. [Comparison of the Pressure on the Larynx and Tongue Using McGRATH® MAC Video Laryngoscope--Direct Vision versus Indirect Vision].

    Science.gov (United States)

    Tanaka, Yasutomo; Miyazaki, Yukiko; Kitakata, Hidenori; Shibuya, Hiromi; Okada, Toshiki

    2015-12-01

    Studies show that McGRATH® MAC (McG) is useful during direct laryngoscopy. However, no study has examined whether McG reduces pressure on the upper airway tract. We compared direct vision with indirect vision concerning pressure on the larynx and tongue. Twenty-two anesthesiologists and 16 junior residents attempted direct laryngoscopy of an airway management simulator using McG with direct vision and indirect vision. Pressure was measured using pressure measurement film. In the anesthesiologists group, pressure on the larynx was 14.8 ± 2.7 kgf·cm(-2) with direct vision and 12.7 ± 2.7 kgf·cm(-2) with indirect vision (P < …); pressure on the tongue was … with direct vision and 7.6 ± 2.8 kgf·cm(-2) with indirect vision (P = 0.18). In the junior residents group, pressure on the larynx was 19.0 ± 1.3 kgf·cm(-2) with direct vision and 14.1 ± 3.1 kgf·cm(-2) with indirect vision (P < …); pressure on the tongue was … with direct vision and 11.2 ± 4.7 kgf·cm(-2) with indirect vision (P < …). McG with indirect vision can reduce pressure on the upper airway tract.

  13. SU-E-T-644: QuAArC: A 3D VMAT QA System Based On Radiochromic Film and Monte Carlo Simulation of Log Files

    Energy Technology Data Exchange (ETDEWEB)

    Barbeiro, A.R.; Ureba, A.; Baeza, J.A.; Jimenez-Ortega, E.; Plaza, A. Leal [Universidad de Sevilla, Departamento de Fisiologia Medica y Biofisica, Seville (Spain); Linares, R. [Hospital Infanta Luisa, Servicio de Radiofisica, Seville (Spain); Mateos, J.C.; Velazquez, S. [Hospital Universitario Virgen del Rocio, Servicio de Radiofisica, Seville (Spain)

    2015-06-15

    Purpose: VMAT involves two main sources of uncertainty: one related to the dose calculation accuracy, and the other linked to the continuous delivery of a discrete calculation. The purpose of this work is to present QuAArC, an alternative VMAT QA system to control and potentially reduce these uncertainties. Methods: An automated MC simulation of log files, recorded during VMAT treatment plans delivery, was implemented in order to simulate the actual treatment parameters. The linac head models and the phase-space data of each Control Point (CP) were simulated using the EGSnrc/BEAMnrc MC code, and the corresponding dose calculation was carried out by means of BEAMDOSE, a DOSXYZnrc code modification. A cylindrical phantom was specifically designed to host films rolled up at different radial distances from the isocenter, for a 3D and continuous dosimetric verification. It also allows axial and/or coronal films and point measurements with several types of ion chambers at different locations. Specific software was developed in MATLAB in order to process and evaluate the dosimetric measurements, which incorporates the analysis of dose distributions, profiles, dose difference maps, and 2D/3D gamma index. It is also possible to obtain the experimental DVH reconstructed on the patient CT, by an optimization method to find the individual contribution corresponding to each CP on the film, taking into account the total measured dose, and the corresponding CP dose calculated by MC. Results: The QuAArC system showed high reproducibility of measurements, and consistency with the results obtained with the commercial system implemented in the verification of the evaluated treatment plans. Conclusion: A VMAT QA system based on MC simulation and high resolution dosimetry with film has been developed for treatment verification. It shows to be useful for the study of the real VMAT capabilities, and also for linac commissioning and evaluation of other verification devices.

  14. Numerical simulation methods for electron and ion optics

    International Nuclear Information System (INIS)

    Munro, Eric

    2011-01-01

    This paper summarizes currently used techniques for simulation and computer-aided design in electron and ion beam optics. Topics covered include: field computation, methods for computing optical properties (including Paraxial Rays and Aberration Integrals, Differential Algebra and Direct Ray Tracing), simulation of Coulomb interactions, space charge effects in electron and ion sources, tolerancing, wave optical simulations and optimization. Simulation examples are presented for multipole aberration correctors, Wien filter monochromators, imaging energy filters, magnetic prisms, general curved axis systems and electron mirrors.

  15. A Monte Carlo method and finite volume method coupled optical simulation method for parabolic trough solar collectors

    International Nuclear Information System (INIS)

    Liang, Hongbo; Fan, Man; You, Shijun; Zheng, Wandong; Zhang, Huan; Ye, Tianzhen; Zheng, Xuejing

    2017-01-01

    Highlights: •Four optical models for parabolic trough solar collectors were compared in detail. •Characteristics of Monte Carlo Method and Finite Volume Method were discussed. •A novel method was presented combining advantages of different models. •The method was suited to optical analysis of collectors with different geometries. •A new kind of cavity receiver was simulated depending on the novel method. -- Abstract: The PTC (parabolic trough solar collector) is widely used for space heating, heat-driven refrigeration, solar power, etc. The concentrated solar radiation is the only energy source for a PTC, thus its optical performance significantly affects the collector efficiency. In this study, four different optical models were constructed, validated and compared in detail. On this basis, a novel coupled method was presented by combining advantages of these models, which was suited to carry out a mass of optical simulations of collectors with different geometrical parameters rapidly and accurately. Based on these simulation results, the optimal configuration of a collector with highest efficiency can be determined. Thus, this method was useful for collector optimization and design. In the four models, MCM (Monte Carlo Method) and FVM (Finite Volume Method) were used to initialize photons distribution, as well as CPEM (Change Photon Energy Method) and MCM were adopted to describe the process of reflecting, transmitting and absorbing. For simulating reflection, transmission and absorption, CPEM was more efficient than MCM, so it was utilized in the coupled method. For photons distribution initialization, FVM saved running time and computation effort, whereas it needed suitable grid configuration. MCM only required a total number of rays for simulation, whereas it needed higher computing cost and its results fluctuated in multiple runs. In the novel coupled method, the grid configuration for FVM was optimized according to the “true values” from MCM of

  16. Simulation methods for nuclear production scheduling

    International Nuclear Information System (INIS)

    Miles, W.T.; Markel, L.C.

    1975-01-01

    Recent developments and applications of simulation methods for use in nuclear production scheduling and fuel management are reviewed. The unique characteristics of the nuclear fuel cycle as they relate to the overall optimization of a mixed nuclear-fossil system in both the short-and mid-range time frame are described. Emphasis is placed on the various formulations and approaches to the mid-range planning problem, whose objective is the determination of an optimal (least cost) system operation strategy over a multi-year planning horizon. The decomposition of the mid-range problem into power system simulation, reactor core simulation and nuclear fuel management optimization, and system integration models is discussed. Present utility practices, requirements, and research trends are described. 37 references

  17. Monte Carlo simulation tool for online treatment monitoring in hadrontherapy with in-beam PET: A patient study.

    Science.gov (United States)

    Fiorina, E; Ferrero, V; Pennazio, F; Baroni, G; Battistoni, G; Belcari, N; Cerello, P; Camarlinghi, N; Ciocca, M; Del Guerra, A; Donetti, M; Ferrari, A; Giordanengo, S; Giraudo, G; Mairani, A; Morrocchi, M; Peroni, C; Rivetti, A; Da Rocha Rolo, M D; Rossi, S; Rosso, V; Sala, P; Sportelli, G; Tampellini, S; Valvo, F; Wheadon, R; Bisogni, M G

    2018-05-07

    Hadrontherapy is a method for treating cancer with very targeted dose distributions and enhanced radiobiological effects. To fully exploit these advantages, in vivo range monitoring systems are required. These devices measure, preferably during the treatment, the secondary radiation generated by the beam-tissue interactions. However, since correlation of the secondary radiation distribution with the dose is not straightforward, Monte Carlo (MC) simulations are very important for treatment quality assessment. The INSIDE project constructed an in-beam PET scanner to detect signals generated by the positron-emitting isotopes resulting from projectile-target fragmentation. In addition, a FLUKA-based simulation tool was developed to predict the corresponding reference PET images using a detailed scanner model. The INSIDE in-beam PET was used to monitor two consecutive proton treatment sessions on a patient at the Italian Center for Oncological Hadrontherapy (CNAO). The reconstructed PET images were updated every 10 s providing a near real-time quality assessment. By half-way through the treatment, the statistics of the measured PET images were already significant enough to be compared with the simulations with average differences in the activity range less than 2.5 mm along the beam direction. Without taking into account any preferential direction, differences within 1 mm were found. In this paper, the INSIDE MC simulation tool is described and the results of the first in vivo agreement evaluation are reported. These results have justified a clinical trial, in which the MC simulation tool will be used on a daily basis to study the compliance tolerances between the measured and simulated PET images.

  18. Improved importance sampling technique for efficient simulation of digital communication systems

    Science.gov (United States)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communication systems.
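
The benefit of a translation parameter is easy to demonstrate on the canonical toy problem of estimating a Gaussian tail probability, a stand-in for a rare error event. The sketch below compares plain MC with an IS estimator whose sampling density is shifted to the threshold and reweighted by the likelihood ratio; it is a textbook illustration, not the paper's communication-system derivation.

```python
import numpy as np

rng = np.random.default_rng(3)
t, n = 5.0, 100_000                 # threshold (rare event) and sample size

# plain Monte Carlo: essentially no samples land in the tail
x = rng.standard_normal(n)
p_mc = (x > t).mean()

# IS with mean translation: sample from N(t, 1), reweight by the likelihood
# ratio w(y) = phi(y) / phi(y - t) = exp(t^2 / 2 - t * y)
y = rng.standard_normal(n) + t
w = np.exp(0.5 * t ** 2 - t * y)
p_is = np.where(y > t, w, 0.0).mean()

print(f"MC: {p_mc:.2e}, IS: {p_is:.2e}")
print(f"asymptotic value phi(t)/t ~ {np.exp(-t * t / 2) / (t * np.sqrt(2 * np.pi)):.2e}")
```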

  19. McDonnell Douglas Helicopter Company independent research and development: Preparing for the future

    Science.gov (United States)

    Haggerty, Allen C.

    1988-01-01

    During the 1970's and 80's, research has produced the technology that is seen in aircraft such as the LHX and future models. The technology is discussed that is reaching maturity and moving into the application stage of future programs. Technology is discussed in six major areas: advanced concepts, analysis techniques, structures, systems, simulation, and research and development facilities. The partnership of McDonnell Douglas Helicopter Co. and the government in developing these technologies is illustrated in several programs.

  20. Universal Linear Precoding for NBI-Proof Widely Linear Equalization in MC Systems

    Directory of Open Access Journals (Sweden)

    Donatella Darsena

    2007-09-01

    In multicarrier (MC) systems, transmitter redundancy, which is introduced by means of finite-impulse response (FIR) linear precoders, allows for perfect or zero-forcing (ZF) equalization of FIR channels (in the absence of noise). Recently, it has been shown that the noncircular or improper nature of some symbol constellations offers an intrinsic source of redundancy, which can be exploited to design efficient FIR widely-linear (WL) receiving structures for MC systems operating in the presence of narrowband interference (NBI). With regard to both cyclic-prefixed and zero-padded transmission techniques, it is shown in this paper that, with appropriately designed precoders, it is possible to synthesize in both cases WL-ZF universal equalizers, which guarantee perfect symbol recovery for any FIR channel. Furthermore, it is theoretically shown that the intrinsic redundancy of the improper symbol sequence also enables WL-ZF equalization, based on the minimum mean output-energy criterion, with improved NBI suppression capabilities. Finally, results of numerical simulations are presented, which assess the merits of the proposed precoding designs and validate the theoretical analysis carried out.

  1. Official portrait of Astronaut Ronald E. McNair

    Science.gov (United States)

    1985-01-01

    Official portrait of Astronaut Ronald E. McNair. McNair is in the blue shuttle flight suit, standing in front of a table which holds a model of the Space Shuttle. An American flag is visible behind him.

  2. Amorphous silicon EPID calibration for dosimetric applications: comparison of a method based on Monte Carlo prediction of response with existing techniques

    International Nuclear Information System (INIS)

    Parent, L; Fielding, A L; Dance, D R; Seco, J; Evans, P M

    2007-01-01

    For EPID dosimetry, the calibration should ensure that all pixels have a similar response to a given irradiation. A calibration method (MC), using an analytical fit of a Monte Carlo simulated flood field EPID image to correct for the flood field image pixel intensity shape, was proposed. It was compared with the standard flood field calibration (FF), with the use of a water slab placed in the beam to flatten the flood field (WS) and with a multiple field calibration where the EPID was irradiated with a fixed 10 x 10 field for 16 different positions (MF). The EPID was used in its normal configuration (clinical setup) and with an additional 3 mm copper slab (modified setup). Beam asymmetry measured with a diode array was taken into account in the MC and WS methods. For both setups, the MC method provided pixel sensitivity values within 3% of those obtained with the MF and WS methods (mean difference < …). The MC calibration allowed the measurement of open fields (up to … cm²) and IMRT fields to within 3% of that obtained with WS and MF calibrations, while differences with images calibrated with the FF method for fields larger than 10 x 10 cm² were up to 8%. The MC, WS and MF methods all provided a major improvement on the FF method. Advantages and drawbacks of each method were reviewed.

  3. Simulating adsorptive expansion of zeolites: application to biomass-derived solutions in contact with silicalite.

    Science.gov (United States)

    Santander, Julian E; Tsapatsis, Michael; Auerbach, Scott M

    2013-04-16

    We have constructed and applied an algorithm to simulate the behavior of zeolite frameworks during liquid adsorption. We applied this approach to compute the adsorption isotherms of furfural-water and hydroxymethyl furfural (HMF)-water mixtures adsorbing in silicalite zeolite at 300 K for comparison with experimental data. We modeled these adsorption processes under two different statistical mechanical ensembles: the grand canonical (V-Nz-μg-T or GC) ensemble keeping volume fixed, and the P-Nz-μg-T (osmotic) ensemble allowing volume to fluctuate. To optimize accuracy and efficiency, we compared pure Monte Carlo (MC) sampling to hybrid MC-molecular dynamics (MD) simulations. For the external furfural-water and HMF-water phases, we assumed the ideal solution approximation and employed a combination of tabulated data and extended ensemble simulations for computing solvation free energies. We found that MC sampling in the V-Nz-μg-T ensemble (i.e., standard GCMC) does a poor job of reproducing both the Henry's law regime and the saturation loadings of these systems. Hybrid MC-MD sampling of the V-Nz-μg-T ensemble, which includes framework vibrations at fixed total volume, provides better results in the Henry's law region, but this approach still does not reproduce experimental saturation loadings. Pure MC sampling of the osmotic ensemble was found to approach experimental saturation loadings more closely, whereas hybrid MC-MD sampling of the osmotic ensemble quantitatively reproduces such loadings because the MC-MD approach naturally allows for locally anisotropic volume changes wherein some pores expand whereas others contract.
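
For context, the elementary moves behind a GCMC adsorption run are particle insertion and deletion with the standard grand canonical acceptance rules. The sketch below writes those rules down, with the cubic thermal de Broglie wavelength absorbed into the chemical potential for brevity; it is generic and not the hybrid MC-MD osmotic-ensemble scheme of the paper.

```python
import math

def gcmc_insert_accept(beta, mu, V, N, dU):
    """Acceptance probability for inserting one molecule in the grand canonical
    ensemble; dU is the energy change caused by the insertion."""
    return min(1.0, V / (N + 1) * math.exp(beta * (mu - dU)))

def gcmc_delete_accept(beta, mu, V, N, dU):
    """Acceptance probability for deleting one molecule (dU = U_new - U_old)."""
    return min(1.0, N / V * math.exp(-beta * (mu + dU)))

# e.g. an insertion that lowers the energy is accepted with certainty here
print(gcmc_insert_accept(beta=1.0, mu=-2.0, V=1000.0, N=64, dU=-5.0))  # 1.0
```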

  4. Epilepsy and McArdle Disease in A Child

    Directory of Open Access Journals (Sweden)

    Faruk incecik

    2015-03-01

    McArdle's disease, defined by the lack of functional glycogen phosphorylase in striated muscle, is inherited as an autosomal recessive trait. Patients typically suffer from reduced exercise tolerance, with muscle cramps and pain provoked by exercise, along with easy fatigability and weakness after exercise. Following prolonged exertion, contractures, rhabdomyolysis, and myoglobinuria may occur. Central nervous system symptoms have rarely been reported in McArdle disease. In this case report, a 13-year-old boy with epilepsy and McArdle's disease is presented. [Cukurova Med J 2015; 40(Suppl 1): 5-7]

  5. Incomplete McCune-Albright Syndrome: A Case Report

    Directory of Open Access Journals (Sweden)

    Nagehan Aslan

    2014-08-01

    Fibrous dysplasia of bone is a genetic, non-inheritable disease that can cause bone pain, bone deformities and fractures. It has a broad clinical spectrum, from benign monostotic fibrous dysplasia to McCune-Albright syndrome. The rare McCune-Albright syndrome is characterized by precocious puberty, café au lait spots and fibrous dysplasia. Herein we present a case who was referred to hospital with pathological fractures and diagnosed with incomplete McCune-Albright syndrome because of the lack of endocrine hyperfunction, and who developed early puberty during the clinical course.

  6. Simulations Of Neutron Beam Optic For Neutron Radiography Collimator Using Ray Tracing Methodology

    International Nuclear Information System (INIS)

    Norfarizan Mohd Said; Muhammad Rawi Mohamed Zin

    2014-01-01

    Ray-tracing is a technique for simulating the performance of neutron instruments. McStas, an open-source software package based on a meta-language, is a tool for carrying out ray-tracing simulations. The program has been successfully applied to investigating neutron guide design, flux optimization and other related areas of high complexity and precision. The aim of this paper is to discuss the implementation of the ray-tracing technique with McStas for simulating the performance of the neutron collimation system developed for the imaging system of the TRIGA RTP reactor. The code for the simulation was developed and the results are presented. The analysis of the performance is reported and discussed. (author)

  7. A tool for simulating parallel branch-and-bound methods

    Science.gov (United States)

    Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail

    2016-01-01

    The Branch-and-Bound method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore, the design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
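
A toy version of the simulator's core idea: actual subproblem solving is replaced by a stochastic branching process (each processed node spawns two children with probability below one half, so the tree stays finite), and a naive balancing rule moves nodes to idle workers. Everything here, including the balancing strategy, is invented for illustration.

```python
import random

random.seed(5)

def simulate_bnb(n_workers, p_branch=0.49):
    """Stochastic branching stand-in for parallel B&B with naive load balancing."""
    queues = [[1] if w == 0 else [] for w in range(n_workers)]  # root on worker 0
    steps = processed = 0
    while any(queues):
        steps += 1
        for q in queues:                     # each worker processes one node
            if q:
                q.pop()
                processed += 1
                if random.random() < p_branch:
                    q.extend([1, 1])         # node branches into two children
        for q in queues:                     # idle workers steal from the fullest
            if not q:
                donor = max(queues, key=len)
                if donor:
                    q.append(donor.pop())
    return steps, processed

print(simulate_bnb(n_workers=4))             # (parallel steps, nodes processed)
```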

  8. McDonald’s as a Cultural Brand in the Landscape of Attitudes of Polish Customers

    Directory of Open Access Journals (Sweden)

    Marcin Komańda

    2016-01-01

    Purpose of the article: The analysis of the attitudes of Polish customers towards McDonald's based on the identification of opposite social attitudes towards globalisation processes and perception of cultural brands. Methodology/methods: The qualitative analysis of the record of Internet users' discussion has been conducted. The record of the discussion shall be regarded as an expression of opinion by an incidental group of respondents. For the purposes of the conducted research, the programmes weftQDA 1.0.1 and QSR NVIVO 10 have been used. Scientific aim: Utilization of postmodern interpretation of the socio-cultural context of running business for purposes of strategic management. Findings: The main differences between the supporters of the attitudes towards McDonald's were related to two problems. Firstly, the discussion concerns what McDonald's really is (how its service should be classified). Secondly, the thread of the discourse concerns the quality of McDonald's offer. Further discussion involved the issues of impact of McDonald's on the domestic business, and lifestyle of contemporary Poles and their dining habits. Conclusions: The landscape of attitudes of Polish customers towards McDonald's is the issue of uncertainty in the strategic management within this company. It seems there is a need for paying attention to national cultural features of Poles and different attitudes of contemporary society expressed as a postmodern response to globalisation. Each group of problems mentioned may become an opportunity or a threat for McDonald's business activity in Poland.

  9. Movable geometry and eigenvalue search capability in the MC21 Monte Carlo code

    International Nuclear Information System (INIS)

    Gill, D. F.; Nease, B. R.; Griesheimer, D. P.

    2013-01-01

    A robust and flexible movable geometry implementation in the Monte Carlo code MC21 is described, along with a search algorithm that can be used in conjunction with the movable geometry capability to perform eigenvalue searches based on the position of some geometric component. The natural use of the combined movement and search capability is searching to critical through variation of control rod (or control drum) position. The movable geometry discussion provides the mathematical framework for moving surfaces in the MC21 combinatorial solid geometry description. The interface between the movable geometry system and the user is also described, particularly the ability to create a hierarchy of movable groups. Combined with the hierarchical geometry description in MC21, the movable group framework provides a very powerful system for inline geometry modification. The eigenvalue search algorithm implemented in MC21 is also described. The foundation of this algorithm is a regula falsi search, though several considerations are made in an effort to increase the efficiency of the algorithm for use with Monte Carlo. Specifically, criteria are developed to determine after each batch whether the Monte Carlo calculation should be continued, the search iteration can be rejected, or the search iteration has converged. These criteria seek to minimize the amount of time spent per iteration. Results for the regula falsi method are shown, illustrating that the method as implemented is indeed convergent and that the optimizations made ultimately reduce the total computational expense. (authors)
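
The regula falsi iteration itself is compact: keep a bracket around the position where k_eff crosses 1 and replace one endpoint with the secant intersection at each step. A sketch, with a linear toy reactivity model standing in for the expensive MC eigenvalue calculation:

```python
def search_to_critical(k_eff, lo, hi, tol=1e-5, max_iter=50):
    """Regula falsi search for the rod position where k_eff(position) = 1."""
    f_lo, f_hi = k_eff(lo) - 1.0, k_eff(hi) - 1.0
    assert f_lo * f_hi < 0.0, "criticality must be bracketed"
    for _ in range(max_iter):
        x = hi - f_hi * (hi - lo) / (f_hi - f_lo)   # secant through the bracket
        fx = k_eff(x) - 1.0
        if abs(fx) < tol:
            return x
        if fx * f_lo < 0.0:                         # keep the sign change
            hi, f_hi = x, fx
        else:
            lo, f_lo = x, fx
    return x

# toy model: inserting the rod (z in [0, 1]) linearly reduces k_eff
print(search_to_critical(lambda z: 1.05 - 0.12 * z, 0.0, 1.0))  # ~0.4167
```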

  10. Movable geometry and eigenvalue search capability in the MC21 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Gill, D. F.; Nease, B. R.; Griesheimer, D. P. [Bettis Atomic Power Laboratory, PO Box 79, West Mifflin, PA 15122 (United States)

    2013-07-01

    A robust and flexible movable geometry implementation in the Monte Carlo code MC21 is described, along with a search algorithm that can be used in conjunction with the movable geometry capability to perform eigenvalue searches based on the position of some geometric component. The natural use of the combined movement and search capability is searching to critical through variation of control rod (or control drum) position. The movable geometry discussion provides the mathematical framework for moving surfaces in the MC21 combinatorial solid geometry description. The interface between the movable geometry system and the user is also described, particularly the ability to create a hierarchy of movable groups. Combined with the hierarchical geometry description in MC21, the movable group framework provides a very powerful system for inline geometry modification. The eigenvalue search algorithm implemented in MC21 is also described. The foundation of this algorithm is a regula falsi search, though several considerations are made in an effort to increase the efficiency of the algorithm for use with Monte Carlo. Specifically, criteria are developed to determine after each batch whether the Monte Carlo calculation should be continued, the search iteration can be rejected, or the search iteration has converged. These criteria seek to minimize the amount of time spent per iteration. Results for the regula falsi method are shown, illustrating that the method as implemented is indeed convergent and that the optimizations made ultimately reduce the total computational expense. (authors)

  11. McJobs and Pieces of Flair: Linking McDonaldization to Alienating Work

    Science.gov (United States)

    Treiber, Linda Ann

    2013-01-01

    This article offers strategies for teaching about rationality, bureaucracy, and social change using George Ritzer's "The McDonaldization of Society" and its ideas about efficiency, predictability, calculability, and control. Student learning is facilitated using a series of strategies: making the familiar strange, explaining…

  12. Real time simulation method for fast breeder reactors dynamics

    International Nuclear Information System (INIS)

    Miki, Tetsushi; Mineo, Yoshiyuki; Ogino, Takamichi; Kishida, Koji; Furuichi, Kenji.

    1985-01-01

    Multi-purpose real-time simulator models with suitable plant dynamics were developed; these models can be used not only in training operators but also in designing control systems, operation sequences and many other items which must be studied for the development of new types of reactors. The prototype fast breeder reactor ''Monju'' is taken as an example. An analysis is made of various factors affecting the accuracy and computational load of its dynamic simulation. A method is presented which determines the optimum number of nodes in distributed systems and time steps. Oscillations due to numerical instability are observed in the dynamic simulation of evaporators with a small number of nodes, and a method to cancel these oscillations is proposed. It has been verified through the development of plant dynamics simulation codes that these methods can provide efficient real-time dynamics models of fast breeder reactors. (author)

  13. Melanocortin MC(4) receptor-mediated feeding and grooming in rodents.

    Science.gov (United States)

    Mul, Joram D; Spruijt, Berry M; Brakkee, Jan H; Adan, Roger A H

    2013-11-05

    Decades ago it was recognized that the pharmacological profile of melanocortin ligands that stimulated grooming behavior in rats was strikingly similar to that of Xenopus laevis melanophore pigment dispersion. After cloning of the melanocortin MC1 receptor, expressed in melanocytes, and the melanocortin MC4 receptor, expressed mainly in brain, the pharmacological profiles of these receptors appeared to be very similar and it was demonstrated that these receptors mediate melanocortin-induced pigmentation and grooming respectively. Grooming is a low priority behavior that is concerned with care of body surface. Activation of central melanocortin MC4 receptors is also associated with meal termination, and continued postprandial stimulation of melanocortin MC4 receptors may stimulate natural postprandial grooming behavior as part of the behavioral satiety sequence. Indeed, melanocortins fail to suppress food intake or induce grooming behavior in melanocortin MC4 receptor-deficient rats. This review will focus on how melanocortins affect grooming behavior through the melanocortin MC4 receptor, and how melanocortin MC4 receptors mediate feeding behavior. This review also illustrates how melanocortins were the most likely candidates to mediate grooming and feeding based on the natural behaviors they induced. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Plasma simulations using the Car-Parrinello method

    International Nuclear Information System (INIS)

    Clerouin, J.; Zerah, G.; Benisti, D.; Hansen, J.P.

    1990-01-01

    A simplified version of the Car-Parrinello method, based on the Thomas-Fermi (local density) functional for the electrons, is adapted to the simulation of the ionic dynamics in dense plasmas. The method is illustrated by an explicit application to a degenerate one-dimensional hydrogen plasma

  15. Non-analogue Monte Carlo method, application to neutron simulation; Methode de Monte Carlo non analogue, application a la simulation des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Morillon, B.

    1996-12-31

    With most of the traditional and contemporary techniques, it is still impossible to solve the transport equation if one takes into account a fully detailed geometry and studies precisely the interactions between particles and matter. Only the Monte Carlo method offers such a possibility. However, with significant attenuation, natural simulation remains inefficient: it becomes necessary to use biasing techniques, for which the solution of the adjoint transport equation is essential. The Monte Carlo code Tripoli has been using such techniques successfully for a long time with different approximate adjoint solutions: these methods require the user to determine some parameters. If these parameters are not optimal or nearly optimal, the biased simulations may yield small figures of merit. This paper presents a description of the most important biasing techniques of the Monte Carlo code Tripoli; then we show how to calculate the importance function for general geometry in multigroup cases. We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we estimate the importance function through the collision probabilities method and evaluate its possibilities by means of a Monte Carlo calculation. We compare different biased simulations with the importance function calculated by collision probabilities for one-group and multigroup problems. We have run simulations with the new biasing method for one-group transport problems with isotropic scattering and for multigroup problems with anisotropic scattering. The results show that for one-group, homogeneous geometry transport problems the method is quite optimal without the splitting and Russian roulette techniques, but for multigroup, heterogeneous X-Y geometry problems the figures of merit are higher if splitting and Russian roulette are added.
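
Splitting and Russian roulette, mentioned above, are the standard weight-control companions to biased sampling: low-weight particles are killed with a compensating weight boost for survivors, high-weight particles are split into lighter copies, and both operations preserve the expected weight. A minimal sketch with an invented weight window:

```python
import random

random.seed(6)

def roulette_or_split(weight, w_low=0.25, w_high=4.0, survival=0.5):
    """Weight-window treatment: returns the list of surviving particle weights."""
    if weight < w_low:                   # Russian roulette: kill or boost
        if random.random() < survival:
            return [weight / survival]   # survivor carries the lost weight
        return []
    if weight > w_high:                  # splitting into lighter copies
        n = int(weight)                  # hypothetical split rule
        return [weight / n] * n
    return [weight]

print(roulette_or_split(0.1))  # [] or [0.2]; expected total weight is 0.1
print(roulette_or_split(7.3))  # seven copies of weight ~1.04
```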

  16. Improving queuing service at McDonald's

    Science.gov (United States)

    Koh, Hock Lye; Teh, Su Yean; Wong, Chin Keat; Lim, Hooi Kie; Migin, Melissa W.

    2014-07-01

    Fast food restaurants are popular among price-sensitive youths and working adults who value the conducive environment and convenient services. McDonald's chains of restaurants promote their sales during lunch hours by offering package meals which are perceived to be inexpensive. These promotional lunch meals attract good response, resulting in occasional long queues and inconvenient waiting times. A study is conducted to monitor the distribution of waiting time, queue length, and customer arrival and departure patterns at a McDonald's restaurant located in Kuala Lumpur. A customer survey is conducted to gauge customers' satisfaction regarding waiting time and queue length. An Android app named Que is developed to perform onsite queuing analysis and report key performance indices. The queuing model in Que is based on the Poisson distribution. In this paper, Que is utilized to perform queuing analysis at this McDonald's restaurant with the aim of improving customer service, with particular reference to reducing queuing time and shortening queue length. Some results will be presented.
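
    The abstract does not give Que's internals, so as a hedged illustration, the standard M/M/1 queue (Poisson arrivals, exponential service, one server) yields the usual performance indices in closed form; the arrival and service rates below are made up, not values from the study.

    ```python
    # Illustrative M/M/1 indices consistent with the Poisson assumption above.
    lambda_ = 1.5   # customer arrivals per minute (invented)
    mu = 2.0        # customers served per minute (invented)

    rho = lambda_ / mu              # server utilisation (must be < 1 for stability)
    L = rho / (1 - rho)             # mean number of customers in the system
    Lq = rho**2 / (1 - rho)         # mean queue length
    W = 1 / (mu - lambda_)          # mean time in system, minutes
    Wq = rho / (mu - lambda_)       # mean waiting time in queue, minutes

    print(f"utilisation={rho:.2f}, queue length={Lq:.2f}, wait={Wq:.2f} min")
    ```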

  17. Generalized eMC implementation for Monte Carlo dose calculation of electron beams from different machine types.

    Science.gov (United States)

    Fix, Michael K; Cygler, Joanna; Frei, Daniel; Volken, Werner; Neuenschwander, Hans; Born, Ernst J; Manser, Peter

    2013-05-07

    The electron Monte Carlo (eMC) dose calculation algorithm available in the Eclipse treatment planning system (Varian Medical Systems) is based on the macro MC method and uses a beam model applicable to Varian linear accelerators. This leads to limitations in accuracy if eMC is applied to non-Varian machines. In this work, eMC is generalized to also allow accurate dose calculations for electron beams from Elekta and Siemens accelerators. First, changes made in the previous study to use eMC for low electron beam energies of Varian accelerators are applied. Then, a generalized beam model is developed using a main electron source and a main photon source representing electrons and photons from the scattering foil, respectively, an edge source of electrons, a transmission source of photons and a line source of electrons and photons representing the particles from the scrapers or inserts and head scatter radiation. Regarding the macro MC dose calculation algorithm, the transport code of the secondary particles is improved. The macro MC dose calculations are validated with corresponding dose calculations using EGSnrc in homogeneous and inhomogeneous phantoms. The validation of the generalized eMC is carried out by comparing calculated and measured dose distributions in water for Varian, Elekta and Siemens machines for a variety of beam energies, applicator sizes and SSDs. The comparisons are performed in units of cGy per MU. Overall, agreement between calculated and measured dose distributions for all machine types and all combinations of parameters investigated is within 2% or 2 mm. The results of the dose comparisons suggest that the generalized eMC is now suitable to calculate dose distributions for Varian, Elekta and Siemens linear accelerators with sufficient accuracy in the range of the investigated combinations of beam energies, applicator sizes and SSDs.

  18. SimDoseCT: dose reporting software based on Monte Carlo simulation for a 320 detector-row cone-beam CT scanner and ICRP computational adult phantoms

    Science.gov (United States)

    Cros, Maria; Joemai, Raoul M. S.; Geleijns, Jacob; Molina, Diego; Salvadó, Marçal

    2017-08-01

    This study aims to develop and test software for assessing and reporting doses for standard patients undergoing computed tomography (CT) examinations in a 320 detector-row cone-beam scanner. The software, called SimDoseCT, is based on a Monte Carlo (MC) simulation code developed to calculate organ doses and effective doses in ICRP anthropomorphic adult reference computational phantoms for acquisitions with the Aquilion ONE CT scanner (Toshiba). The MC simulation was validated by comparing CTDI measurements within standard CT dose phantoms against results from simulation under the same conditions. SimDoseCT consists of a graphical user interface connected to a MySQL database, which contains the look-up tables that were generated with MC simulations for volumetric acquisitions at different scan positions along the phantom using any tube voltage, bow-tie filter, focal spot and nine different beam widths. Two different methods were developed to estimate organ doses and effective doses from acquisitions using other available beam widths in the scanner. A correction factor was used to estimate doses in helical acquisitions. Hence, the user can select any available protocol in the Aquilion ONE scanner for a standard adult male or female and obtain the dose results through the software interface. Agreement within 9% between CTDI measurements and simulations allowed validation of the MC program. Additionally, the algorithm for dose reporting in SimDoseCT was validated by comparing dose results from this tool with those obtained from MC simulations for three volumetric acquisitions (head, thorax and abdomen). The comparison was repeated using eight different collimations and also for another collimation in a helical abdomen examination. The results showed differences of 0.1 mSv or less for absolute dose in most organs and also in the effective dose calculation. The software provides a suitable tool for dose assessment in standard adult patients undergoing CT
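
    As a hedged sketch of the look-up-table idea described above (not SimDoseCT's actual code or data), organ doses tabulated at a few beam widths can be interpolated to an unlisted width, with a multiplicative factor for helical scans; every number and the factor below are placeholders.

    ```python
    import numpy as np

    beam_widths_mm = np.array([40, 80, 120, 160])    # tabulated collimations (made up)
    organ_dose_mgy = np.array([1.2, 2.1, 3.3, 4.4])  # MC doses per exposure (made up)

    def dose_for_width(width_mm):
        """Linear interpolation between the tabulated beam widths."""
        return np.interp(width_mm, beam_widths_mm, organ_dose_mgy)

    HELICAL_CORRECTION = 0.95    # placeholder multiplicative correction factor

    dose_axial = dose_for_width(100)              # an unlisted 100 mm collimation
    dose_helical = dose_axial * HELICAL_CORRECTION
    print(f"axial={dose_axial:.2f} mGy, helical~{dose_helical:.2f} mGy")
    ```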

  19. The Travails of Criticality: Understanding Peter McLaren's Revolutionary Vocation. An Article Review of Peter McLaren, "Pedagogy of Insurrection" (New York: Peter Lang, 2015)

    Science.gov (United States)

    Baldacchino, John

    2017-01-01

    This is an article review of Peter McLaren's "Pedagogy of Insurrection" (New York: Peter Lang, 2015). While it seeks to position McLaren's work within the context of critical pedagogy, this paper also assesses McLaren from the wider discussion of Marxist–Hegelian discourse as it evolved within the Left. Engaging with McLaren critically,…

  20. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Souris, K; Lee, J; Sterpin, E [Universite catholique de Louvain, Brussels (Belgium)

    2014-06-15

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first one is the most conservative and accurate. The method of fictitious interactions handles the interfaces, and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth dose and transversal profiles computed by MCsquare and Geant4 are within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare, but this is unlikely to have any clinical impact. The computation time varies between 90 seconds for the most conservative settings to merely 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results. Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time constraints.

  1. TH-A-19A-08: Intel Xeon Phi Implementation of a Fast Multi-Purpose Monte Carlo Simulation for Proton Therapy

    International Nuclear Information System (INIS)

    Souris, K; Lee, J; Sterpin, E

    2014-01-01

    Purpose: Recent studies have demonstrated the capability of graphics processing units (GPUs) to compute dose distributions using Monte Carlo (MC) methods within clinical time constraints. However, GPUs have a rigid vectorial architecture that favors the implementation of simplified particle transport algorithms, adapted to specific tasks. Our new, fast, and multipurpose MC code, named MCsquare, runs on Intel Xeon Phi coprocessors. This technology offers 60 independent cores, and therefore more flexibility to implement fast and yet generic MC functionalities, such as prompt gamma simulations. Methods: MCsquare implements several models and hence allows users to make their own tradeoff between speed and accuracy. A 200 MeV proton beam is simulated in a heterogeneous phantom using Geant4 and two configurations of MCsquare. The first one is the most conservative and accurate. The method of fictitious interactions handles the interfaces, and secondary charged particles emitted in nuclear interactions are fully simulated. The second, faster configuration simplifies interface crossings and simulates only secondary protons after nuclear interaction events. Integral depth-dose and transversal profiles are compared to those of Geant4. Moreover, the production profile of prompt gammas is compared to PENH results. Results: Integral depth dose and transversal profiles computed by MCsquare and Geant4 are within 3%. The production of secondaries from nuclear interactions is slightly inaccurate at interfaces for the fastest configuration of MCsquare, but this is unlikely to have any clinical impact. The computation time varies between 90 seconds for the most conservative settings to merely 59 seconds in the fastest configuration. Finally, prompt gamma profiles are also in very good agreement with PENH results. Conclusion: Our new, fast, and multi-purpose Monte Carlo code simulates prompt gammas and calculates dose distributions in less than a minute, which complies with clinical time constraints.

  2. Purification and characterization of enterocin MC13 produced by a potential aquaculture probiont Enterococcus faecium MC13 isolated from the gut of Mugil cephalus.

    Science.gov (United States)

    Satish Kumar, R; Kanmani, P; Yuvaraj, N; Paari, K A; Pattukumar, V; Arul, V

    2011-12-01

    A bacteriocin producer strain MC13 was isolated from the gut of Mugil cephalus (grey mullet) and identified as Enterococcus faecium. The bacteriocin of E. faecium MC13 was purified to homogeneity, as confirmed by Tricine sodium dodecyl sulphate–polyacrylamide gel electrophoresis (SDS-PAGE). Reverse-phase high-performance liquid chromatography (HPLC) analysis showed a single active fraction eluted at 26 min, and matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry analysis showed the molecular mass to be 2.148 kDa. The clear zone in native PAGE corresponding to the enterocin MC13 band further substantiated its molecular mass. A dialyzed sample (semicrude preparation) of enterocin MC13 was broad spectrum in its action and inhibited important seafood-borne pathogens: Listeria monocytogenes, Vibrio parahaemolyticus, and Vibrio vulnificus. This antibacterial substance was sensitive to proteolytic enzymes (trypsin, protease, and chymotrypsin) but insensitive to catalase and lipase, confirming that inhibition was due to the proteinaceous molecule, i.e., bacteriocin, and not due to hydrogen peroxide. Enterocin MC13 tolerated heat treatment (up to 90 °C for 20 min). Enterococcus faecium MC13 was effective in bile salt tolerance, acid tolerance, and adhesion to the HT-29 cell line. These properties reveal the potential of E. faecium MC13 to be a probiotic bacterium. Enterococcus faecium MC13 could be used as a potential fish probiotic against pathogens such as V. parahaemolyticus, Vibrio harveyi, and Aeromonas hydrophila in fisheries. Also, this could be a valuable seafood biopreservative against L. monocytogenes.

  3. A tool for simulating parallel branch-and-bound methods

    Directory of Open Access Journals (Sweden)

    Golubeva Yana

    2016-01-01

    The Branch-and-Bound (B&B) method is known as one of the most powerful but very resource-consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution; the design and study of load balancing algorithms is therefore a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
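
    The following toy sketch mirrors the simulator's idea under stated assumptions: the B&B tree is replaced by a random branching process, idle processors steal work, and progress is counted in logical time. The branching probability, processor count, and the stealing rule are all illustrative inventions of ours, not the tool's algorithms.

    ```python
    import random

    N_PROC, P_BRANCH, MAX_DEPTH = 4, 0.55, 30
    queues = [[] for _ in range(N_PROC)]     # per-processor work queues
    queues[0].append(0)                      # root subproblem (stored as its depth)
    t, busy_steps = 0, [0] * N_PROC

    while any(queues):
        t += 1                               # one tick of logical time
        for pid in range(N_PROC):
            if not queues[pid]:
                # idle processor steals half the work of the most loaded one
                donor = max(range(N_PROC), key=lambda q: len(queues[q]))
                half = len(queues[donor]) // 2
                queues[pid], queues[donor] = queues[donor][:half], queues[donor][half:]
                continue
            depth = queues[pid].pop()        # "solve" one subproblem
            busy_steps[pid] += 1
            if depth < MAX_DEPTH and random.random() < P_BRANCH:
                queues[pid] += [depth + 1, depth + 1]   # branch into two children

    print(f"logical time={t}, busy steps per processor={busy_steps}")
    ```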

  4. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms.

    Science.gov (United States)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  5. Simulation of plume dynamics by the Lattice Boltzmann Method

    Science.gov (United States)

    Mora, Peter; Yuen, David A.

    2017-09-01

    The Lattice Boltzmann Method (LBM) is a semi-microscopic method that simulates fluid mechanics by modelling distributions of particles moving and colliding on a lattice. We present 2-D simulations using the LBM of a fluid in a rectangular box heated from below and cooled from above, with a Rayleigh number of Ra = 10^8, similar to current estimates for the Earth's mantle, and a Prandtl number of 5000. At this Prandtl number, the flow is found to be in the non-inertial regime, where the inertial terms, denoted I, satisfy I ≪ 1. Hence, the simulations presented lie within the regime of relevance for geodynamical problems. We obtain narrow upwelling plumes with mushroom heads and chutes of downwelling fluid, as expected of a flow in the non-inertial regime. The method developed demonstrates that the LBM has great potential for simulating thermal convection and plume dynamics relevant to geodynamics, albeit with some limitations.
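
    A minimal sketch of the LBM core described above (BGK collision plus streaming on a D2Q9 lattice, periodic boundaries); the decaying-shear-wave setup, grid, and relaxation time below are illustrative and nowhere near the paper's Ra = 10^8 thermal convection runs (no buoyancy coupling is included).

    ```python
    import numpy as np

    NX, NY, TAU = 64, 64, 0.8   # grid and BGK relaxation time (viscosity = (TAU-0.5)/3)
    e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def equilibrium(rho, ux, uy):
        eu = e[:, 0, None, None]*ux + e[:, 1, None, None]*uy
        return w[:, None, None]*rho*(1 + 3*eu + 4.5*eu**2 - 1.5*(ux**2 + uy**2))

    rho = np.ones((NX, NY))
    ux = np.ones((NX, 1)) * 0.05*np.sin(2*np.pi*np.arange(NY)/NY)[None, :]  # shear wave
    uy = np.zeros((NX, NY))
    f = equilibrium(rho, ux, uy)

    for step in range(500):
        for i in range(9):                       # streaming along lattice velocity i
            f[i] = np.roll(f[i], (e[i, 0], e[i, 1]), axis=(0, 1))
        rho = f.sum(axis=0)                      # macroscopic moments
        ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
        f += (equilibrium(rho, ux, uy) - f) / TAU    # BGK relaxation to equilibrium

    print("shear-wave amplitude after 500 steps:", float(np.abs(ux).max()))
    ```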

  6. Potential effects of the next 100 billion hamburgers sold by McDonald's.

    Science.gov (United States)

    Spencer, Elsa H; Frank, Erica; McIntosh, Nichole F

    2005-05-01

    McDonald's has sold >100 billion beef-based hamburgers worldwide, with a potentially considerable health impact. This paper explores whether there would be any advantages if the next 100 billion burgers were instead plant-based. Nutrient compositions of the beef hamburger patty and the McVeggie burger patty were obtained from the McDonald's website; sales data were obtained from McDonald's customer service. Consuming 100 billion McDonald's beef burgers rather than the same company's McVeggie burgers would provide, approximately, on average, an additional 550 million pounds of saturated fat and 1.2 billion total pounds of fat, as well as 1 billion fewer pounds of fiber, 660 million fewer pounds of protein, and no difference in calories. These data suggest that McDonald's new McVeggie burger represents a less harmful fast-food choice than the beef burger.
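
    A quick back-of-envelope check converts the aggregate figures above into per-burger differences; the aggregates come from the abstract, while the conversion (1 lb = 453.6 g) and the resulting gram values are our own arithmetic.

    ```python
    N = 100e9                 # burgers
    to_g = 453.6 / N          # total pounds -> grams per burger

    print(f"extra fat per beef burger:      {1.2e9 * to_g:.1f} g")   # ~5.4 g
    print(f"extra saturated fat per burger: {550e6 * to_g:.1f} g")   # ~2.5 g
    print(f"extra fiber per veggie burger:  {1.0e9 * to_g:.1f} g")   # ~4.5 g
    print(f"extra protein per beef burger:  {660e6 * to_g:.1f} g")   # ~3.0 g
    ```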

  7. Novel Methods for Electromagnetic Simulation and Design

    Science.gov (United States)

    2016-08-03

    …basis for high-fidelity modeling software that can handle complicated, electrically large objects in a manner that is sufficiently fast to allow design by simulation. We also developed new methods for scattering from cavities…

  8. A McCollough Effect Generated at Binocular Site

    Directory of Open Access Journals (Sweden)

    Qiujie Weng

    2011-05-01

    Following exposure to alternating gratings with unique combinations of orientation and color, an achromatic grating appears tinted, with its perceived color contingent on the grating's orientation. This orientation-contingent color aftereffect is called the McCollough effect. The lack of interocular transfer of the McCollough effect suggests that it is primarily established in monocular channels. Here we explored the possibility that the McCollough effect can be induced at a binocular site. During adaptation, a red vertical grating and a green horizontal grating were dichoptically presented to the two eyes. In the ‘binocular rivalry’ condition, these two gratings were constantly presented throughout the adaptation period and subjects experienced rivalry between the two gratings. In the ‘physical alternation’ condition, the two dichoptic gratings physically alternated during adaptation, perceptually similar to binocular rivalry. Interestingly, following dichoptic adaptation in either the rivalry condition or the physical alternation condition, a binocularly viewed achromatic test grating appeared colored depending on its orientation: a vertical grating appeared greenish and a horizontal grating pinkish. In other words, we observed a McCollough effect following dichoptic adaptation, which can only be explained by a binocular site of orientation-contingent color adaptation.

  9. A hybrid multiscale kinetic Monte Carlo method for simulation of copper electrodeposition

    International Nuclear Information System (INIS)

    Zheng Zheming; Stephens, Ryan M.; Braatz, Richard D.; Alkire, Richard C.; Petzold, Linda R.

    2008-01-01

    A hybrid multiscale kinetic Monte Carlo (HMKMC) method for speeding up the simulation of copper electrodeposition is presented. The fast diffusion events are simulated deterministically with a heterogeneous diffusion model which considers site-blocking effects of additives. Chemical reactions are simulated by an accelerated (tau-leaping) method for discrete stochastic simulation which adaptively selects exact discrete stochastic simulation for the appropriate reaction whenever that is necessary. The HMKMC method is seen to be accurate and highly efficient.

  10. Corrections to O(α^7(lnα)mc^2) fine-structure splittings and O(α^6(lnα)mc^2) energy levels in helium

    International Nuclear Information System (INIS)

    Zhang, T.

    1996-01-01

    Fully relativistic formulas for the energy-level shifts arising from no-pair exchange diagrams of two transverse photons plus an arbitrary number of Coulomb photons are derived in closed form within the external potential Bethe-Salpeter formalism. O(α^7(lnα)mc^2) corrections to the fine-structure splittings of helium are obtained and expressed in terms of expectation values of nonrelativistic operators. O(α^7 mc^2) operators from exchange diagrams are found in nonrelativistic approximation. O(α^6 m^2c^2/M) nucleus-electron operators contributing to the fine-structure splittings are derived. Nonrelativistic operators of O(α^6 mc^2) corrections to the triplet levels of helium are presented. Nonrelativistic operators of O(α^6(lnα)mc^2) corrections to the helium singlet levels and to positronium S levels are derived. O(α^6 m^2c^2/M) hydrogen and O(α^6 mc^2) positronium P levels, and O(α^6(lnα)mc^2) corrections of first order to positronium S levels, are calculated using the derived operators for helium, in agreement with those obtained previously by others, except for one term in corrections to positronium P levels. In addition, the O(α^6 mc^2) Dirac energies for hydrogenic non-S levels are exactly reproduced in a perturbative calculation. copyright 1996 The American Physical Society

  11. Simulation methods with extended stability for stiff biochemical Kinetics

    Directory of Open Access Journals (Sweden)

    Rué Pau

    2010-08-01

    Background: With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, τ, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where τ can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called τ-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or Binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as τ grows. Results: In this paper we extend Poisson τ-leap methods to a general class of Runge-Kutta (RK) τ-leap methods. We show that with the proper selection of the coefficients, the variance of the extended τ-leap can be well-behaved, leading to significantly larger step sizes. Conclusions: The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original τ-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
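
    As a hedged illustration of the baseline methods discussed above (the exact SSA and the plain Poisson τ-leap it extends, not the paper's RK construction), both can be compared on a single decay channel A → ∅; the rate, population, and leap size below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    c, n0, t_end = 0.1, 1000, 30.0    # decay rate, initial count, horizon (invented)

    def ssa(n, t=0.0):
        """Exact SSA: draw an exponential waiting time per firing."""
        while t < t_end and n > 0:
            a = c * n                         # propensity of the decay channel
            t += rng.exponential(1.0 / a)     # waiting time to the next event
            if t < t_end:
                n -= 1
        return n

    def tau_leap(n, tau=0.5, t=0.0):
        """Poisson tau-leap: fire a Poisson number of events per fixed step."""
        while t < t_end and n > 0:
            k = rng.poisson(c * n * tau)      # firings within the leap
            n = max(n - k, 0)
            t += tau
        return n

    print("SSA:", ssa(n0), " tau-leap:", tau_leap(n0),
          " expected:", round(n0 * np.exp(-c * t_end)))
    ```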

  12. A nondissipative simulation method for the drift kinetic equation

    International Nuclear Information System (INIS)

    Watanabe, Tomo-Hiko; Sugama, Hideo; Sato, Tetsuya

    2001-07-01

    With the aim of studying the ion temperature gradient (ITG) driven turbulence, a nondissipative kinetic simulation scheme is developed and comprehensively benchmarked. The new simulation method, which preserves the time-reversibility of the basic kinetic equations, can successfully reproduce the analytical solutions of the asymmetric three-mode ITG equations, which are extended to provide a more general reference for benchmarking than the previous work [T.-H. Watanabe, H. Sugama, and T. Sato: Phys. Plasmas 7 (2000) 984]. It is also applied to a dissipative three-mode system, and shows good agreement with the analytical solution. The nondissipative simulation result of the ITG turbulence accurately satisfies the entropy balance equation. The usefulness of the nondissipative method for drift kinetic simulations is confirmed in comparisons with other dissipative schemes. (author)

  13. Ecological effects of contaminants in McCoy Branch, 1989-1990

    Energy Technology Data Exchange (ETDEWEB)

    Ryon, M.G. (ed.)

    1992-01-01

    The 1984 Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act (RCRA) required assessment of all current and former solid waste management units. Such a RCRA Facility Investigation (RFI) was required of the Y-12 Plant for their Filled Coal Ash Pond on McCoy Branch. Because the disposal of coal ash in the ash pond, McCoy Branch, and Rogers Quarry was not consistent with the Tennessee Water Quality Act, several remediation steps were implemented or planned for McCoy Branch to address disposal problems. The McCoy Branch RFI plan included provisions for biological monitoring of the McCoy Branch watershed. The objectives of the biological monitoring were to: (1) document changes in biological quality of McCoy Branch after completion of a pipeline and after termination of all discharges to Rogers Quarry, (2) provide guidance on the need for additional remediation, and (3) evaluate the effectiveness of implemented remedial actions. The data from the biological monitoring program will also determine if the classified uses, as identified by the State of Tennessee, of McCoy Branch are being protected and maintained. This report discusses results from toxicity monitoring with snails, a fish community assessment, and a benthic macroinvertebrate community assessment.

  14. Effectiveness of McKenzie Method-Based Self-Management Approach for the Secondary Prevention of a Recurrence of Low Back Pain (SAFE Trial): Protocol for a Pragmatic Randomized Controlled Trial.

    Science.gov (United States)

    de Campos, Tarcisio F; Maher, Chris G; Clare, Helen A; da Silva, Tatiane M; Hancock, Mark J

    2017-08-01

    Although many people recover quickly from an episode of low back pain (LBP), recurrence is very common. There is limited evidence on effective prevention strategies for recurrences of LBP. The purpose of this study was to determine the effectiveness of a McKenzie method-based self-management approach in the secondary prevention of LBP. This will be a pragmatic randomized controlled trial. Participants will be recruited from the community and primary care, with the intervention delivered in a number of physical therapist practices in Sydney, Australia. The study will have 396 participants, all of whom are at least 18 years old. Participants will be randomly assigned to either the McKenzie method-based self-management approach group or a minimal intervention control group. The primary outcome will be days to first self-reported recurrence of an episode of activity-limiting LBP. The secondary outcomes will include: days to first self-reported recurrence of an episode of LBP, days to first self-reported recurrence of an episode of LBP leading to care seeking, and the impact of LBP over a 12-month period. All participants will be followed up monthly for a minimum of 12 months or until they have a recurrence of activity-limiting LBP. All participants will also be followed-up at 3, 6, 9, and 12 months to assess the impact of back pain, physical activity levels, study program adherence, credibility, and adverse events. Participants and therapists will not be masked to the interventions. To our knowledge, this will be the first large, high-quality randomized controlled trial investigating the effectiveness of a McKenzie method-based self-management approach for preventing recurrences of LBP. If this approach is found to be effective, it will offer a low-cost, simple method for reducing the personal and societal burdens of LBP. © 2017 American Physical Therapy Association

  15. The neutron instrument simulation package, NISP

    International Nuclear Information System (INIS)

    Seeger, P.A.; Daemen, L.L.

    2004-01-01

    The Neutron Instrument Simulation Package (NISP) performs complete source-to-detector simulations of neutron instruments, including neutrons that do not follow the expected path. The original user interface (MC_Web) is a web-based application, http://strider.lansce.lanl.gov/NISP/Welcome.html. This report describes in detail the newer standalone Windows version, NISP_Win. Instruments are assembled from menu-selected elements, including neutron sources, collimation and transport elements, samples, analyzers, and detectors. Magnetic field regions may also be specified for the propagation of polarized neutrons, including spin precession. Either interface writes a geometry file that is used as input to the Monte Carlo engine (MC_Run) in the user's computer. Both the interface and the engine rely on a subroutine library, MCLIB. The package is completely open source. New features include capillary optics, temperature dependence of Al and Be, revised source files for ISIS, and visualization of neutron trajectories at run time. Also, a single-crystal sample type has been successfully imported from McStas (with more generalized geometry), demonstrating the capability of including algorithms from other sources, and NISP_Win may render the instrument in a virtual reality file. Results are shown for two instruments under development.

  16. McArdle disease: a case report and review

    Directory of Open Access Journals (Sweden)

    Leite A

    2012-01-01

    Alberto Leite, Narciso Oliveira, Manuela Rocha, Internal Medicine Department, Hospital de Braga, Portugal. Abstract: McArdle disease (glycogen storage disease type V) is a pure myopathy caused by an inherited deficit of myophosphorylase. The disease exhibits clinical heterogeneity, but patients typically experience exercise intolerance and acute crises of early fatigue and contractures, sometimes with rhabdomyolysis and myoglobinuria, triggered by static muscle contractions or dynamic exercise. We present the case of a 54-year-old man with a lifelong history of fatigability, worsening on exertion. Laboratory evaluation revealed significant elevations in levels of creatine kinase (7924 U/L), lactate dehydrogenase (624 U/L), and myoglobin (671 ng/mL). A muscle biopsy confirmed the presence of McArdle disease. This case report illustrates how, due to embarrassment, the patient hid his symptoms for many years and was eventually extremely relieved and “liberated” once McArdle disease was diagnosed 40 years later. Keywords: McArdle disease, glycogen storage disease, myophosphorylase

  17. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    The Monte Carlo method is a stochastic statistical method that has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopies in visible remote sensing. The random interaction process between photons and the forest canopy was modelled using the Monte Carlo method.

  18. Coupling methods for parallel running RELAPSim codes in nuclear power plant simulation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yankai; Lin, Meng, E-mail: linmeng@sjtu.edu.cn; Yang, Yanhua

    2016-02-15

    When the plant is modeled in detail for high precision, it is hard to achieve real-time calculation with a single RELAP5 instance in a large-scale simulation. To improve the speed while preserving the precision of the simulation, coupling methods for parallel running RELAPSim codes were proposed in this study. An explicit coupling method via coupling boundaries was realized based on a data-exchange and procedure-control environment. The trade-off in synchronization frequency was carefully considered to improve the precision of the simulation while still guaranteeing real-time execution. The coupling methods were assessed using both single-phase flow models and two-phase flow models, and good agreement was obtained between the splitting–coupling models and the integrated model. The mitigation of SGTR was performed as an integral application of the coupling models. A large-scope NPP simulator was developed adopting six splitting–coupling models of RELAPSim and other simulation codes. The coupling models could improve the speed of simulation significantly and make real-time calculation possible. In this paper, the coupling of the models in the engineering simulator is taken as an example to expound the coupling methods, i.e., coupling between parallel running RELAPSim codes, and coupling between RELAPSim code and other types of simulation codes. However, the coupling methods are also applicable to other simulators, for example, a simulator employing ATHLET instead of RELAP5, or other logic code instead of SIMULINK. It is believed that the coupling method is generally applicable to NPP simulators regardless of the specific codes chosen in this paper.
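
    The explicit coupling loop can be sketched generically as below; the Solver class is a stand-in of ours for a RELAPSim instance (not the actual RELAP5 API), its relaxation "physics" is a placeholder, and sync_every models the synchronization-frequency trade-off mentioned above.

    ```python
    class Solver:
        def __init__(self, name, boundary=0.0):
            self.name, self.boundary, self.state = name, boundary, 0.0

        def advance(self, dt):
            # placeholder physics: relax the internal state toward the boundary value
            self.state += dt * (self.boundary - self.state)

    a, b = Solver("loop-A", boundary=1.0), Solver("loop-B")
    dt, sync_every = 0.01, 5    # larger sync interval: faster, but looser coupling

    for step in range(1, 1001):
        a.advance(dt)                 # both instances advance independently
        b.advance(dt)
        if step % sync_every == 0:    # exchange coupling-boundary data
            a.boundary, b.boundary = b.state, a.state

    print(f"A={a.state:.3f}, B={b.state:.3f}")
    ```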

  19. Methods of channeling simulation

    International Nuclear Information System (INIS)

    Barrett, J.H.

    1989-06-01

    Many computer simulation programs have been used to interpret experiments almost since the first channeling measurements were made. Certain aspects of these programs are important in how accurately they simulate ions in crystals; among these are the manner in which the structure of the crystal is incorporated, how any quantity of interest is computed, what ion-atom potential is used, how deflections are computed from the potential, incorporation of thermal vibrations of the lattice atoms, correlations of thermal vibrations, and form of stopping power. Other aspects of the programs are included to improve the speed; among these are table lookup, importance sampling, and the multiparameter method. It is desirable for programs to facilitate incorporation of special features of interest in special situations; examples are relaxations and enhanced vibrations of surface atoms, easy substitution of an alternate potential for comparison, change of row directions from layer to layer in strained-layer lattices, and different vibration amplitudes for substitutional solute or impurity atoms. Ways of implementing all of these aspects and features and the consequences of them will be discussed. 30 refs., 3 figs

  20. SU-C-BRC-06: OpenCL-Based Cross-Platform Monte Carlo Simulation Package for Carbon Ion Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Qin, N; Tian, Z; Pompos, A; Jiang, S; Jia, X [UT Southwestern Medical Ctr, Dallas, TX (United States); Pinto, M; Dedes, G; Parodi, K [Ludwig-Maximilians-Universitaet Muenchen, Garching / Munich (Germany)

    2016-06-15

    Purpose: Monte Carlo (MC) simulation is considered to be the most accurate method for calculation of absorbed dose and fundamental physical quantities related to biological effects in carbon ion therapy. Its long computation time impedes clinical and research applications. We have developed an MC package, goCMC, on parallel processing platforms, aiming at achieving accurate and efficient simulations for carbon therapy. Methods: goCMC was developed under the OpenCL framework. It supported transport simulation in voxelized geometry with kinetic energy up to 450 MeV/u. A Class II condensed history algorithm was employed for charged particle transport, with stopping power computed via the Bethe-Bloch equation. Secondary electrons were not transported; their energy was deposited locally. Energy straggling and multiple scattering were modeled. Production of secondary charged particles from nuclear interactions was implemented based on cross section and yield data from Geant4. They were transported via the condensed history scheme. goCMC supported scoring various quantities of interest, e.g., physical dose, particle fluence, spectrum, linear energy transfer, and positron emitting nuclei. Results: goCMC has been benchmarked against Geant4 with different phantoms and beam energies. For 100 MeV/u, 250 MeV/u and 400 MeV/u beams impinging on a water phantom, the range difference was 0.03 mm, 0.20 mm and 0.53 mm, and the mean dose difference was 0.47%, 0.72% and 0.79%, respectively. goCMC can run on various computing devices. Depending on the beam energy and voxel size, it took 20∼100 seconds to simulate 10^7 carbons on an AMD Radeon GPU card. The corresponding CPU time for Geant4 with the same setup was 60∼100 hours. Conclusion: We have developed an OpenCL-based cross-platform carbon MC simulation package, goCMC. Its accuracy, efficiency and portability make goCMC attractive for research and clinical applications in carbon therapy.
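
    The Bethe-Bloch stopping power named above can be sketched as follows: a PDG-style mean stopping power without the density-effect and shell corrections a production code such as goCMC would refine. The medium constants are for water and the projectile defaults for carbon ions; the function name and structure are ours.

    ```python
    import numpy as np

    K = 0.307075          # MeV cm^2 / mol
    ME_C2 = 0.511         # electron rest energy, MeV
    I_WATER = 75e-6       # mean excitation energy of water, MeV
    Z_OVER_A, RHO = 0.5551, 1.0   # water: <Z/A> and density, g/cm^3

    def bethe_bloch(T_per_u, z=6, A=12, m_u=931.494):
        """Mean stopping power -dE/dx in MeV/cm for kinetic energy T_per_u [MeV/u]."""
        M = A * m_u                      # projectile rest energy, MeV
        gamma = 1.0 + T_per_u / m_u      # Lorentz factor
        beta2 = 1.0 - 1.0 / gamma**2
        # maximum energy transfer to an electron in a single collision
        tmax = (2*ME_C2*beta2*gamma**2 /
                (1 + 2*gamma*ME_C2/M + (ME_C2/M)**2))
        arg = 2*ME_C2*beta2*gamma**2*tmax / I_WATER**2
        return K*z**2*Z_OVER_A*RHO/beta2 * (0.5*np.log(arg) - beta2)

    # ~110 MeV/cm (~11 keV/um) for 400 MeV/u carbon in water
    print(f"{bethe_bloch(400.0):.1f} MeV/cm")
    ```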

  1. Comparing different methods for estimating radiation dose to the conceptus

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Rendon, X.; Dedulle, A. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); Walgraeve, M.S.; Woussen, S.; Zhang, G. [University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Bosmans, H. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); University Hospitals Leuven, Department of Radiology, Leuven (Belgium); Zanca, F. [KU Leuven, Department of Imaging and Pathology, Division of Medical Physics and Quality Assessment, Herestraat 49, box 7003, Leuven (Belgium); GE Healthcare, Buc (France)

    2017-02-15

    To compare different methods available in the literature for estimating radiation dose to the conceptus (D_conceptus) against a patient-specific Monte Carlo (MC) simulation and a commercial software package (CSP). Eight voxel models from abdominopelvic CT exams of pregnant patients were generated. D_conceptus was calculated with an MC framework including patient-specific longitudinal tube current modulation (TCM). For the same patients, dose to the uterus, D_uterus, was calculated as an alternative for D_conceptus, with a CSP that uses a standard-size, non-pregnant phantom and a generic TCM curve. The percentage error between D_uterus and D_conceptus was studied. Dose to the conceptus and the percentage error with respect to D_conceptus were also estimated for three methods in the literature. The percentage error ranged from -15.9% to 40.0% when comparing MC to CSP. When comparing the TCM profiles with the generic TCM profile from the CSP, differences were observed due to patient habitus and conceptus position. For the other methods, the percentage error ranged from -30.1% to 13.5%, but applicability was limited. Estimating an accurate D_conceptus requires a patient-specific approach that the CSP investigated cannot provide. Available methods in the literature can provide a better estimation if applicable to patient-specific cases. (orig.)

  2. Simulation evaluation of NIST air-kerma rate calibration standard for electronic brachytherapy.

    Science.gov (United States)

    Hiatt, Jessica R; Rivard, Mark J; Hughes, H Grady

    2016-03-01

    Dosimetry for the model S700 50 kV electronic brachytherapy (eBT) source (Xoft, Inc., a subsidiary of iCAD, San Jose, CA) was simulated using Monte Carlo (MC) methods by Rivard et al. ["Calculated and measured brachytherapy dosimetry parameters in water for the Xoft Axxent x-ray source: An electronic brachytherapy source," Med. Phys. 33, 4020-4032 (2006)] and recently by Hiatt et al. ["A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model," Med. Phys. 42, 2764-2776 (2015)] with improved geometric characterization. While these studies examined the dose distribution in water, there have not previously been reports of the eBT source calibration methods beyond that recently reported by Seltzer et al. ["New national air-kerma standard for low-energy electronic brachytherapy sources," J. Res. Natl. Inst. Stand. Technol. 119, 554-574 (2014)]. Therefore, the motivation for the current study was to provide an independent determination of the air-kerma rate at 50 cm in air, K̇_air(d=50 cm), using MC methods for the model S700 eBT source. Using CAD information provided by the vendor and disassembled sources, an MC model was created for the S700 eBT source. Simulations were run using the MCNP6 radiation transport code for the NIST Lamperti air ionization chamber according to specifications by Boutillon et al. ["Comparison of exposure standards in the 10-50 kV x-ray region," Metrologia 5, 1-11 (1969)], in air without the Lamperti chamber, and in vacuum without the Lamperti chamber. K̇_air(d=50 cm) was determined using the *F4 tally with NIST values for the mass energy-absorption coefficients for air. Photon spectra were evaluated over 2π azimuthal sampling for polar angles of 0° ≤ θ ≤ 180° every 1°. Volume averaging was averted through tight radial binning. Photon energy spectra were determined over all polar angles in both air and vacuum using

  3. McStas event logger

    DEFF Research Database (Denmark)

    Bergbäck Knudsen, Erik; Willendrup, Peter Kjær; Klinkby, Esben Bryndt

    2014-01-01

    Functionality is added to the McStas neutron ray-tracing code, which allows individual neutron states before and after a scattering to be temporarily stored and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coatings.

  4. McKenzie Classification of Extremity Lesions - An audit of primary care patients in 3 clinics

    DEFF Research Database (Denmark)

    Melbye, Martin

    2007-01-01

    Syndrome classification based on mechanical testing guides clinical decision making in conservative musculoskeletal care. The aim of this audit was to investigate how many patients presenting with problems in the extremities could be classified into the mechanical syndromes described by Robin McKenzie. … ranged from 4.5 to 6 years. The mechanical classification was determined by the therapists and recorded on the first three visits. Mechanical classification was based on strict operational definitions. Assessment sheets were collected from each therapist to determine their adherence … to the operational definitions. 135 consecutive patients were included over an 18-month period and 28 patients were excluded. Of the 107 patients with extremity joint problems, 73% were classified into one of McKenzie's mechanical syndromes by therapists trained in the McKenzie method. 34% of patients were…

  5. A regularized vortex-particle mesh method for large eddy simulation

    Science.gov (United States)

    Spietz, H. J.; Walther, J. H.; Hejlesen, M. M.

    2017-11-01

    We present recent developments of the remeshed vortex particle-mesh method for simulating incompressible fluid flow. The presented method relies on a parallel higher-order FFT-based solver for the Poisson equation. Arbitrarily high order is achieved through regularization of singular Green's function solutions to the Poisson equation, and recently we have derived novel high-order solutions for a mixture of open and periodic domains. With this approach the simulated variables may formally be viewed as the approximate solution to the filtered Navier–Stokes equations; hence we use the method for Large Eddy Simulation by including a dynamic subfilter-scale model based on test-filters compatible with the aforementioned regularization functions. Further, the subfilter-scale model uses Lagrangian averaging, which is a natural candidate in light of the Lagrangian nature of vortex particle methods. A multiresolution variation of the method is applied to simulate the benchmark problem of the flow past a square cylinder at Re = 22000, and the obtained results are compared to results from the literature.
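
    A minimal periodic FFT Poisson solve of the kind such vortex methods build on is sketched below, recovering a streamfunction from vorticity and velocities by spectral differentiation; the paper's high-order regularized Green's functions and mixed open/periodic domains are not reproduced here, and the test field is ours.

    ```python
    import numpy as np

    N, L = 128, 2*np.pi
    k = 2*np.pi*np.fft.fftfreq(N, d=L/N)       # wavenumbers
    KX, KY = np.meshgrid(k, k, indexing="ij")
    K2 = KX**2 + KY**2
    K2[0, 0] = 1.0                             # avoid dividing the mean mode by zero

    x = np.linspace(0, L, N, endpoint=False)
    X, Y = np.meshgrid(x, x, indexing="ij")
    w = np.sin(X)*np.sin(Y)                    # test vorticity field

    w_hat = np.fft.fft2(w)
    psi_hat = w_hat / K2                       # -lap(psi) = w  =>  psi_hat = w_hat/k^2
    psi_hat[0, 0] = 0.0
    u =  np.real(np.fft.ifft2(1j*KY*psi_hat))  # u =  d(psi)/dy
    v = -np.real(np.fft.ifft2(1j*KX*psi_hat))  # v = -d(psi)/dx

    # analytic check: psi = w/2 for this field, so the error is at machine precision
    psi = np.real(np.fft.ifft2(psi_hat))
    print("max |psi - w/2|:", np.abs(psi - w/2).max(), " max |u|:", np.abs(u).max())
    ```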

  6. McClintock's challenge in the 21st century

    KAUST Repository

    Fedoroff, Nina V.

    2012-11-13

    In 1950, Barbara McClintock published a Classic PNAS article, "The origin and behavior of mutable loci in maize," which summarized the evidence leading to her discovery of transposition. The article described a number of genome alterations revealed through her studies of the Dissociation locus, the first mobile genetic element she identified. McClintock described the suite of nuclear events, including transposon activation and various chromosome aberrations and rearrangements, that unfolded in the wake of genetic crosses that brought together two broken chromosomes 9. McClintock left future generations with the challenge of understanding how genomes respond to genetic and environmental stresses by mounting adaptive responses that frequently include genome restructuring.

  7. McEvoy, Kieran; McGregor, Lorna, Transitional Justice from below. Grassroots Activism and the Struggle for Change

    Directory of Open Access Journals (Sweden)

    José M. Atiles-Osoria

    2012-10-01

    The volume edited by Kieran McEvoy and Lorna McGregor represents an effort to rethink, redefine, and open up a debate within the literature and the currents of study on transitional justice. Transitional justice has generally been conceived as a conglomerate of juridical-political and socio-economic strategies implemented to deal with human rights violations, with the political violence of the past, and with the processes of State reconstruction post…

  8. Meelis Lao declared war on McDonald's yesterday / Peeter Raidla

    Index Scriptorium Estoniae

    Raidla, Peeter, 1955-

    2004-01-01

    Businessman Meelis Lao, citing a rent dispute, had the McDonald's fast-food restaurant on Viru Street closed. See also in the same issue: McDonald's rent dispute goes back years; McDonald's took the matter to the police; Meelis Lao carries the title of problem solver.

  9. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    Science.gov (United States)

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Astronauts McNair and Stewart prepare for reentry

    Science.gov (United States)

    1984-01-01

    Astronauts Ronald E. McNair and Robert L. Stewart prepare for the re-entry phase of the shuttle Challenger near the end of the 41-B mission. They are stationed behind the crew commander and pilot. Stewart is already wearing his helmet; McNair is stowing some of his gear.

  11. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  12. A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.

    Science.gov (United States)

    Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L

    2018-05-16

    During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical actions of these tasks. The aim of this work was to describe a method for determining whether a simulator has sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of the specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into a simulator's clinical accuracy and suitability. Unexpected positive FTA ratings for tasks with functional deficits suggest that further revision of the survey method is required.

  13. Drilling and testing specifications for the McGee well

    International Nuclear Information System (INIS)

    Patterson, J.K.

    1982-01-01

    The McGee Well is a part of the Basalt Waste Isolation Project's subsurface site selection and characterization activities. Information from the McGee Well supports site hydrologic characterization and repository design. These test specifications include details for the drilling and testing of the McGee Well, including the predicted stratigraphy, the drilling requirements, descriptions of the tests to be conducted, the intervals selected for hydrologic testing, and a schedule of the drilling and testing activities. 19 refs., 10 figs., 7 tabs

  14. Developing and design a website for mc kalla oy

    OpenAIRE

    Bekele, Henok

    2013-01-01

    This bachelor's thesis is about website development and design. I had the chance to work with Mc kalla Oy, a construction company from Kempele founded in 2011. They have projects from Central Finland through Northern Finland to Lapland. The aim of this thesis is to develop and design a new website for Mc kalla Oy. WordPress is used to develop the new website, and for the development process I used the school server (.opiskelijaprojektit.net). The thesis contains two main parts: designing the …

  15. EFFECTIVENESS OF MC KENZIE EXERCISES IN REDUCING NECK AND BACK PAIN AMONG MADRASSA STUDENTS

    Directory of Open Access Journals (Sweden)

    Saima Aziz

    2016-02-01

    Background: In this advanced era, neck and back pain have become common musculoskeletal problems. These symptoms have a high prevalence in the community and now affect even adolescents, with a major impact on youths' functional and educational activities. The burden of these musculoskeletal pains relates not only to their prevalence but also to increased physiological and psychological stress, which hampers creativity. Madrassa students are exposed daily to neck and back pain due to poor posture. The McKenzie method is a popular treatment for back and neck pain among physical therapists, yet hardly any data are available on the McKenzie technique and its outcomes in Pakistan. The objective of the study is to determine the effectiveness of McKenzie exercises in reducing neck and back pain among madrassa students. Methods: One hundred sixty-three students aged 12–18 years of both genders who fulfilled the inclusion criteria were recruited from Madrassa Darul Akram (Baldia Town) and Jamia Ashraf-ul-madrassa (Gulshan-e-Iqbal), Karachi. The participants received a McKenzie exercise program intervention for three consecutive weeks. Outcome measures: Neck Disability Index (NDI), Modified Oswestry Low Back Pain Disability Index (ODI) and Numeric Pain Rating Scale (NPRS). Results: The present study showed significant results on all three scales in both genders (p<0.001). Among sections, the Hafiz students showed greater scores on all three scales before treatment as compared to the 'Alim/Alima' and 'Nazra' students, and after treatment showed significant results in all three domains (p<0.001). Conclusion: Findings of this study revealed that madrassa students were more prone to develop neck

  16. McDonald’s Corporation - 2015 (MCD

    Directory of Open Access Journals (Sweden)

    Alen Badal

    2017-07-01

    Full Text Available McDonald’s Corporation, 2015 is aiming to enlighten the “Experience of the Future” for consumers, with a special focus on the ‘younger’ generation. Beginning in 2015 and moving forward, McDonald’s has operationalized the functions of its strategy to bett er serve consumers with such offerings as trial-testing a build-your-burger strategy with the order being served at the table, known as the “Create Your Taste” program. The restaurant chain has introduced the all-day breakfast menu and ‘McPick 2’ for $5.00. Additionally, the company has engaged consumers by way of social media and is interested in having a smart phone application in use. Other roll-outs include processing transactions by way of mobile-payment with such channels as ‘Google Wallet, Soft card and Apple Pay.’ The fast-food giant continues to test a variety of strategies at select locations aimed at increasing shareholder value as a result of both introducing and modifying the point-of-sale services and food & beverage offerings¹.

  17. Ecological effects of contaminants in McCoy Branch, 1991--1993

    Energy Technology Data Exchange (ETDEWEB)

    Ryon, M.G. [ed.]

    1996-09-01

    The 1984 Hazardous and Solid Waste Amendments to the Resource Conservation and Recovery Act (RCRA) required assessment of all current and former solid waste management units. Following guidelines under RCRA and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a remedial investigation (RI) was required of the Y-12 Plant for their filled coal ash pond (FCAP) and associated areas on McCoy Branch. The RI process was initiated and assessments were presented. Because the disposal of coal ash in the ash pond, McCoy Branch, and Rogers Quarry was not consistent with the Tennessee Water Quality Act, several remediation steps were implemented between 1986 and 1994 for McCoy Branch to address disposal problems. The required ecological risk assessments of McCoy Branch watershed included provisions for biological monitoring of the watershed. The objectives of the biological monitoring were to (1) document changes in biological quality of McCoy Branch after completion of a pipeline bypassing upper McCoy Branch and further, after termination of all discharges to Rogers Quarry, (2) provide guidance on the need for additional remediation, and (3) evaluate the effectiveness of implemented remedial actions. The data from the biological monitoring program may also determine whether the goals of protection of human health and the environment of McCoy Branch are being accomplished.

  18. Red hair is the null phenotype of MC1R.

    Science.gov (United States)

    Beaumont, Kimberley A; Shekar, Sri N; Cook, Anthony L; Duffy, David L; Sturm, Richard A

    2008-08-01

    The Melanocortin-1 Receptor (MC1R) is a G-protein coupled receptor which is responsible for production of the darker eumelanin pigment and the tanning response. The MC1R gene has many polymorphisms, some of which have been linked to variation in pigmentation phenotypes within human populations. In particular, the p.D84E, p.R151C, p.R160W and p.D294H alleles have been strongly associated with red hair, fair skin and increased skin cancer risk. These red hair colour (RHC) variants are relatively well described and are thought to result in altered receptor function, while still retaining varying levels of signaling ability in vitro. The mouse Mc1r null phenotype is yellow fur colour; the p.R151C, p.R160W and p.D294H alleles were able to partially rescue this phenotype, leading to the question of what the true null phenotype of MC1R would be in humans. Due to the rarity of MC1R null alleles in human populations, they had until now been found only in the heterozygous state. We report here the first case of a homozygous MC1R null individual; phenotypic analysis indicates that red hair and fair skin are found in the absence of MC1R function.

  19. The DoE method as an efficient tool for modeling the behavior of monocrystalline Si-PV module

    Science.gov (United States)

    Kessaissia, Fatma Zohra; Zegaoui, Abdallah; Boutoubat, Mohamed; Allouache, Hadj; Aillerie, Michel; Charles, Jean-Pierre

    2018-05-01

    The objective of this paper is to apply the Design of Experiments (DoE) method to study and obtain a predictive model of any marketed monocrystalline photovoltaic (mc-PV) module. This technique yields a mathematical model that represents the predicted responses as a function of the input factors and experimental data, so a DoE model for characterizing and modeling mc-PV module behavior can be obtained by performing just a small set of experimental trials. The DoE model of the mc-PV panel evaluates the predicted maximum power as a function of irradiation and temperature in a bounded domain of study for the inputs. For the mc-PV panel, predictive models at both one level and two levels were developed, taking into account the influences of both the main effects and the interaction effects of the considered factors. The DoE method was then implemented in a code developed under Matlab software. The code allows us to simulate, characterize, and validate the predictive model of the mc-PV panel. The calculated results were compared to the experimental data, errors were estimated, and the predictive models were validated against the response surface. Finally, we conclude that the predictive models reproduce the experimental trials with good accuracy.
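
    A model of this form can be illustrated with a short sketch. Assuming hypothetical corner-point measurements of maximum power at coded low/high levels of irradiation (x1) and temperature (x2), the two-level model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 is fitted by least squares; all data values below are invented for illustration (the study's own experiments are not reproduced), and Python is used instead of Matlab for self-containedness:

        import numpy as np

        # Coded factor levels for a 2^2 factorial design: x1 = irradiation, x2 = temperature
        X = np.array([[-1, -1], [+1, -1], [-1, +1], [+1, +1]], dtype=float)
        # Hypothetical measured maximum power (W) at the four corner trials
        y = np.array([38.0, 72.0, 35.0, 65.0])

        # Design matrix: intercept, two main effects, and the interaction term
        A = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
        b0, b1, b2, b12 = np.linalg.lstsq(A, y, rcond=None)[0]

        def predict(x1, x2):
            """Predicted maximum power for coded factors in [-1, +1]."""
            return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

        print(predict(0.5, -0.2))  # prediction inside the bounded domain of study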

  20. Efficient SPECT scatter calculation in non-uniform media using correlated Monte Carlo simulation

    International Nuclear Information System (INIS)

    Beekman, F.J.

    1999-01-01

    Accurate simulation of scatter in projection data of single photon emission computed tomography (SPECT) is computationally extremely demanding for activity distributions in non-uniformly dense media. This paper suggests how the computation time and memory requirements can be significantly reduced. First the scatter projection of a uniformly dense object (P_SDSE) is calculated using a previously developed accurate and fast method which includes all orders of scatter (slab-derived scatter estimation), and then P_SDSE is transformed towards the desired projection P which is based on the non-uniform object. The transform of P_SDSE is based on two first-order Compton scatter Monte Carlo (MC) simulated projections. One is based on the uniform object (P_u) and the other on the object with non-uniformities (P_ν). P is estimated by P-tilde = P_SDSE · P_ν / P_u. A tremendous decrease in noise in P-tilde is achieved by tracking photon paths for P_ν identical to those which were tracked for the calculation of P_u, and by using analytical rather than stochastic modelling of the collimator. The method was validated by comparing the results with standard MC-simulated scatter projections (P) of 99mTc and 201Tl point sources in a digital thorax phantom. After correction, excellent agreement was obtained between P-tilde and P. The total computation time required to calculate an accurate scatter projection of an extended distribution in a thorax phantom on a PC is only a few tens of seconds per projection, which makes the method attractive for application in accurate scatter correction in clinical SPECT. Furthermore, the method removes the need for the excessive computer memory involved in previously proposed 3D model-based scatter correction methods. (author)
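
    The variance-reduction idea, reusing identical photon paths for the uniform and the non-uniform object so that the ratio P_ν/P_u is nearly noise-free, can be sketched as follows. The geometry, path parameterization and weights are placeholders, not the author's implementation:

        import numpy as np

        rng = np.random.default_rng(42)
        paths = rng.random((100_000, 3))  # placeholder photon path parameters, sampled once

        def first_order_scatter(paths, mu_of_depth):
            # Toy first-order scatter estimator: identical paths for every medium,
            # only the attenuation weight depends on the object being simulated.
            depth = paths[:, 0] * 10.0  # placeholder path length (cm)
            return np.exp(-mu_of_depth(depth) * depth).mean()

        mu_uniform = lambda d: np.full_like(d, 0.15)             # water-like medium
        mu_nonuniform = lambda d: np.where(d > 5.0, 0.05, 0.15)  # lung-like cavity

        P_u = first_order_scatter(paths, mu_uniform)      # correlated estimators:
        P_nu = first_order_scatter(paths, mu_nonuniform)  # same paths, smooth ratio

        P_sdse = 1.0  # stands in for the accurate all-orders uniform-object projection
        P_tilde = P_sdse * P_nu / P_u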

  1. Tracheal intubation in patients with cervical spine immobilization: A comparison of McGrath(®) video laryngoscope and Truview EVO2(®) laryngoscope.

    Science.gov (United States)

    Bhola, Ruchi; Bhalla, Swaran; Gupta, Radha; Singh, Ishwar; Kumar, Sunil

    2014-05-01

    Literature suggests that the glottic view is better with the McGrath(®) Video laryngoscope and the Truview(®) laryngoscope than with the Macintosh blade. The purpose of this study was to evaluate the effectiveness of the McGrath Video laryngoscope in comparison with the Truview laryngoscope for tracheal intubation in patients with simulated cervical spine injury using manual in-line stabilisation. This prospective randomised study was undertaken in the operation theatre of a tertiary referral centre after approval from the Institutional Review Board. A total of 100 consenting patients presenting for elective surgery requiring tracheal intubation were randomly assigned to undergo intubation using the McGrath(®) Video laryngoscope (n = 50) or the Truview(®) laryngoscope (n = 50). In all patients, we applied manual in-line stabilisation of the cervical spine throughout the airway management. Statistical testing was conducted with the Statistical Package for the Social Sciences (SPSS), version 17.0. Demographic data, airway assessment and haemodynamics were compared using the Chi-square test. A P manual-in-line stabilisation with 100% success rate and good glottic view.

  2. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    International Nuclear Information System (INIS)

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D.; Dolence, Joshua; Sumiyoshi, Kohsuke; Yamada, Shoichi

    2017-01-01

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. We carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.
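
    The algebraic two-moment closures referred to above express the second angular moment as a function of the lower moments alone. One widely used example (given here purely as an illustration, not necessarily a closure tested in the paper) is Levermore's M1 closure, which interpolates the Eddington factor between the optically thick limit of 1/3 and the free-streaming limit of 1:

        import numpy as np

        def eddington_factor_m1(f):
            """Levermore (M1) closure: Eddington factor chi as a function of the
            flux factor f = |F| / (c E); chi(0) = 1/3 (isotropic diffusion) and
            chi(1) = 1 (free streaming)."""
            f = np.clip(f, 0.0, 1.0)
            return (3.0 + 4.0 * f**2) / (5.0 + 2.0 * np.sqrt(4.0 - 3.0 * f**2))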

  3. A Detailed Comparison of Multidimensional Boltzmann Neutrino Transport Methods in Core-collapse Supernovae

    Energy Technology Data Exchange (ETDEWEB)

    Richers, Sherwood; Nagakura, Hiroki; Ott, Christian D. [TAPIR, Walter Burke Institute for Theoretical Physics, Mail code 350-17, California Institute of Technology, Pasadena, CA 91125 (United States); Dolence, Joshua [CCS-2, Los Alamos National Laboratory, P.O. Box 1663 Los Alamos, NM 87545 (United States); Sumiyoshi, Kohsuke [Numazu College of Technology, Ooka 3600, Numazu, Shizuoka 410-8501 (Japan); Yamada, Shoichi, E-mail: srichers@tapir.caltech.edu [Advanced Research Institute for Science and Engineering, Waseda University, 3-4-1 Okubo, Shinjuku, Tokyo 169-8555 (Japan)

    2017-10-01

    The mechanism driving core-collapse supernovae is sensitive to the interplay between matter and neutrino radiation. However, neutrino radiation transport is very difficult to simulate, and several radiation transport methods of varying levels of approximation are available. We carefully compare for the first time in multiple spatial dimensions the discrete ordinates (DO) code of Nagakura, Yamada, and Sumiyoshi and the Monte Carlo (MC) code Sedonu, under the assumptions of a static fluid background, flat spacetime, elastic scattering, and full special relativity. We find remarkably good agreement in all spectral, angular, and fluid interaction quantities, lending confidence to both methods. The DO method excels in determining the heating and cooling rates in the optically thick region. The MC method predicts sharper angular features due to the effectively infinite angular resolution, but struggles to drive down noise in quantities where subtractive cancellation is prevalent, such as the net gain in the protoneutron star and off-diagonal components of the Eddington tensor. We also find that errors in the angular moments of the distribution functions induced by neglecting velocity dependence are subdominant to those from limited momentum-space resolution. We briefly compare directly computed second angular moments to those predicted by popular algebraic two-moment closures, and we find that the errors from the approximate closures are comparable to the difference between the DO and MC methods. Included in this work is an improved Sedonu code, which now implements a fully special relativistic, time-independent version of the grid-agnostic MC random walk approximation.

  4. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    International Nuclear Information System (INIS)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J; Perl, J; Piersimoni, P; Ramos-Mendez, J; Faddegon, B

    2016-01-01

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  5. WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPASNBio

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, A; Held, K; Paganetti, H; Schuemann, J [Massachusetts General Hospital & Harvard Med. School, Boston, MA (United States); Perl, J [Stanford Linear Accelerator Center, Menlo Park, CA (United States); Piersimoni, P; Ramos-Mendez, J; Faddegon, B [University of California, San Francisco, San Francisco, CA (United States)

    2016-06-15

    Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex

  6. McCarthyism and American Opera (L’Opéra américain face au McCarthyisme)

    Directory of Open Access Journals (Sweden)

    Klaus-Dieter Gross

    2009-11-01

    Full Text Available The anti-communist atmosphere that characterized the United States between the end of the Second World War and the end of the 1950s, culminating in the legal and extra-legal measures taken by the House Un-American Activities Committee and by Senator McCarthy, had a decisive influence on American opera. A few rare works show diffuse support for McCarthyism (Still and Nabokov), while others deny its impact by promoting left-wing ideas (Robinson and Blitzstein). Other works incorporate the fear of the Red Scare into their themes, but give it no more than secondary importance. Finally, three operas (Bernstein, Floyd and Ward) confront head-on the ritual workings and liberty-stifling methods of McCarthyism. The decline of the movement coincided, around the beginning of the 1960s, with the emergence of a less traditionalist and more abstract operatic style.

  7. The Influence of Brand Image and Perceived Quality on Consumer Trust in McDonald’s Bandung (Pengaruh Citra Merek dan Pandangan Kualitas Terhadap Kepercayaan Konsumen Pada McDonald’s Bandung)

    Directory of Open Access Journals (Sweden)

    Elsa Yolanda

    2016-03-01

    Full Text Available ABSTRACT - Building consumer trust in a company is very important; brand image and perceived quality are among the relevant variables for franchises operating in Indonesia such as McDonald's. The purpose of this study was to determine the influence of brand image and perceived quality on consumer trust. A sample of 100 respondents was used, namely consumers who purchase McDonald's products. The research method used descriptive and quantitative regression analysis, including validity and reliability tests, the classic assumption test, multiple regression analysis, the determination test, the correlation coefficient test, and the F test. The results prove that the two independent variables, brand image and perceived quality, have a positive and significant effect on the dependent variable, consumer trust in McDonald's. The greatest positive effect on consumer trust comes from the perceived quality variable. Keywords: brand image, perceived quality

  8. Tabulated square-shaped source model for linear accelerator electron beam simulation.

    Science.gov (United States)

    Khaledi, Navid; Aghamiri, Mahmood Reza; Aslian, Hossein; Ameri, Ahmad

    2017-01-01

    The aim of this study was to present a source model that makes linear accelerator (LINAC) electron beam geometry simulation less complex; using this source model, the Monte Carlo (MC) computation becomes much faster for electron beams. In this study, a tabulated square-shaped source with transversal and axial distribution biasing and a semi-Gaussian spectrum was investigated. A low energy photon spectrum was added to the semi-Gaussian beam to correct for bremsstrahlung X-ray contamination. After running the MC code multiple times and optimizing all spectra for four electron energies in three different medical LINACs (Elekta, Siemens, and Varian), the characteristics of a beam passing through a 10 cm × 10 cm applicator were obtained. The percentage depth doses and dose profiles at two different depths were measured and simulated. The maximum difference between simulated and measured percentage depth doses and dose profiles was 1.8% and 4%, respectively. The low energy electron and photon spectra, the Gaussian spectrum peak energy with its associated full width at half maximum, and the transversal distribution weightings were obtained for each electron beam. The proposed method yielded a computation time up to 702 times faster than a complete head simulation. Our study demonstrates excellent agreement between the results of our proposed model and measured data; furthermore, an optimum calculation speed was achieved because there was no need to define geometry and materials in the LINAC head.
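
    The spectral-sampling step of such a source model might look like the following sketch, where the peak energy, FWHM and contamination fraction are invented placeholders rather than the optimized values of the study:

        import numpy as np

        rng = np.random.default_rng()

        def sample_source_energy(n, e_peak=6.0, fwhm=0.6, photon_fraction=0.05):
            """Sample n particle energies (MeV): a semi-Gaussian electron peak plus
            a soft low-energy photon contamination tail (all parameters illustrative)."""
            sigma = fwhm / 2.3548  # convert FWHM to a Gaussian standard deviation
            is_photon = rng.random(n) < photon_fraction
            e_electron = np.abs(rng.normal(e_peak, sigma, n))
            e_photon = rng.exponential(0.5, n)  # bremsstrahlung-like soft spectrum
            return np.where(is_photon, e_photon, e_electron)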

  9. The adaptation method in the Monte Carlo simulation for computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)

    2015-06-15

    The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics use computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as those produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York, NY, USA) and a human-like voxel phantom (KTMAN-2; Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated, and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method is highly effective for simulations that require a large number of iterations. Assuming no radiation scattering in the vicinity of the detectors also minimized artifacts in the reconstructed image.
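
    The paper's deterministic adaptation is not spelled out in the abstract, but its flavor, replacing per-photon random histories with expected-value ray sums when scatter near the detector is neglected, can be conveyed by a toy Beer-Lambert projector over a voxelized attenuation map:

        import numpy as np

        def deterministic_projection(mu, pixel_size_cm, n_photons):
            """Expected detector counts for parallel rays through a 2D attenuation
            map mu (1/cm): each ray's transmission is exp(-line integral), so no
            stochastic photon tracking is required for the primary image."""
            line_integrals = mu.sum(axis=0) * pixel_size_cm  # rays along columns
            return n_photons * np.exp(-line_integrals)

        mu = np.zeros((256, 256))
        mu[96:160, 96:160] = 0.2  # toy water-like insert
        counts = deterministic_projection(mu, pixel_size_cm=0.1, n_photons=1e6)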

  10. A multiscale quantum mechanics/electromagnetics method for device simulations.

    Science.gov (United States)

    Yam, ChiYung; Meng, Lingyi; Zhang, Yu; Chen, GuanHua

    2015-04-07

    Multiscale modeling has become a popular tool for research in different areas including materials science, microelectronics, biology, and chemistry. In this tutorial review, we describe a newly developed multiscale computational method, incorporating quantum mechanics into electronic device modeling with the electromagnetic environment included through classical electrodynamics. In the quantum mechanics/electromagnetics (QM/EM) method, the regions of the system where active electron scattering processes take place are treated quantum mechanically, while the surroundings are described by Maxwell's equations and a semiclassical drift-diffusion model. The QM model and the EM model are solved, respectively, in different regions of the system in a self-consistent manner. Potential distributions and current densities at the interface between the QM and EM regions are employed as the boundary conditions for the quantum mechanical and electromagnetic simulations, respectively. The method is illustrated in the simulation of several realistic systems. In the case of junctionless field-effect transistors, transfer characteristics are obtained and good agreement between experiments and simulations is achieved. Optical properties of a tandem photovoltaic cell are studied, and the simulations demonstrate that multiple QM regions are coupled through the classical EM model. Finally, the study of a carbon nanotube-based molecular device shows the accuracy and efficiency of the QM/EM method.
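
    The self-consistent QM/EM coupling loop can be sketched schematically: the two solvers exchange an interface potential and an interface current density until neither changes. Both solver bodies below are stubs with invented linear responses, standing in for a real quantum transport code and a Maxwell/drift-diffusion code:

        def solve_qm(boundary_potential):
            # Stub QM region: interface current density induced by the imposed
            # boundary potential (placeholder linear response).
            return 0.8 * boundary_potential

        def solve_em(interface_current):
            # Stub EM region: boundary potential imposed on the QM region for a
            # given interface current load (placeholder).
            return 1.0 - 0.5 * interface_current

        v, tol = 0.0, 1e-10
        for iteration in range(200):
            j = solve_qm(v)      # QM solve under the current EM boundary condition
            v_new = solve_em(j)  # EM solve under the current QM interface current
            if abs(v_new - v) < tol:
                break            # self-consistency reached
            v = 0.5 * v + 0.5 * v_new  # damped update for stable convergence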

  11. Method for numerical simulation of two-term exponentially correlated colored noise

    International Nuclear Information System (INIS)

    Yilmaz, B.; Ayik, S.; Abe, Y.; Gokalp, A.; Yilmaz, O.

    2006-01-01

    A method for the numerical simulation of two-term exponentially correlated colored noise is proposed. The method is an extension of the traditional method for one-term exponentially correlated colored noise. The validity of the algorithm is tested by comparing numerical simulations with analytical results in two physical applications.
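
    The abstract does not reproduce the algorithm itself, but the standard exact update for the one-term case extends naturally to two terms by summing two independent Ornstein-Uhlenbeck processes; a sketch with illustrative parameters:

        import numpy as np

        rng = np.random.default_rng()

        def two_term_colored_noise(n_steps, dt, taus=(0.1, 1.0), sigmas=(1.0, 0.5)):
            """Sum of two independent exponentially correlated (OU) processes.
            Each term uses the exact one-step update
                eta <- eta*exp(-dt/tau) + sigma*sqrt(1 - exp(-2*dt/tau)) * N(0, 1),
            so the total autocorrelation is sum_i sigma_i^2 * exp(-|t|/tau_i)."""
            eta = np.zeros(2)
            out = np.empty(n_steps)
            for k in range(n_steps):
                for i, (tau, sigma) in enumerate(zip(taus, sigmas)):
                    rho = np.exp(-dt / tau)
                    eta[i] = eta[i] * rho + sigma * np.sqrt(1 - rho**2) * rng.standard_normal()
                out[k] = eta.sum()
            return out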

  12. Comparative Study on Two Melting Simulation Methods: Melting Curve of Gold

    International Nuclear Information System (INIS)

    Liu Zhong-Li; Li Rui; Sun Jun-Sheng; Zhang Xiu-Lu; Cai Ling-Cang

    2016-01-01

    Melting simulation methods are of crucial importance for determining the melting temperatures of materials efficiently. A high-efficiency melting simulation method saves much simulation time and computational resources. To compare the efficiency of our newly developed shock melting (SM) method with that of the well-established two-phase (TP) method, we calculate the high-pressure melting curve of Au using the two methods, based on optimally selected interatomic potentials. Although we use only 640 atoms to determine the melting temperature of Au in the SM method, the resulting melting curve accords very well with the results from the TP method using far more atoms. This shows that a much smaller system size in the SM method can still achieve a fully converged melting curve compared with the TP method, demonstrating the robustness and efficiency of the SM method. (paper)

  13. Dose point kernel simulation for monoenergetic electrons and radionuclides using Monte Carlo techniques.

    Science.gov (United States)

    Wu, J; Liu, Y L; Chang, S J; Chao, M M; Tsai, S Y; Huang, D E

    2012-11-01

    Monte Carlo (MC) simulation has been commonly used in the dose evaluation of radiation accidents and for medical purposes. The accuracy of simulated results is affected by the particle-tracking algorithm, cross-sectional database, random number generator and statistical error. The differences among MC simulation software packages must be validated. This study simulated the dose point kernel (DPK) and the cellular S-values of monoenergetic electrons ranging from 0.01 to 2 MeV and of the radionuclides (90)Y, (177)Lu and (103m)Rh, using Fluktuierende Kaskade (FLUKA) and the Monte Carlo N-Particle Transport Code Version 5 (MCNP5). A 6-μm-radius cell model consisting of the cell surface, cytoplasm and cell nucleus was constructed for the cellular S-value calculation. The mean absolute percentage errors (MAPEs) of the scaled DPKs simulated using FLUKA and MCNP5 were 7.92, 9.64, 4.62, 3.71 and 3.84% for 0.01, 0.1, 0.5, 1 and 2 MeV, respectively. For the three radionuclides, the MAPEs of the scaled DPKs were within 5%. The maximum deviations of S(N←N), S(N←Cy) and S(N←CS) for electron energies larger than 10 keV were 6.63, 6.77 and 5.24%, respectively. The deviations for the self-absorbed S-values and cross-dose S-values of the three radionuclides were within 4%. On the basis of the results of this study, it was concluded that the simulation results are consistent between FLUKA and MCNP5, with only a minor inconsistency in the low energy range. The DPK and the cellular S-value should be used as quality assurance tools before MC simulation results are adopted as the gold standard.
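
    The agreement metric quoted above, the mean absolute percentage error between two scaled DPK curves sampled on the same radial grid, is simple to state:

        import numpy as np

        def mape(simulated, reference):
            """Mean absolute percentage error (%) between two dose point kernels
            on a common radial grid; reference bins must be nonzero."""
            simulated = np.asarray(simulated, dtype=float)
            reference = np.asarray(reference, dtype=float)
            return 100.0 * np.mean(np.abs(simulated - reference) / np.abs(reference))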

  14. Methods for simulation-based analysis of fluid-structure interaction.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for the analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extending the methodology to allow construction of ROMs based on data generated from ALE simulations.
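
    In practice the POD step of such a ROM reduces to a singular value decomposition of a snapshot matrix; a minimal sketch, with the snapshot data assumed to come from a prior high-fidelity (e.g. ALE) simulation:

        import numpy as np

        def pod_basis(snapshots, energy=0.999):
            """POD modes of a snapshot matrix (n_dof x n_snapshots): left singular
            vectors ordered by singular value, truncated to capture the requested
            fraction of the snapshot energy (sum of squared singular values)."""
            mean = snapshots.mean(axis=1, keepdims=True)
            U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
            captured = np.cumsum(s**2) / np.sum(s**2)
            r = int(np.searchsorted(captured, energy)) + 1
            return U[:, :r]  # Galerkin projection then evolves q(t) = U_r^T (u - mean)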

  15. Aqueous-Phase Synthesis of Silver Nanodiscs and Nanorods in Methyl Cellulose Matrix: Photophysical Study and Simulation of UV–Vis Extinction Spectra Using DDA Method

    Directory of Open Access Journals (Sweden)

    Sarkar Priyanka

    2010-01-01

    Full Text Available Abstract We present a very simple and effective way to synthesize tunable coloured silver sols with different morphologies. The procedure is based on the seed-mediated growth approach, in which methyl cellulose (MC) is used as a soft template in the growth solution. Nanostructures of varying morphologies, as well as the colour of the silver sols, are controlled by altering the concentration of citrate in the growth solution. Like the polymers in the solution, citrate ions also dynamically adsorb on the growing silver nanoparticles and promote one-dimensional (1-D) and two-dimensional (2-D) growth of the nanoparticles. The silver nanostructures are characterized using UV–vis spectroscopy and HR-TEM. Simulation of the UV–vis extinction spectra of the synthesized silver nanostructures has been carried out using the discrete dipole approximation (DDA) method.
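
    Full DDA sums the coupled response of many point dipoles; the single-dipole quasi-static limit already conveys the idea and fits in a few lines. For a small sphere of radius a and relative permittivity eps in vacuum, the Clausius-Mossotti polarizability gives C_ext = 4*pi*k*Im(alpha) (CGS convention). The permittivity below is an illustrative value near the silver plasmon resonance, not tabulated optical data:

        import numpy as np

        def extinction_cross_section(radius_nm, wavelength_nm, eps):
            """Quasi-static single-dipole limit of DDA for a small sphere:
            alpha = a^3 * (eps - 1) / (eps + 2),  C_ext = 4*pi*k*Im(alpha)."""
            a_cm = radius_nm * 1e-7
            k = 2.0 * np.pi / (wavelength_nm * 1e-7)
            alpha = a_cm**3 * (eps - 1.0) / (eps + 2.0)
            return 4.0 * np.pi * k * np.imag(alpha)

        print(extinction_cross_section(10.0, 400.0, eps=-4.0 + 0.7j))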

  16. Benchmarking HRA methods against different NPP simulator data

    International Nuclear Information System (INIS)

    Petkov, Gueorgui; Filipov, Kalin; Velev, Vladimir; Grigorov, Alexander; Popov, Dimiter; Lazarov, Lazar; Stoichev, Kosta

    2008-01-01

    The paper presents both international and Bulgarian experience in assessing HRA methods and their underlying models, and approaches for their validation and verification by benchmarking HRA methods against data from different NPP simulators. The organization, status, methodology and outlook of the studies are described.

  17. Hybrid statistics-simulations based method for atom-counting from ADF STEM images

    Energy Technology Data Exchange (ETDEWEB)

    De wael, Annelies, E-mail: annelies.dewael@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); De Backer, Annick [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Jones, Lewys; Nellist, Peter D. [Department of Materials, University of Oxford, Parks Road, OX1 3PH Oxford (United Kingdom); Van Aert, Sandra, E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2017-06-15

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. - Highlights: • A hybrid method for atom-counting from ADF STEM images is introduced. • Image simulations are incorporated into a statistical framework in a reliable manner. • Limits of the existing methods for atom-counting are far exceeded. • Reliable counting results from an experimental low dose image are obtained. • Progress towards reliable quantitative analysis of beam-sensitive materials is made.
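
    The statistics-based part of such atom-counting fits a Gaussian mixture to the scattered intensities of the atomic columns, and the hybrid ingredient is that prior knowledge from image simulations constrains the fit. A sketch of that idea, initializing the component locations from simulated per-thickness intensities (all numbers invented):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Simulated library: expected column intensity for 1..5 atoms (illustrative)
        simulated_means = np.array([0.10, 0.19, 0.27, 0.34, 0.40]).reshape(-1, 1)

        def count_atoms(column_intensities):
            """Assign an atom count to each column by fitting a Gaussian mixture
            whose components start at the simulation-predicted intensities."""
            X = np.asarray(column_intensities, dtype=float).reshape(-1, 1)
            gmm = GaussianMixture(n_components=len(simulated_means),
                                  means_init=simulated_means).fit(X)
            order = np.argsort(gmm.means_.ravel())  # component index -> atom count
            counts = np.empty(len(order), dtype=int)
            counts[order] = np.arange(1, len(order) + 1)
            return counts[gmm.predict(X)]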

  18. Multi-time scale Climate Informed Stochastic Hybrid Simulation-Optimization Model (McISH model) for Multi-Purpose Reservoir System

    Science.gov (United States)

    Lu, M.; Lall, U.

    2013-12-01

    decadal flow simulations are re-initialized every year with updated climate projections to improve the reliability of the operation rules for the next year, within which the seasonal operation strategies are nested. The multi-level structure can be repeated for monthly operation with weekly subperiods to take advantage of evolving weather forecasts and seasonal climate forecasts. As a result of the hierarchical structure, updates and adjustments at sub-seasonal and even weather time scales can be achieved. Given an ensemble of these scenarios, the McISH reservoir simulation-optimization model is able to derive the desired reservoir storage levels, including minimum and maximum, as a function of calendar date, and the associated release patterns. The multi-time scale approach allows adaptive management of water supplies that acknowledges the changing risks, meeting the objectives over the decade in expected value while controlling the near-term and planning-period risk through probabilistic reliability constraints. For the applications presented, the target season is the monsoon season from June to September. The model also includes a monthly flood volume forecast model, based on a Copula density fit to the monthly flow and the flood volume, which is used to guide dynamic allocation of the flood control volume given the forecasts.
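
    The Copula-based flood volume forecast mentioned at the end can be sketched with a Gaussian copula: fit the dependence between monthly flow and flood volume on the normal-score scale, then simulate flood volumes conditional on a flow forecast through the empirical marginals. The data arrays and forecast value are assumed given; this illustrates the general technique, not the authors' fitted model:

        import numpy as np
        from scipy import stats

        def conditional_flood_sample(flow_obs, flood_obs, flow_forecast, n=1000, rng=None):
            """Gaussian copula between monthly flow and flood volume: transform to
            normal scores, draw from the conditional normal, then map back through
            the empirical flood-volume marginal."""
            if rng is None:
                rng = np.random.default_rng()
            z_flow = stats.norm.ppf(stats.rankdata(flow_obs) / (len(flow_obs) + 1))
            z_flood = stats.norm.ppf(stats.rankdata(flood_obs) / (len(flood_obs) + 1))
            rho = np.corrcoef(z_flow, z_flood)[0, 1]
            # Normal score of the forecast within the historical flow marginal
            u = (np.searchsorted(np.sort(flow_obs), flow_forecast) + 0.5) / (len(flow_obs) + 1)
            z0 = stats.norm.ppf(u)
            z = rng.normal(rho * z0, np.sqrt(1.0 - rho**2), n)  # conditional draws
            return np.quantile(flood_obs, stats.norm.cdf(z))    # back to flood units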

  19. On coincidence of Pettis and McShane integrability

    Czech Academy of Sciences Publication Activity Database

    Fabian, Marián

    2015-01-01

    Roč. 65, č. 1 (2015), s. 83-106 ISSN 0011-4642 R&D Projects: GA ČR(CZ) GAP201/12/0290 Institutional support: RVO:67985840 Keywords : Pettis integral * McShane integral * MC-filling family Subject RIV: BA - General Mathematics Impact factor: 0.284, year: 2015 http://link.springer.com/article/10.1007/s10587-015-0161-x

  20. Hesburger goes on the attack, McDonald's digs trenches (Hesburger asub ründama, McDonalds teeb kaevikuid) / Andres Reimer

    Index Scriptorium Estoniae

    Reimer, Andres

    2005-01-01

    Hesburger wants to triple the number of its restaurants in Estonia. McDonald's is not expanding in Estonia, but intends to increase the company's efficiency. The Estonian hamburger restaurant chain Nehatu has dropped out of the competition. Tables: Results; Hamburger restaurants in Estonia; Hamburger and McDonald's worldwide. Supplements: Became a Hesburger partner; Become a McDonald's partner