A standard event class for Monte Carlo generators
StdHepC++ is a CLHEP Monte Carlo event class library which provides a common interface to Monte Carlo event generators. This work is an extensive redesign of the StdHep Fortran interface to use the full power of object-oriented design. A generated event maps naturally onto the Directed Acyclic Graph concept, and we have used the HepMC classes to implement this. The full implementation allows the user to combine events to simulate beam pileup and to access them transparently as though they were a single event.
Monte Carlo event generators for hadron-hadron collisions
Knowles, I.G. [Argonne National Lab., IL (United States). High Energy Physics Div.]; Protopopescu, S.D. [Brookhaven National Lab., Upton, NY (United States)]
1993-06-01
A brief review of Monte Carlo event generators for simulating hadron-hadron collisions is presented. Particular emphasis is placed on comparisons of the approaches used to describe physics elements and identifying their relative merits and weaknesses. This review summarizes a more detailed report.
MEVSIM A Monte Carlo Event Generator for STAR
Ray, R L
2000-01-01
A fast, simple to use Monte Carlo based event generator is presented which is intended to facilitate simulation studies and the development of analysis software for the Solenoidal Tracker at RHIC (Relativistic Heavy Ion Collider) (STAR) experiment at the Brookhaven National Laboratory (BNL). This new event generator provides a fast, convenient means for producing large numbers of uncorrelated A+A collision events which can be used for a variety of applications in STAR, including quality assurance evaluation of event reconstruction software, determination of detector acceptances and tracking efficiencies, physics analysis of event-by-event global variables, studies of strange, rare and exotic particle reconstruction, and so on. The user may select the number of events, the particle types, the multiplicities, the one-body momentum space distributions and the detector acceptance ranges. The various algorithms used in the code and its capabilities are explained. Additional user information is also discussed. The ...
Monte-Carlo event generation for the LHC
Siegert, Frank
This thesis discusses recent developments for the simulation of particle physics in the light of the start-up of the Large Hadron Collider. Simulation programs for fully exclusive events, dubbed Monte-Carlo event generators, are improved in areas related to the perturbative as well as non-perturbative regimes of the strong interaction. A short introduction to the main principles of event generation is given to serve as a basis for the following discussion. An existing algorithm for the correction of parton-shower emissions with the help of exact tree-level matrix elements is revisited and significantly improved, as attested by first results. In a next step, an automated implementation of the POWHEG method is presented. It allows for the combination of parton showers with full next-to-leading order QCD calculations and has been tested in several processes. These two methods are then combined into a more powerful framework which makes it possible to correct a parton shower with full next-to-leading order matrix elements and h...
Top Quark Mass Calibration for Monte Carlo Event Generators
Butenschoen, Mathias; Hoang, Andre H; Mateu, Vicent; Preisser, Moritz; Stewart, Iain W
2016-01-01
The most precise top quark mass measurements use kinematic reconstruction methods, determining the top mass parameter of a Monte Carlo event generator, $m_t^{\rm MC}$. Due to hadronization and parton shower dynamics, relating $m_t^{\rm MC}$ to a field theory mass is difficult. We present a calibration procedure to determine this relation using hadron level QCD predictions for observables with kinematic mass sensitivity. Fitting $e^+e^-$ 2-Jettiness calculations at NLL/NNLL order to Pythia 8.205, $m_t^{\rm MC}$ differs from the pole mass by $900$/$600$ MeV, and agrees with the MSR mass within uncertainties, $m_t^{\rm MC} \simeq m_{t,1\,{\rm GeV}}^{\rm MSR}$.
How-to: Write a parton-level Monte Carlo event generator
Papaefstathiou, Andreas
2014-01-01
This article provides an introduction to the principles of particle physics event generators that are based on the Monte Carlo method. Following some preliminaries, instructions on how to build a basic parton-level Monte Carlo event generator are given through exercises.
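The core building block such a how-to starts from is hit-or-miss (acceptance-rejection) sampling of a differential cross section. A minimal sketch, assuming the textbook tree-level angular distribution dN/dcos(θ) ∝ 1 + cos²θ for e⁺e⁻ → μ⁺μ⁻ (the function and parameter names are illustrative, not from the article):

```python
import math
import random

def generate_events(n_events, seed=42):
    """Toy parton-level generator: sample the muon polar angle in
    e+ e- -> mu+ mu- from the tree-level angular distribution
    dN/dcos(theta) ~ 1 + cos^2(theta) by hit-or-miss Monte Carlo."""
    rng = random.Random(seed)
    f_max = 2.0  # maximum of 1 + cos^2(theta) on [-1, 1]
    events = []
    while len(events) < n_events:
        cos_theta = rng.uniform(-1.0, 1.0)         # trial point, flat in cos(theta)
        if rng.uniform(0.0, f_max) < 1.0 + cos_theta ** 2:
            phi = rng.uniform(0.0, 2.0 * math.pi)  # azimuth is flat
            events.append((cos_theta, phi))
    return events

events = generate_events(10000)
```

The accepted points are distributed according to the target density; the acceptance rate also yields the total cross section when multiplied by the sampling volume and f_max.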
Reliability of QCD Monte-Carlo event generators
The author examines the extent to which Monte-Carlo simulations reproduce the predictions of perturbative QCD, especially in the case of very high energy hadron-hadron scattering. Although the Monte-Carlos have great success in reproducing most of the qualitative features of the theory, they do not fully incorporate even the leading logarithmic approximation. Work is needed to provide a systematic method for the inclusion of both leading and non-leading effects.
Validation of Monte Carlo event generators in the ATLAS Collaboration for LHC Run 2
The ATLAS collaboration
2016-01-01
This note reviews the main steps followed by the ATLAS Collaboration to validate the properties of particle-level simulated events from Monte Carlo event generators in order to ensure the correctness of all event generator configurations and production samples used in physics analyses. A central validation procedure is adopted which permits the continual validation of the functionality and the performance of the ATLAS event simulation infrastructure. Revisions and updates of the Monte Carlo event generators are also monitored. The methodology behind the validation and the tools developed for that purpose, as well as various use cases, are presented. The strategy has proven to play an essential role in identifying possible problems or unwanted features within a restricted timescale, verifying their origin and pointing to possible bug fixes before full-scale processing is initiated.
Monte Carlo simulation of primary reactions on HPLUS based on pluto event generator
Hadron Physics Lanzhou Spectrometer (HPLUS) is designed for the study of hadron production and decay from nucleon-nucleon interactions in the GeV region. The current configuration of HPLUS and the particle identification methods for three polar angle regions are discussed. The Pluto event generator is applied to simulate the primary reactions on HPLUS, addressing four issues as follows: the agreement of the pp elastic scattering angular distribution between Pluto samples and experimental data; the acceptance of charged K mesons in the strangeness production channels for the forward region of HPLUS; the dependence of the maximum energy of photons and of the minimum vertex angle of two photons on the polar angle; and the influence of different reconstruction methods on the mass spectrum of excited states of the nucleon with large resonant width. It is shown that the Pluto event generator satisfies the requirements of Monte Carlo simulation for HPLUS. (authors)
An object-oriented framework for the hadronic Monte-Carlo event generators
We advocate the development of an object-oriented framework for hadronic Monte-Carlo (MC) event generators. The hadronic MC user and developer requirements are discussed, as well as the commonalities among hadronic models. It is argued that the development of a framework favours taking these model commonalities into account, since common components are stable and need to be developed only once. Such a framework can provide different possibilities to make the user session more convenient and productive, e.g., easy access to and editing of any model parameter, substitution of model components by alternative components without changing the code, and customized output offering either full information about the history of the generated event or specific information about the reaction final state. Such a framework can indeed increase the productivity of a hadronic model developer, particularly due to the formalization of the hadronic model component structure and of the collaborations between model components. A framework based on the component approach opens a way to organize a library of hadronic model components, which can be considered as a pool of hadronic model building blocks. Basic features, code structure and working examples of the first framework version for hadronic MC models, which has been built as a starting point, are briefly explained.
Report on the work of the 'Monte Carlo Event Generation' subgroup
The work of the 'Monte Carlo Event Generation' subgroup includes the preparation of programs, jet simulation, track generation in chambers, and the pattern recognition of tracks and track fitting. Some general results from the jet simulation by Ali et al. are given. The total energy used was 60 GeV, and the top quark mass was assumed to be 25 GeV. The multiplicity of charged particles and photons is shown; the multiplicity increases with the number of jets. The energy spectra and the trajectories of charged particles and photons were obtained. The distribution of the opening angle between any two photons is also presented. The track generation program used is GEANT from CERN, adapted to the KEK computer. Pattern recognition and track fitting are based on the tracking device; the program considered is that of the DELCO group at SLAC. The tracking device consists of an MWPC and a cylindrical drift chamber with wires along the beam direction Z and wires inclined at a stereo angle. Some comments on vertex detectors are given. (Kato, T.)
PHANTOM: A Monte Carlo event generator for six parton final states at high energy colliders
Ballestrero, Alessandro; Belhouari, Aissa; Bevilacqua, Giuseppe; Kashkan, Vladimir; Maina, Ezio
2009-03-01
PHANTOM is a tree level Monte Carlo for six parton final states at proton-proton, proton-antiproton and electron-positron colliders at O(α_EM^6) and O(α_EM^4 α_S^2), including possible interferences between the two sets of diagrams. This comprises all purely electroweak contributions as well as all contributions with one virtual or two external gluons. It can generate unweighted events for any set of processes and is interfaced to parton shower and hadronization packages via the latest Les Houches Accord protocol. It can be used to analyze the physics of boson-boson scattering, Higgs boson production in boson-boson fusion, t t-bar and three boson production.
Program summary
Program title: PHANTOM (V. 1.0)
Catalogue identifier: AECE_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECE_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 175 787
No. of bytes in distributed program, including test data, etc.: 965 898
Distribution format: tar.gz
Programming language: Fortran 77
Computer: Any with a UNIX, LINUX compatible Fortran compiler
Operating system: UNIX, LINUX
RAM: 500 MB
Classification: 11.1
External routines: LHAPDF (Les Houches Accord PDF Interface, http://projects.hepforge.org/lhapdf/), CIRCE (beamstrahlung for the ee ILC collider)
Nature of problem: Six fermion final state processes have become important with the increase of collider energies and are essential for the study of top, Higgs and electroweak symmetry breaking physics at high energy colliders. Since thousands of Feynman diagrams contribute to a single process and events corresponding to hundreds of different final states need to be generated, a fast and stable calculation is needed.
Solution method: PHANTOM is a tree level Monte Carlo for six parton final states at proton-proton, proton-antiproton and
ExHuME 1.3: A Monte Carlo event generator for exclusive diffraction
Monk, J.; Pilkington, A.
2006-08-01
We have written the Exclusive Hadronic Monte Carlo Event (ExHuME) generator. ExHuME is based around the perturbative QCD calculation by Khoze, Martin and Ryskin of the process pp → p + X + p, where X is a centrally produced colour singlet system.
Program summary
Title of program: ExHuME
Catalogue identifier: ADYA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYA_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: None
Programming language used: C++, some FORTRAN
Computer: Any computer with UNIX capability. Users should refer to the README file distributed with the source code for further details
Operating system: Linux, Mac OS X
No. of lines in distributed program, including test data, etc.: 111 145
No. of bytes in distributed program, including test data, etc.: 791 085
Distribution format: tar.gz
RAM: 60 MB
External routines/libraries: LHAPDF [http://durpdg.dur.ac.uk/lhapdf/], CLHEP v1.8 or v1.9 [L. Lönnblad, Comput. Phys. Comm. 84 (1994) 307; http://wwwinfo.cern.ch/asd/lhc++/clhep/]
Subprograms used: Pythia [T. Sjostrand et al., Comput. Phys. Comm. 135 (2001) 238], HDECAY [A. Djouadi, J. Kalinowski, M. Spira, HDECAY: A program for Higgs boson decays in the standard model and its supersymmetric extension, Comput. Phys. Comm. 108 (1998) 56, hep-ph/9704448]. Both are distributed with the source code
Nature of problem: Central exclusive production offers the opportunity to study particle production in a uniquely clean environment for a hadron collider. This program implements the KMR model [V.A. Khoze, A.D. Martin, M.G. Ryskin, Prospects for New Physics observations in diffractive processes at the LHC and Tevatron, Eur. Phys. J. C 23 (2002) 311, hep-ph/0111078], which is the only fully perturbative model of exclusive production.
Solution method: Monte Carlo techniques are used to produce the central exclusive parton level system. Pythia routines are then used to develop a realistic hadronic system
Automating methods to improve precision in Monte-Carlo event generation for particle colliders
Gleisberg, Tanju
2008-07-01
The subject of this thesis is the development of tools for the automated calculation of exact matrix elements, which are key to the systematic improvement of the precision of, and confidence in, theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson to massless gauge bosons, required for a number of channels in the Higgs boson search at the LHC, and anomalous gauge couplings, parameterizing a number of models beyond the SM. Furthermore, a special treatment for complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented: the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for the event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from these new developments, improving both precision and efficiency. Part II addresses the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove
PING Rong-Gang
2008-01-01
We present a brief remark on and introduction to the event generators for tau-charm physics currently used at BESIII, including KKMC, BesEvtGen, Bhlumi, Bhwide, Babayaga and inclusive Monte-Carlo event generators. This paper provides basic information on event generators for BESIII users.
Gutschow, Christian; The ATLAS collaboration
2016-01-01
The Monte Carlo setups used by ATLAS to model boson+jets and multi-boson processes in 13 TeV pp collisions are described. Comparisons between data and several event generators are provided for key kinematic distributions at 7 TeV, 8 TeV and 13 TeV. Issues associated with sample normalisation and the evaluation of systematic uncertainties are also discussed.
Leung, Yue Hang; Dion, Alan; Drees, Axel; Sharma, Deepali
2015-10-01
Heavy flavor is one of the most sought-after observables for studying the properties of the hot and dense medium created in heavy-ion collisions. A variety of heavy-flavor (charm and bottom) related measurements have been made at RHIC in different collision systems and at different collision energies. However, the total and differential charm and bottom cross-sections are still not understood in detail. We present a comprehensive study of all the heavy flavor measurements in p+p collisions at RHIC at √(s_NN) = 200 GeV. We compare the measured charm and bottom p_T, rapidity, and correlation distributions to three different Monte-Carlo event generators: PYTHIA, MC@NLO and POWHEG. Various data sets are fitted to the spectral shapes from these event generators with the charm and bottom cross-sections as free parameters. Although the spectral shapes are well described in general, the normalizations of the simulated samples differ between data sets describing different regions of phase space. These measurements suggest that while current Monte-Carlo event generators describe experimental data near mid-rapidity, they are inconsistent when compared over a wide range in phase space.
Monte Carlo event generation of photon-photon collisions at colliders
Helenius, Ilkka
2015-01-01
In addition to being interesting in themselves, photon-photon interactions will be an inevitable background at future electron-positron colliders. Thus, to be able to quantify the potential of future electron-positron colliders, it is important to have an accurate description of these collisions. Here we present our ongoing work to implement photon-photon collisions in the Pythia 8 event generator. First we introduce photon PDFs in general and then discuss in more detail one particular set we have used in our studies. We then discuss how the parton-shower algorithm in Pythia 8 is modified in the case of photon beams and how the beam remnants are constructed. Finally, a brief outlook on future developments is given.
A Monte Carlo event generator is implemented for a two-Higgs-doublet model with maximal CP symmetry, the MCPM. The model contains five physical Higgs bosons: the ρ', behaving similarly to the standard-model Higgs boson; two extra neutral bosons h' and h''; and a charged pair H±. The special feature of the MCPM is that, concerning the Yukawa couplings, the bosons h', h'' and H± couple directly only to the second-generation fermions, but with strengths given by the third-generation fermion masses. Our event generator allows the simulation of the Drell-Yan-type production processes of h', h'' and H± in proton-proton collisions at LHC energies. The subsequent leptonic decays of these bosons into the μ+μ-, μ+ν_μ and μ- anti-ν_μ channels are also studied, as well as the dominant background processes. We estimate the integrated luminosities needed in pp collisions at center-of-mass energies of 8 and 14 TeV for significant observations of the Higgs bosons h', h'' and H± in these muonic channels. (orig.)
Use of Monte Carlo Event Generators for the study of 13 TeV pp collisions by ATLAS
Thompson, Paul; The ATLAS collaboration
2016-01-01
The use of NLO and multileg Monte Carlo generators by the ATLAS experiment in the analysis of 13 TeV data is discussed. Procedures to validate these generators by comparing results obtained using data collected at 7 TeV, 8 TeV and 13 TeV to the generator predictions are described. Techniques used to evaluate systematic uncertainties on Monte Carlo modelling are also discussed.
Wilk, G. [Soltan Inst. for Nuclear Studies, Otwock-Swierk (Poland)]; Wlodarczyk, Z. [Wyzsza Szkola Pedagogiczna, Kielce (Poland)]; Weiner, R.M. [Marburg Univ. (Germany). Physikalisches Inst.]
1993-09-01
The quantum statistical concepts of coherence and chaos used to describe fluctuations in multiparticle production processes are exhibited in a generalized version of the Interacting Gluon Model (IGM). The two components in the multiplicity distributions found previously are derived in this approach as originating from the two components of the energy flow existing in the IGM: the one provided by the interacting gluons and the other containing the initial valence quarks of the projectiles. The first Monte Carlo event generator with coherence and chaos built in is then presented and successfully used to describe the pp and p p-bar data. It is then generalized to hA and AA collisions and compared with data. (author). 62 refs, 16 figs, 1 tab.
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and the $\pi^+$ two-dimensional energy vs. cosine distribution.
Monte Carlo generators in ATLAS software
This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ and uses Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are in general written by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena, and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance from a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation also accesses the truth information from StoreGate. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general purpose generators, including Pythia6, Pythia8, Herwig, Herwig++ and Sherpa, are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exists for the so-called matrix element generators, which only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.
Due to their ability to provide detailed and quantitative predictions, event generators have become an important part of studying relativistic heavy ion physics and of designing future experiments. In this talk, the author briefly summarizes recent progress in developing event generators for relativistic heavy ion collisions.
Rare event simulation using Monte Carlo methods
Rubino, Gerardo
2009-01-01
In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank, or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, the simulation of the corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented, along with an exposition of how to apply these tools to a variety of fields ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
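The importance-sampling idea mentioned above can be shown in a few lines. A minimal sketch, assuming the standard textbook example of estimating P(X > 6) for X ~ N(0,1) with an exponentially tilted (mean-shifted) proposal; all function names here are illustrative:

```python
import math
import random

def rare_prob_naive(threshold, n, seed=1):
    """Crude Monte Carlo estimate of P(X > threshold) for X ~ N(0,1);
    for large thresholds it almost always returns 0 (no hits at all)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > threshold)
    return hits / n

def rare_prob_importance(threshold, n, seed=1):
    """Importance sampling: draw from N(threshold, 1), which concentrates
    samples in the rare region, and reweight each hit by the likelihood
    ratio phi(x) / phi(x - threshold) = exp(threshold^2/2 - threshold*x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)   # shifted proposal density
        if x > threshold:
            total += math.exp(0.5 * threshold ** 2 - threshold * x)
    return total / n

p_naive = rare_prob_naive(6.0, 100_000)
p_is = rare_prob_importance(6.0, 100_000)
```

With 10^5 samples the naive estimator essentially never sees the event (its probability is about 10^-9), while the reweighted estimator recovers it with a few-percent relative error.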
The CCFM Monte Carlo generator CASCADE 2.2.0
Jung, H; Deak, M; Grebenyuk, A; Hautmann, F; Hentschinski, M; Knutsson, A; Kraemer, M; Kutak, K; Lipatov, A; Zotov, N
2010-01-01
CASCADE is a full hadron level Monte Carlo event generator for ep, γp, p p-bar and pp processes, which uses the CCFM evolution equation for the initial state cascade in a backward evolution approach, supplemented with off-shell matrix elements for the hard scattering. A detailed program description is given, with emphasis on the parameters the user may want to change and the variables which completely specify the generated events.
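The evolution step underlying such cascade generators can be illustrated in miniature. A sketch, not CASCADE's actual CCFM implementation: successive branching scales are drawn from a Sudakov form factor by inverse transform sampling, here with a constant toy emission density ALPHA replacing the real evolution kernel (all names and numbers are assumptions for illustration):

```python
import random

# Toy shower-evolution sampler. With emission density dP = ALPHA * dt / t,
# the no-emission (Sudakov) probability from t_start down to t is
# Delta = (t / t_start)**ALPHA, so setting Delta = r and solving gives
# the next branching scale t = t_start * r**(1/ALPHA).
ALPHA = 0.2      # toy constant emission density per unit log(t)
T_MAX = 100.0    # starting (hard) scale, arbitrary units
T_MIN = 1.0      # cutoff scale where evolution stops

def next_branching_scale(t_start, rng):
    """Inverse-transform sample of the next (lower) branching scale;
    returns None once the cascade falls below the cutoff."""
    t = t_start * rng.random() ** (1.0 / ALPHA)
    return t if t > T_MIN else None

def evolve(rng):
    """Generate the ordered (decreasing) branching scales of one cascade."""
    scales, t = [], T_MAX
    while (t := next_branching_scale(t, rng)) is not None:
        scales.append(t)
    return scales

rng = random.Random(11)
cascades = [evolve(rng) for _ in range(5000)]
```

The mean number of emissions comes out close to the analytic value ALPHA * ln(T_MAX / T_MIN); a real generator replaces the constant ALPHA by the scale-dependent splitting kernel and solves the Sudakov by the veto algorithm instead.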
We present a Monte Carlo program, EPOCS (Electron POsitron Collision Simulator), which generates e+e- events in the high energy region that will be explored by the TRISTAN project. Special emphasis is put on the effects of the Z0 and possible top quark resonances. The user can control the simulation by selecting the energy and other parameters, and can easily incorporate a new process and/or new particles into the program. The central part of this report is a detailed description of the structure and usage of the program. We would like to stress that the hadronization is based on a number of assumptions, which are made as clear as possible here. (author)
Higgs boson events and background at LEP. A Monte Carlo study
Higgs boson production at LEP via e+e- → Z0 → H0 + e+e- has been studied by Monte Carlo generation of events with realistic measurement errors added. The results show the recoil mass (Higgs boson mass) resolution to be reasonably good for boson masses greater than 5 GeV. The events are found to populate a phase space region free of physical background for all boson masses below about 35 GeV. For masses above 40 GeV the Higgs boson signal merges with the physical background produced by semileptonic decays of heavy flavour quarks while diminishing in strength to low levels. The geometrical acceptance of a detector like DELPHI is about 80 per cent for Higgs boson events. (Author)
Academic Training: Monte Carlo generators for the LHC
Françoise Benz
2005-01-01
2004-2005 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 4, 5, 6, 7 April from 11.00 to 12.00 hrs - Main Auditorium, bldg. 500 Monte Carlo generators for the LHC T. SJOSTRAND / CERN-PH, Lund Univ. SE Event generators today are indispensable as tools for the modelling of complex physics processes, that jointly lead to the production of hundreds of particles per event at LHC energies. Generators are used to set detector requirements, to formulate analysis strategies, or to calculate acceptance corrections. These lectures describe the physics that goes into the construction of an event generator, such as hard processes, initial- and final-state radiation, multiple interactions and beam remnants, hadronization and decays, and how these pieces come together. The current main generators are introduced, and are used to illustrate uncertainties in the physics modelling. Some trends for the future are outlined.
HepMCAnalyser: A tool for Monte Carlo generator validation
HepMCAnalyser is a tool for Monte Carlo (MC) generator validation and comparisons. It is a stable, easy-to-use and extendable framework allowing easy access to and integration of generator-level analysis. It comprises a class library with benchmark physics processes to analyse MC generator HepMC output and to fill ROOT histograms. A web interface is provided to display all or selected histograms, compare them to references, and validate the results based on Kolmogorov tests. Steerable example programs can be used for event generation. The default steering is tuned to optimally align the distributions of the different MC generators. The tool will be used for MC generator validation by the Generator Services (GENSER) LCG project, e.g. for version upgrades. It is supported on the same platforms as the GENSER libraries and is already in use at ATLAS.
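The statistical core of such a validation, comparing a new sample against a reference with a Kolmogorov-Smirnov test, is easy to sketch. A pure-stdlib illustration of the idea (the real tool compares ROOT histograms; all names here are assumptions):

```python
import bisect
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in a + b:  # the ECDF difference is extremal at sample points
        cdf_a = bisect.bisect_right(a, x) / len(a)
        cdf_b = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

rng = random.Random(0)
reference = [rng.gauss(0.0, 1.0) for _ in range(2000)]   # "old generator"
candidate = [rng.gauss(0.0, 1.0) for _ in range(2000)]   # compatible sample
shifted = [x + 1.0 for x in candidate]                   # broken sample
```

A small statistic flags agreement between versions, while a shifted (miscalibrated) distribution produces a large one, which is exactly the pass/fail signal used for automated version-upgrade validation.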
Event generator for pp interactions
A phenomenological event generator for pp interactions at energy Elab = 70 GeV has been created. It is based on a Gaussian form of the matrix element of the interaction. The final states involve two protons and pions. The parameters of the generator are fitted to experimental cross-section data. The energy and momentum conservation laws are strictly satisfied. The event generator ensures the smallness of the transverse momenta of the final-state particles.
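Exact energy-momentum conservation in a toy final state can be enforced, for example, by shifting the sampled momenta to zero total 3-momentum and then rescaling them by bisection until the total energy matches the collision energy. This Python sketch is an illustration only; the Gaussian widths and the rescaling scheme are assumptions, not the algorithm of the paper:

```python
import math
import random

def generate_event(masses, ecm, pt_sigma=0.1, pz_sigma=1.0):
    """Toy final state with exact energy-momentum conservation.
    Momenta are sampled with a small transverse spread, shifted to zero
    total 3-momentum, then rescaled until the total energy equals ecm.
    All parameter values are illustrative assumptions."""
    n = len(masses)
    p = [[random.gauss(0.0, pt_sigma), random.gauss(0.0, pt_sigma),
          random.gauss(0.0, pz_sigma)] for _ in range(n)]
    for k in range(3):                       # zero the total 3-momentum
        shift = sum(q[k] for q in p) / n
        for q in p:
            q[k] -= shift

    def etot(s):                             # total energy at scale factor s
        return sum(math.sqrt(m * m + s * s * (q[0]**2 + q[1]**2 + q[2]**2))
                   for m, q in zip(masses, p))

    lo, hi = 0.0, 1.0
    while etot(hi) < ecm:                    # bracket the scale factor
        hi *= 2.0
    for _ in range(60):                      # bisect: etot(s) is monotone in s
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if etot(mid) < ecm else (lo, mid)
    s = 0.5 * (lo + hi)
    return [(math.sqrt(m * m + s * s * sum(c * c for c in q)),
             s * q[0], s * q[1], s * q[2]) for m, q in zip(masses, p)]

random.seed(7)
# pp -> p p pi+ pi- at a CM energy of roughly sqrt(2 m_p E_lab) for E_lab = 70 GeV
event = generate_event([0.938, 0.938, 0.140, 0.140], ecm=11.5)
```

The shift preserves nothing but conservation; the overall scale factor then fixes the energy, so both conservation laws hold to floating-point precision.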
New developments in event generator tuning techniques
Buckley, Andy; Lacker, Heiko; Schulz, Holger; von Seggern, Jan Eike
2010-01-01
Data analyses in hadron collider physics depend on background simulations performed by Monte Carlo (MC) event generators. However, calculational limitations and non-perturbative effects require approximate models with adjustable parameters. In fact, we need to simultaneously tune many phenomenological parameters in a high-dimensional parameter-space in order to make the MC generator predictions fit the data. It is desirable to achieve this goal without spending too much time or computing resources iterating parameter settings and comparing the same set of plots over and over again. We present extensions and improvements to the MC tuning system, Professor, which addresses the aforementioned problems by constructing a fast analytic model of a MC generator which can then be easily fitted to data. Using this procedure it is for the first time possible to get a robust estimate of the uncertainty of generator tunings. Furthermore, we can use these uncertainty estimates to study the effect of new (pseudo-) data on t...
Optimization of next-event estimation probability in Monte Carlo shielding calculations
In Monte Carlo radiation transport calculations with point detectors, the next-event estimation is employed to estimate the response to each detector from all collision sites. The computation time required for this estimation process is substantial and often exceeds the time required to generate and process particle histories in a calculation. This estimation from all collision sites is, therefore, very wasteful in Monte Carlo shielding calculations. For example, in the source region and in regions far away from the detectors, the next-event contribution of a particle is often very small and insignificant. A method for reducing this inefficiency is described
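The point-detector next-event estimator, and one standard way to curb its cost, can be sketched as follows: each collision contributes the uncollided flux w * exp(-Σt r) / (4πr²) at the detector, and insignificant far-away contributions can be skipped without bias using Russian roulette. The units, geometry and threshold below are assumptions for illustration, not the optimization method of the paper:

```python
import math
import random

SIGMA_T = 1.0                  # assumed total macroscopic cross section
DETECTOR = (10.0, 0.0, 0.0)    # assumed point-detector position

def next_event_contribution(pos, weight):
    """Point-detector next-event estimate from one collision site:
    the uncollided flux weight * exp(-SIGMA_T * r) / (4 pi r^2)."""
    r = math.dist(pos, DETECTOR)
    return weight * math.exp(-SIGMA_T * r) / (4.0 * math.pi * r * r)

def rouletted_contribution(pos, weight, threshold=1e-6):
    """Skip insignificant far-away estimates without bias: a contribution
    c below the threshold survives with probability c/threshold and is
    boosted to the threshold value, so the expectation is unchanged."""
    c = next_event_contribution(pos, weight)
    if c >= threshold:
        return c
    return threshold if random.random() < c / threshold else 0.0
```

On average only the fraction c/threshold of the far-away estimates is actually scored, which is where the time saving comes from.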
Kersevan, Borut Paul; Richter-Wąs, Elzbieta
2013-03-01
The AcerMC Monte Carlo generator is dedicated to the generation of Standard Model background processes which were recognised as critical for searches at the LHC, and whose generation was previously either unavailable or not straightforward. The program itself provides a library of the massive matrix elements (coded by MADGRAPH) and native phase space modules for generation of a set of selected processes. The hard process event can be completed by initial- and final-state radiation, hadronisation and decays through the existing interfaces with either the PYTHIA, HERWIG or ARIADNE event generators and (optionally) TAUOLA and PHOTOS. Interfaces to all these packages are provided in the distribution version. The phase-space generation is based on the multi-channel self-optimising approach using the modified Kajantie-Byckling formalism for phase space construction, and further smoothing of the phase space was obtained by using a modified ac-VEGAS algorithm. An additional improvement in the recent versions is the inclusion of a consistent prescription for matching the matrix element calculations with parton showering for a select list of processes. Catalogue identifier: ADQQ_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADQQ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3853309 No. of bytes in distributed program, including test data, etc.: 68045728 Distribution format: tar.gz Programming language: FORTRAN 77 with popular extensions (g77, gfortran). Computer: All running Linux. Operating system: Linux. Classification: 11.2, 11.6. External routines: CERNLIB (http://cernlib.web.cern.ch/cernlib/), LHAPDF (http://lhapdf.hepforge.org/) Catalogue identifier of previous version: ADQQ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 149(2003)142 Does
Studying the information content of TMDs using Monte Carlo generators
Avakian, H. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Matevosyan, H. [The Univ. of Adelaide, Adelaide (Australia); Pasquini, B. [Univ. of Pavia, Pavia (Italy); Schweitzer, P. [Univ. of Connecticut, Storrs, CT (United States)
2015-02-05
Theoretical advances in studies of the nucleon structure have been spurred by recent measurements of spin and/or azimuthal asymmetries worldwide. One of the main challenges still remaining is the extraction of the parton distribution functions, generalized to describe transverse momentum and spatial distributions of partons from these observables with no or minimal model dependence. In this topical review we present the latest developments in the field with emphasis on requirements for Monte Carlo event generators, indispensable for studies of the complex 3D nucleon structure, and discuss examples of possible applications.
Event-by-event Monte Carlo simulation of radiation transport in vapor and liquid water
Papamichael, Georgios Ioannis
A Monte Carlo simulation is presented for radiation transport in water. This process is of utmost importance, with applications in oncology and cancer therapy, in protecting people and the environment, in waste management, in radiation chemistry and in some solid-state detectors. It is also a phenomenon of interest for microelectronics on satellites in orbit, which are subject to solar radiation, and for spacecraft design for deep-space missions receiving background radiation. The interaction of charged particles with the medium is primarily due to their electromagnetic field. Three types of interaction events are considered: elastic scattering, impact excitation and impact ionization. Secondary particles (electrons) can be generated by ionization. At each stage, along with the primary particle, we explicitly follow all secondary electrons (and subsequent generations). Theoretical, semi-empirical and experimental formulae with suitable corrections have been used in each case to model the cross sections governing the quantum mechanical process of interaction, thus determining stochastically the energy and direction of outgoing particles following an event. Monte Carlo sampling techniques have been applied to accurate probability distribution functions describing the primary particle track and all secondary particle-medium interactions. A simple account of the simulation code and a critical exposition of its underlying assumptions (often missing in the relevant literature) are also presented with reference to the model cross sections. Model predictions are in good agreement with existing computational data and experimental results. By relying heavily on a theoretical formulation, instead of merely fitting data, it is hoped that the model will be of value in a wider range of applications. Possible future directions that are the object of further research are pointed out.
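The stochastic event selection described above follows a standard pattern: sample an exponential free path from the total cross section, then choose the event type with probability proportional to its partial cross section. A minimal Python sketch with made-up toy cross sections (not the tabulated water cross sections of the thesis):

```python
import math
import random

# Assumed toy cross sections (arbitrary units) for the three event types.
CROSS_SECTIONS = {"elastic": 0.5, "excitation": 0.3, "ionization": 0.2}

def sample_free_path(sigma_total, rng):
    """Distance to the next interaction: exponential with mean 1/sigma_total."""
    return -math.log(1.0 - rng.random()) / sigma_total

def sample_event_type(rng):
    """Pick an interaction with probability proportional to its cross section."""
    total = sum(CROSS_SECTIONS.values())
    u = rng.random() * total
    for name, sigma in CROSS_SECTIONS.items():
        u -= sigma
        if u < 0.0:
            return name
    return name  # guard against floating-point round-off at u ~ total
```

A full transport loop would alternate these two draws, pushing any ionization secondaries onto a stack so that every generation of electrons is followed in the same way.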
Modern Particle Physics Event Generation with WHIZARD
Reuter, J; Nejad, B Chokoufe; Kilian, W; Ohl, T; Sekulla, M; Weiss, C
2014-01-01
We describe the multi-purpose Monte-Carlo event generator WHIZARD for the simulation of high-energy particle physics experiments. Besides the presentation of the general features of the program like SM physics, BSM physics, and QCD effects, special emphasis will be given to the support of the most accurate simulation of the collider environments at hadron colliders and especially at future linear lepton colliders. On the more technical side, the very recent code refactoring towards a completely object-oriented software package to improve maintainability, flexibility and code development will be discussed. Finally, we present ongoing work and future plans regarding higher-order corrections, more general model support including the setup to search for new physics in vector boson scattering at the LHC, as well as several lines of performance improvements.
Modern particle physics event generation with WHIZARD
Reuter, J.; Bach, F.; Chokoufe, B. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Kilian, W.; Sekulla, M. [Siegen Univ. (Germany). Dept. of Physics; Ohl, T. [Wuerzburg Univ. (Germany). Dept. of Physics and Astronomy; Weiss, C. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Siegen Univ. (Germany). Dept. of Physics
2014-10-16
We describe the multi-purpose Monte-Carlo event generator WHIZARD for the simulation of high-energy particle physics experiments. Besides the presentation of the general features of the program like SM physics, BSM physics, and QCD effects, special emphasis is given to the support of the most accurate simulation of the collider environments at hadron colliders and especially at future linear lepton colliders. On the more technical side, the very recent code refactoring towards a completely object-oriented software package to improve maintainability, flexibility and code development are discussed. Finally, we present ongoing work and future plans regarding higher-order corrections, more general model support including the setup to search for new physics in vector boson scattering at the LHC, as well as several lines of performance improvements.
Event generation with SHERPA 1.1
Gleisberg, T.; Hoche, Stefan; Krauss, F.; Schoenherr, M.; Schumann, S.; Siegert, F.; Winter, J.
2008-12-18
In this paper the current release of the Monte Carlo event generator Sherpa, version 1.1, is presented. Sherpa is a general-purpose tool for the simulation of particle collisions at high-energy colliders. It contains a very flexible tree-level matrix-element generator for the calculation of hard scattering processes within the Standard Model and various new physics models. The emission of additional QCD partons off the initial and final states is described through a parton-shower model. To consistently combine multi-parton matrix elements with the QCD parton cascades the approach of Catani, Krauss, Kuhn and Webber is employed. A simple model of multiple interactions is used to account for underlying events in hadron-hadron collisions. The fragmentation of partons into primary hadrons is described using a phenomenological cluster-hadronization model. A comprehensive library for simulating tau-lepton and hadron decays is provided. Where available form-factor models and matrix elements are used, allowing for the inclusion of spin correlations; effects of virtual and real QED corrections are included using the approach of Yennie, Frautschi and Suura.
Generating target probability sequences and events
Ella, Vaignana Spoorthy
2013-01-01
Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.
Study on random number generator in Monte Carlo code
The Monte Carlo code uses a sequence of pseudo-random numbers produced by a random number generator (RNG) to simulate particle histories. A pseudo-random number sequence has a period that depends on its generation method, and this period should be long enough not to be exhausted during one Monte Carlo calculation, in order to ensure the correctness of the results, especially of their standard deviations. The linear congruential generator (LCG) is widely used as a Monte Carlo RNG, but its period is not very long considering the growing number of simulation histories in a Monte Carlo calculation that follows from the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of LCG. In this study, we investigate the appropriate RNG in a Monte Carlo code as an alternative to LCG, especially for the case of enormous histories. It is found that xorshift has desirable features compared with LCG: xorshift has a larger period, a comparable speed of random number generation, better randomness, and good applicability to parallel calculation. (author)
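For concreteness, the two generator families compared above can be sketched in a few lines each. The LCG constants below are the common Numerical Recipes choice, and the xorshift variant is Marsaglia's 32-bit (13, 17, 5) generator; both are shown only as illustrations, since production studies typically use xorshift variants with larger state (e.g. 128 bits) for much longer periods:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator x -> (a*x + c) mod m;
    the period here is at most m = 2^32."""
    x = seed % m
    while True:
        x = (a * x + c) % m
        yield x

def xorshift32(seed):
    """Marsaglia's 32-bit xorshift: three shift-XOR steps per output,
    period 2^32 - 1 over the nonzero states."""
    x = seed & 0xFFFFFFFF
    assert x != 0, "xorshift must be seeded with a nonzero state"
    while True:
        x ^= (x << 13) & 0xFFFFFFFF
        x ^= x >> 17
        x ^= (x << 5) & 0xFFFFFFFF
        yield x
```

Both generators cost only a handful of integer operations per draw, which is why their speeds are comparable; the difference lies in period and statistical quality.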
NiMax system for hadronic event generators in HEP
We have suggested a new approach to the development and use of Monte Carlo event generators in high-energy physics (HEP). It is a component approach, when a complex numerical model is composed of standard components. Our approach opens a way to organize a library of HEP model components and provides a great flexibility for the construction of very powerful and realistic numerical models. To support this approach we have designed the NiMax software system (framework) written in C++
Stochastic generation of hourly rainstorm events in Johor
Nojumuddin, Nur Syereena; Yusof, Fadhilah [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusop, Zulkifli [Institute of Environmental and Water Resources Management, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia)
2015-02-03
Engineers and researchers in water-related studies often face the problem of insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. Therefore, this paper presents a Monte Carlo based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal appeared to be the best-fitting rainfall distribution, so the Monte Carlo simulation was based on the lognormal distribution. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of observed rainstorm events over 10 years and simulated rainstorm events over 30 years of rainfall records with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972-2011. The absolute percentage errors of the duration-depth, duration-inter-event-time and depth-inter-event-time relations were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relations in Johor.
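The lognormal fit-and-simulate step described above amounts to taking the mean and standard deviation of the log-transformed storm characteristics (the MLE for a lognormal) and then exponentiating Gaussian draws. A minimal Python sketch; the function names and parameter values are illustrative, not taken from the paper:

```python
import math
import random

def fit_lognormal(depths):
    """MLE for a lognormal: the mean and standard deviation of the
    log-transformed data (plain MLE, i.e. the 1/n variance estimator)."""
    logs = [math.log(x) for x in depths]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

def simulate_storm_depths(mu, sigma, n, seed=0):
    """Monte Carlo generation of synthetic storm depths: exponentiate
    Gaussian draws with the fitted log-scale parameters."""
    rng = random.Random(seed)
    return [math.exp(rng.gauss(mu, sigma)) for _ in range(n)]
```

The same recipe applies to each storm characteristic (depth, duration, inter-event time) separately; a goodness-of-fit test such as Anderson-Darling then checks the fitted distribution before simulation.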
Modelling hadronic interactions in cosmic ray Monte Carlo generators
Pierog Tanguy
2015-01-01
Currently the uncertainty in the prediction of shower observables for different primary particles and energies is dominated by differences between hadronic interaction models. The LHC data on minimum bias measurements can be used to test Monte Carlo generators, and these new constraints will help to reduce the uncertainties in air shower predictions. In this article, after a short introduction on air showers and Monte Carlo generators, we show the results of the comparison between the updated versions of the high energy hadronic interaction models EPOS LHC and QGSJETII-04 and LHC data. Results for air shower simulations and their consequences for comparisons with air shower data are discussed.
Monte Carlo generators vs nuclear model
The India-based Neutrino Observatory (INO) is planning to study atmospheric neutrino interactions using iron calorimeters. The iron plates will be magnetized to determine the charge of the muon produced by neutrino interactions inside the detector, so as to look at νμ and ν̄μ events separately. The plan is to measure precisely some of the neutrino oscillation parameters of the neutrino mixing matrix. Neutrino detection proceeds basically through various channels of interaction with hadronic targets, like quasielastic scattering, meson production, resonance excitations, etc.
MadEvent: automatic event generation with MadGraph
We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)
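The multi-channel integration idea can be illustrated on a one-dimensional toy: sample from a mixture of channel densities, each adapted to one enhancement of the integrand, and weight every point by f(x)/g(x), where g is the mixture. The integrand and channels below are invented for illustration; in MadEvent the channels come from Feynman-diagram-based phase-space mappings:

```python
import random

def f(x):
    """Toy integrand with enhancements at both ends of [0, 1];
    its exact integral is 1/3 + 1/9 = 4/9."""
    return x ** 2 + (1.0 - x) ** 8

def sample_channel(i, rng):
    """Inverse-CDF sampling of the channel densities
    g1(x) = 3 x^2 and g2(x) = 9 (1-x)^8 on [0, 1]."""
    u = rng.random()
    return u ** (1.0 / 3.0) if i == 0 else 1.0 - u ** (1.0 / 9.0)

def g(x, alpha):
    """Mixture density of the two channels with weights alpha."""
    return alpha[0] * 3.0 * x ** 2 + alpha[1] * 9.0 * (1.0 - x) ** 8

def multichannel_estimate(n, alpha=(0.5, 0.5), seed=0):
    """Multi-channel Monte Carlo: pick a channel by its weight, sample
    it, and score f(x)/g(x); the mixture flattens the integrand, so the
    weights have low variance."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        i = 0 if rng.random() < alpha[0] else 1
        x = sample_channel(i, rng)
        total += f(x) / g(x, alpha)
    return total / n
```

Self-optimizing versions additionally adjust the channel weights alpha between iterations to minimize the variance of the f/g weights.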
A united event grand canonical Monte Carlo study of partially doped polyaniline
Byshkin, M. S., E-mail: mbyshkin@unisa.it, E-mail: gmilano@unisa.it; Correa, A. [Modeling Lab for Nanostructure and Catalysis, Dipartimento di Chimica e Biologia and NANOMATES, University of Salerno, 84084, via Ponte don Melillo, Fisciano Salerno (Italy); Buonocore, F. [ENEA Casaccia Research Center, Via Anguillarese 301, 00123 Rome (Italy); Di Matteo, A. [STMicroelectronics, Via Remo de Feo, 1 80022 Arzano, Naples (Italy); IMAST Scarl Piazza Bovio 22, 80133 Naples (Italy); Milano, G., E-mail: mbyshkin@unisa.it, E-mail: gmilano@unisa.it [Modeling Lab for Nanostructure and Catalysis, Dipartimento di Chimica e Biologia and NANOMATES, University of Salerno, 84084, via Ponte don Melillo, Fisciano Salerno (Italy); IMAST Scarl Piazza Bovio 22, 80133 Naples (Italy)
2013-12-28
A Grand Canonical Monte Carlo scheme, based on united events combining protonation/deprotonation and insertion/deletion of HCl molecules is proposed for the generation of polyaniline structures at intermediate doping levels between 0% (PANI EB) and 100% (PANI ES). A procedure based on this scheme and subsequent structure relaxations using molecular dynamics is described and validated. Using the proposed scheme and the corresponding procedure, atomistic models of amorphous PANI-HCl structures were generated and studied at different doping levels. Density, structure factors, and solubility parameters were calculated. Their values agree well with available experimental data. The interactions of HCl with PANI have been studied and distribution of their energies has been analyzed. The procedure has also been extended to the generation of PANI models including adsorbed water and the effect of inclusion of water molecules on PANI properties has also been modeled and discussed. The protocol described here is general and the proposed United Event Grand Canonical Monte Carlo scheme can be easily extended to similar polymeric materials used in gas sensing and to other systems involving adsorption and chemical reactions steps.
NLO event generation for chargino production at the ILC
We present a Monte-Carlo event generator for simulating chargino pair-production at the International Linear Collider (ILC) at next-to-leading order in the electroweak couplings. By properly resumming photons in the soft and collinear regions, we avoid negative event weights, so the program can simulate physical (unweighted) event samples. Photons are explicitly generated throughout the range where they can be experimentally resolved. Inspecting the dependence on the cutoffs separating the soft and collinear regions, we evaluate the systematic errors due to soft and collinear approximations. In the resummation approach, the residual uncertainty can be brought down to the per-mil level, coinciding with the expected statistical uncertainty at the ILC. (Orig.)
EPEWAX - a Monte Carlo generator for W production in electron proton scattering
Theuer, E. (RWTH Aachen, 1. Physikalisches Institut (Germany))
1992-04-01
EPEWAX is a Monte Carlo event generator intended to simulate the production of free single W bosons at an eP collider according to the process e⁻P → e⁻W(→f₁f₂) X. It offers the possibility to completely simulate W production down to the particle level. Because the WW-photon couplings are not fixed, it allows the study of non-Standard-Model W production. (orig.)
General-purpose event generators for LHC physics
Buckley, Andy; /Edinburgh U.; Butterworth, Jonathan; /University Coll. London; Gieseke, Stefan; /Karlsruhe U., ITP; Grellscheid, David; /Durham U., IPPP; Hoche, Stefan; /SLAC; Hoeth, Hendrik; Krauss, Frank; /Durham U., IPPP; Lonnblad, Leif; /Lund U., Dept. Theor. Phys. /CERN; Nurse, Emily; /University Coll. London; Richardson, Peter; /Durham U., IPPP; Schumann, Steffen; /Heidelberg U.; Seymour, Michael H.; /Manchester U.; Sjostrand, Torbjorn; /Lund U., Dept. Theor. Phys.; Skands, Peter; /CERN; Webber, Bryan; /Cambridge U.
2011-03-03
We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.
Status and developments of event generators
Sjöstrand, Torbjörn
2016-01-01
Event generators play a crucial role in the exploration of LHC physics. This presentation summarizes news and plans for the three general-purpose pp generators HERWIG, PYTHIA and SHERPA, as well as briefer notes on a few other generators. Common themes, such as the matching and merging between matrix elements and parton showers, are highlighted. Other topics include a historical introduction, from the Lund perspective, and comments on the role of MCnet.
A Bifurcation Monte Carlo Scheme for Rare Event Simulation
Liu, Hongliang
2016-01-01
The bifurcation method is a way to do rare event sampling -- to estimate the probability of events that are too rare to be found by direct simulation. We describe the bifurcation method and use it to estimate the transition rate of a double well potential problem. We show that the associated constrained path sampling problem can be addressed by a combination of Crooks-Chandler sampling and parallel tempering and marginalization.
Johannesson, G; Dyer, K; Hanley, W; Kosovic, B; Larsen, S; Loosmore, G; Lundquist, J; Mirin, A
2006-07-17
The release of hazardous materials into the atmosphere can have a tremendous impact on dense populations. We propose an atmospheric event reconstruction framework that couples observed data and predictive computer-intensive dispersion models via Bayesian methodology. Due to the complexity of the model framework, a sampling-based approach is taken for posterior inference that combines Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) strategies.
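The MCMC part of such a Bayesian reconstruction can be illustrated with a toy inverse problem: infer a source strength from noisy sensor readings through a linear forward model using a random-walk Metropolis sampler. All names and numbers below are hypothetical stand-ins for the actual dispersion model and sampling strategies of the paper:

```python
import math
import random

# Hypothetical toy setup: each sensor reads q * coupling + Gaussian noise.
COUPLINGS = [0.8, 0.5, 0.3, 0.1]
NOISE_SD = 0.2
TRUE_Q = 2.0

rng = random.Random(3)
OBS = [TRUE_Q * k + rng.gauss(0.0, NOISE_SD) for k in COUPLINGS]

def log_posterior(q):
    """Flat prior on q > 0 times a Gaussian likelihood of the readings."""
    if q <= 0.0:
        return -math.inf
    return -sum((o - q * k) ** 2
                for o, k in zip(OBS, COUPLINGS)) / (2.0 * NOISE_SD ** 2)

def metropolis(n, q0=1.0, step=0.3):
    """Random-walk Metropolis: propose q' = q + N(0, step), accept with
    probability min(1, posterior ratio)."""
    chain, q, lp = [], q0, log_posterior(q0)
    for _ in range(n):
        prop = q + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop)
        d = lp_prop - lp
        if d >= 0.0 or rng.random() < math.exp(d):
            q, lp = prop, lp_prop
        chain.append(q)
    return chain

chain = metropolis(20000)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

In the full framework the cheap linear forward model is replaced by computation-intensive dispersion simulations, which is what motivates combining MCMC with sequential Monte Carlo strategies.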
Childers, J T; LeCompte, T J; Papka, M E; Benjamin, D P
2015-01-01
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, Monte Carlo event generation was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
Optical monitoring of rheumatoid arthritis: Monte Carlo generated reconstruction kernels
Minet, O.; Beuthan, J.; Hielscher, A. H.; Zabarylo, U.
2008-06-01
Optical imaging in biomedicine is governed by light absorption and scattering on microscopic and macroscopic constituents of the medium. Therefore, the light scattering characteristics of human tissue correlate with the stage of some diseases. In the near infrared range, scattering, with a coefficient approximately two orders of magnitude greater than that of absorption, plays the dominant role. When measuring the optical parameters, variations were discovered that correlate with rheumatoid arthritis of a small joint. The potential of an experimental setup for transilluminating the finger joint with a laser diode and the pattern of stray light detection are demonstrated. The scattering caused by skin contains no useful information, and it can be removed by a deconvolution technique to enhance the diagnostic value of this non-invasive optical method. Monte Carlo simulations ensure both the construction of the corresponding point spread function and the theoretical verification of the stray light picture in rather complex geometry.
e+e- event generator EPOCS user's manual
EPOCS (Electron POsitron Collision Simulator) is a Monte Carlo event generator for high energy e+e- annihilation. This program generates events based on the standard model, i.e., quantum chromodynamics (QCD) and the electro-weak theory. It works at centre-of-mass energies below the W+W- production threshold, i.e., in the energy region of TRISTAN, SLC and LEP. For these high energy machines one of the important subjects is the search for the top quark. The production and hadronization of the top quark are included in EPOCS. Besides the top quark, we expect 'new' physics in this high energy region. EPOCS has enough flexibility for users to cope with a new idea. Users can register a new particle, modify the built-in particle data, define new primary interactions and so on. The event generator has a number of parameters, both physical parameters and control parameters. Users can control most of these parameters in EPOCS at will. (author)
CMS Underlying Event and Double Parton Scattering Monte Carlo Tunes
Sunar Cerci, Deniz
2016-01-01
Three new PYTHIA 8 underlying event (UE) tunes are presented, one using the CTEQ6L1 parton distribution function (PDF), one using HERAPDF1.5LO, and one using the NNPDF2.3LO PDF; two new PYTHIA 6 UE tunes, one for the CTEQ6L1 PDF and one for the HERAPDF1.5LO, and one new HERWIG++ UE tune for the CTEQ6L1 PDF are also available. Simultaneous fits to CDF UE data at center-of-mass energies of 0.3, 0.9, and 1.96 TeV, together with CMS UE data at 7 TeV, check the UE models and constrain their parameters, providing thereby more precise predictions for proton-proton collisions at 13 TeV. In addition, several new double parton scattering (DPS) tunes are examined in order to investigate if the values of the parameters from fits to UE observables are consistent with the values determined from fitting DPS-sensitive observables. It is also examined how well the new UE tunes predict minimum bias (MB) events, jet, and Drell-Yan observables, as well as MB and UE observables at 13 TeV.
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
Andreopoulos, Costas [Univ. of Liverpool (United Kingdom). Dept. of Physics; Science and Technology Facilities Council (STFC), Oxford (United Kingdom). Rutherford Appleton Lab. (RAL). Particle Physics Dept.; Barry, Christopher [Univ. of Liverpool (United Kingdom). Dept. of Physics; Dytman, Steve [Univ. of Pittsburgh, PA (United States). Dept. of Physics and Astronomy; Gallagher, Hugh [Tufts Univ., Medford, MA (United States). Dept. of Physics and Astronomy; Golan, Tomasz [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Univ. of Rochester, NY (United States). Dept. of Physics and Astronomy; Hatcher, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Perdue, Gabriel [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Yarba, Julia [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)
2015-10-20
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: It presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
Andreopoulos, Costas; Dytman, Steve; Gallagher, Hugh; Golan, Tomasz; Hatcher, Robert; Perdue, Gabriel; Yarba, Julia
2015-01-01
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: It presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
Comparisons of neutrino event generators from an oscillation-experiment perspective
Mayer, Nathan [nathan.mayer@tufts.edu, Fermilab P.O. Box 500 M.S. 220, Batavia IL 60510 (United States)
2015-05-15
Monte Carlo generators are crucial to the analysis of high energy physics data, ideally giving a baseline comparison between state-of-the-art theoretical models and experimental data. Presented here is a comparison of three final state distributions from the GENIE, NEUT, NUANCE, and NuWro neutrino Monte Carlo event generators. The final state distributions chosen for comparison are: the electromagnetic energy fraction in neutral current interactions, the energy of the leading π{sup 0} vs. the scattering angle for neutral current interactions, and the muon energy vs. scattering angle of ν{sub µ} charged current interactions.
Cluster-Event Biasing in Monte Carlo Applications to Systems Reliability
Estimation of the probabilities of rare events with significant consequences, e.g., disasters, is one of the most difficult problems in Monte Carlo applications to systems engineering and reliability. The Bernoulli-type estimator used in analog Monte Carlo is characterized by extremely high variance when applied to the estimation of rare events. Variance reduction methods are, therefore, of importance in this field. The present work suggests a parametric nonanalog probability measure based on the superposition of transition biasing and forced events biasing. The cluster-event model is developed providing an effective and reliable approximation for the second moment and the benefit along with a methodology of selecting near-optimal biasing parameters. Numerical examples show a considerable benefit when the method is applied to problems of particular difficulty for the analog Monte Carlo method. The suggested model is applicable for reliability assessment of stochastic networks of complicated topology and high redundancy with component-level repair (i.e., repair applied to an individual failed component while the system is operational).
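The variance pathology described above can be made concrete: for the analog (Bernoulli) estimator of a rare-event probability p from n independent trials, Var = p(1-p)/n, so the relative error grows like 1/sqrt(p*n). A minimal sketch; the function name and the numbers are illustrative, not taken from the paper:

```python
import math

def analog_relative_error(p, n):
    """Relative standard error of the analog (Bernoulli) estimator of a
    rare-event probability p from n independent trials:
    Var(p_hat) = p*(1-p)/n  =>  relative error = sqrt((1-p)/(p*n))."""
    return math.sqrt((1.0 - p) / (p * n))

# A disaster-scale probability of 1e-6 still carries about 3% relative
# error after a billion analog trials, motivating variance reduction:
err = analog_relative_error(1e-6, 10**9)  # ≈ 0.032
```

This quantifies why nonanalog (biased) measures such as the cluster-event scheme are needed well before p reaches disaster scales.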
Monte-Carlo event generation for the ep collider
In the present note, an attempt has been made to construct two models which might be expected to correspond to possible extremes of the configuration of the experimental data. The reality of the ep collider might be expected to lie between these two models. (orig.)
On the Self-Consistent Event Biasing Schemes for Monte Carlo Simulations of Nanoscale MOSFETs
Islam, Sharnali; Ahmed, Shaikh
2009-01-01
Different techniques of event biasing have been implemented in the particle-based Monte Carlo simulations of a 15 nm n-channel MOSFET. The primary goal is to achieve enhancement in the channel statistics and faster convergence in the calculation of terminal current. Enhancement algorithms are especially useful when the device behavior is governed by rare events in the carrier transport process. After presenting a brief overview of the Monte Carlo technique for solving the Boltzmann transport equation, the basic steps of deriving the approach in the presence of both the initial and the boundary conditions are discussed. In the derivation, the linearity of the transport problem is utilized first, where Coulomb forces between the carriers are initially neglected. The generalization of the approach for Hartree carriers is established in the iterative procedure of coupling with the Poisson equation. It is shown that the weight of the particles, as obtained by biasing of the Boltzmann equation, survive...
QCDINS 2.0 - A Monte Carlo generator for instanton-induced processes in deep-inelastic scattering
Ringwald, Andreas
2000-01-01
We describe a Monte Carlo event generator for the simulation of QCD-instanton induced processes in deep-inelastic scattering (HERA). The QCDINS package is designed as an ``add-on'' hard process generator interfaced to the general hadronic event simulation package HERWIG. It incorporates the theoretically predicted production rate for instanton-induced events as well as the essential characteristics that have been derived theoretically for the partonic final state of instanton-induced processes: notably, the flavour democratic and isotropic production of the partonic final state, energy weight factors different for gluons and quarks, and a high average multiplicity O(10) of produced partons with a Poisson distribution of the gluon multiplicity. While the subsequent perturbative evolution of the generated partons is always handled by the HERWIG package, the final hadronization step may optionally be performed also by means of the general hadronic event simulation package JETSET.
QCDINS 2.0 - A Monte Carlo generator for instanton-induced processes in deep-inelastic scattering
Ringwald, A.; Schrempp, F.
2000-11-01
We describe a Monte Carlo event generator for the simulation of QCD-instanton induced processes in deep-inelastic scattering (HERA). The QCDINS package is designed as an "add-on" hard process generator interfaced to the general hadronic event simulation package HERWIG. It incorporates the theoretically predicted production rate for instanton-induced events as well as the essential characteristics that have been derived theoretically for the partonic final state of instanton-induced processes: notably, the flavor democratic and isotropic production of the partonic final state, energy weight factors different for gluons and quarks, and a high average multiplicity O(10) of produced partons with a Poisson distribution of the gluon multiplicity. While the subsequent perturbative evolution of the generated partons is always handled by the HERWIG package, the final hadronization step may optionally be performed also by means of the general hadronic event simulation package JETSET.
Event generator tunes obtained from underlying event and multiparton scattering measurements
Khachatryan, Vardan; Tumasyan, Armen; Adam, Wolfgang; Aşılar, Ece; Bergauer, Thomas; Brandstetter, Johannes; Brondolin, Erica; Dragicevic, Marko; Erö, Janos; Flechl, Martin; Friedl, Markus; Fruehwirth, Rudolf; Ghete, Vasile Mihai; Hartl, Christian; Hörmann, Natascha; Hrubec, Josef; Jeitler, Manfred; Knünz, Valentin; König, Axel; Krammer, Manfred; Krätschmer, Ilse; Liko, Dietrich; Matsushita, Takashi; Mikulec, Ivan; Rabady, Dinyar; Rahbaran, Babak; Rohringer, Herbert; Schieck, Jochen; Schöfbeck, Robert; Strauss, Josef; Treberer-Treberspurg, Wolfgang; Waltenberger, Wolfgang; Wulz, Claudia-Elisabeth; Mossolov, Vladimir; Shumeiko, Nikolai; Suarez Gonzalez, Juan; Alderweireldt, Sara; Cornelis, Tom; De Wolf, Eddi A; Janssen, Xavier; Knutsson, Albert; Lauwers, Jasper; Luyckx, Sten; Van De Klundert, Merijn; Van Haevermaet, Hans; Van Mechelen, Pierre; Van Remortel, Nick; Van Spilbeeck, Alex; Abu Zeid, Shimaa; Blekman, Freya; D'Hondt, Jorgen; Daci, Nadir; De Bruyn, Isabelle; Deroover, Kevin; Heracleous, Natalie; Keaveney, James; Lowette, Steven; Moreels, Lieselotte; Olbrechts, Annik; Python, Quentin; Strom, Derek; Tavernier, Stefaan; Van Doninck, Walter; Van Mulders, Petra; Van Onsem, Gerrit Patrick; Van Parijs, Isis; Barria, Patrizia; Brun, Hugues; Caillol, Cécile; Clerbaux, Barbara; De Lentdecker, Gilles; Fasanella, Giuseppe; Favart, Laurent; Grebenyuk, Anastasia; Karapostoli, Georgia; Lenzi, Thomas; Léonard, Alexandre; Maerschalk, Thierry; Marinov, Andrey; Perniè, Luca; Randle-conde, Aidan; Seva, Tomislav; Vander Velde, Catherine; Vanlaer, Pascal; Yonamine, Ryo; Zenoni, Florian; Zhang, Fengwangdong; Beernaert, Kelly; Benucci, Leonardo; Cimmino, Anna; Crucy, Shannon; Dobur, Didar; Fagot, Alexis; Garcia, Guillaume; Gul, Muhammad; Mccartin, Joseph; Ocampo Rios, Alberto Andres; Poyraz, Deniz; Ryckbosch, Dirk; Salva Diblen, Sinem; Sigamani, Michael; Tytgat, Michael; Van Driessche, Ward; Yazgan, Efe; Zaganidis, Nicolas; Basegmez, Suzan; Beluffi, Camille; Bondu, Olivier; 
Brochet, Sébastien; Bruno, Giacomo; Caudron, Adrien; Ceard, Ludivine; Da Silveira, Gustavo Gil; Delaere, Christophe; Favart, Denis; Forthomme, Laurent; Giammanco, Andrea; Hollar, Jonathan; Jafari, Abideh; Jez, Pavel; Komm, Matthias; Lemaitre, Vincent; Mertens, Alexandre; Musich, Marco; Nuttens, Claude; Perrini, Lucia; Pin, Arnaud; Piotrzkowski, Krzysztof; Popov, Andrey; Quertenmont, Loic; Selvaggi, Michele; Vidal Marono, Miguel; Beliy, Nikita; Hammad, Gregory Habib; Aldá Júnior, Walter Luiz; Alves, Fábio Lúcio; Alves, Gilvan; Brito, Lucas; Correa Martins Junior, Marcos; Hamer, Matthias; Hensel, Carsten; Moraes, Arthur; Pol, Maria Elena; Rebello Teles, Patricia; Belchior Batista Das Chagas, Ewerton; Carvalho, Wagner; Chinellato, Jose; Custódio, Analu; Da Costa, Eliza Melo; De Jesus Damiao, Dilson; De Oliveira Martins, Carley; Fonseca De Souza, Sandro; Huertas Guativa, Lina Milena; Malbouisson, Helena; Matos Figueiredo, Diego; Mora Herrera, Clemencia; Mundim, Luiz; Nogima, Helio; Prado Da Silva, Wanda Lucia; Santoro, Alberto; Sznajder, Andre; Tonelli Manganote, Edmilson José; Vilela Pereira, Antonio; Ahuja, Sudha; Bernardes, Cesar Augusto; De Souza Santos, Angelo; Dogra, Sunil; Tomei, Thiago; De Moraes Gregores, Eduardo; Mercadante, Pedro G; Moon, Chang-Seong; Novaes, Sergio F; Padula, Sandra; Romero Abad, David; Ruiz Vargas, José Cupertino; Aleksandrov, Aleksandar; Hadjiiska, Roumyana; Iaydjiev, Plamen; Rodozov, Mircho; Stoykova, Stefka; Sultanov, Georgi; Vutova, Mariana; Dimitrov, Anton; Glushkov, Ivan; Litov, Leander; Pavlov, Borislav; Petkov, Peicho; Ahmad, Muhammad; Bian, Jian-Guo; Chen, Guo-Ming; Chen, He-Sheng; Chen, Mingshui; Cheng, Tongguang; Du, Ran; Jiang, Chun-Hua; Plestina, Roko; Romeo, Francesco; Shaheen, Sarmad Masood; Spiezia, Aniello; Tao, Junquan; Wang, Chunjie; Wang, Zheng; Zhang, Huaqiao; Asawatangtrakuldee, Chayanit; Ban, Yong; Li, Qiang; Liu, Shuai; Mao, Yajun; Qian, Si-Jin; Wang, Dayong; Xu, Zijun; Avila, Carlos; Cabrera, Andrés; Chaparro 
Sierra, Luisa Fernanda; Florez, Carlos; Gomez, Juan Pablo; Gomez Moreno, Bernardo; Sanabria, Juan Carlos; Godinovic, Nikola; Lelas, Damir; Puljak, Ivica; Ribeiro Cipriano, Pedro M; Antunovic, Zeljko; Kovac, Marko; Brigljevic, Vuko; Kadija, Kreso; Luetic, Jelena; Micanovic, Sasa; Sudic, Lucija; Attikis, Alexandros; Mavromanolakis, Georgios; Mousa, Jehad; Nicolaou, Charalambos; Ptochos, Fotios; Razis, Panos A; Rykaczewski, Hans; Bodlak, Martin; Finger, Miroslav; Finger Jr, Michael; Abdelalim, Ahmed Ali; Awad, Adel; Mahrous, Ayman; Mohammed, Yasser; Radi, Amr; Calpas, Betty
2016-01-01
New sets of parameters (``tunes'') for the underlying-event (UE) modeling of the PYTHIA8, PYTHIA6 and HERWIG++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE data at $\\sqrt{s} =$ 7 TeV and to UE data from the CDF experiment at lower $\\sqrt{s}$, are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton-proton collisions at 13 TeV. In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons of the UE tunes to ``minimum bias'' (MB) events, multijet, and Drell--Yan ($ \\mathrm{ q \\bar{q} } \\rightarrow \\mathrm{Z} / \\gamma^* \\rightarrow$ lepton-antilepton + jets) observables at 7 and 8 TeV are presented, as well as predictions of MB and UE observables at 13 TeV.
Event generator tunes obtained from underlying event and multiparton scattering measurements
Khachatryan, Vardan; et al.
2016-03-17
New sets of parameters (“tunes”) for the underlying-event (UE) modelling of the pythia8, pythia6 and herwig++ Monte Carlo event generators are constructed using different parton distribution functions. Combined fits to CMS UE proton–proton ( $\\mathrm {p}\\mathrm {p}$ ) data at $\\sqrt{s} = 7\\,\\text {TeV} $ and to UE proton–antiproton ( $\\mathrm {p}\\overline{\\mathrm{p}} $ ) data from the CDF experiment at lower $\\sqrt{s}$ , are used to study the UE models and constrain their parameters, providing thereby improved predictions for proton–proton collisions at 13 $\\,\\text {TeV}$ . In addition, it is investigated whether the values of the parameters obtained from fits to UE observables are consistent with the values determined from fitting observables sensitive to double-parton scattering processes. Finally, comparisons are presented of the UE tunes to “minimum bias” (MB) events, multijet, and Drell–Yan ( $ \\mathrm{q} \\overline{\\mathrm{q}} \\rightarrow \\mathrm{Z}/ \\gamma ^* \\rightarrow $ lepton-antilepton+jets) observables at 7 and 8 $\\,\\text {TeV}$ , as well as predictions for MB and UE observables at 13 $\\,\\text {TeV}$ .
An improved process event log artificial negative event generator.
vanden Broucke, Seppe; De Weerdt, Jochen; Vanthienen, Jan; Baesens, Bart
2012-01-01
Process mining is the research area that is concerned with knowledge discovery from event logs and is often situated at the intersection of the fields of data mining and business process management. Although the term entails a collection of a-posteriori analysis methods for extracting knowledge from event logs, most of the attention in the process mining literature has been given to process discovery techniques, focusing specifically on the extraction of control-flow models from event logs. P...
Random numbers play an important role in any Monte Carlo simulation. The accuracy of the results depends on the quality of the sequence of random numbers employed in the simulation. These qualities include randomness of the random numbers, uniformity of their distribution, absence of correlation and a long period. In a typical Monte Carlo simulation of particle transport in a nuclear reactor core, the history of a particle from its birth in a fission event until its death by an absorption or leakage event is tracked. The geometry of the core and the surrounding materials are exactly modeled in the simulation. To track a neutron history one needs random numbers for determining the inter-collision distance, the nature of the collision, the direction of the scattered neutron, etc. Neutrons are tracked in batches. In one batch approximately 2000-5000 neutrons are tracked. The statistical accuracy of the results of the simulation depends on the total number of particles (number of particles in one batch multiplied by the number of batches) tracked. The number of histories to be generated is usually large for a typical radiation transport problem. To track a very large number of histories one needs to generate a long sequence of independent random numbers. In other words, the cycle length of the random number generator (RNG) should be more than the total number of random numbers required for simulating the given transport problem. The number of bits of the machine generally limits the cycle length. For a binary machine of p bits the maximum cycle length is 2^p. To achieve a higher cycle length on the same machine one has to use either register arithmetic or a bit manipulation technique.
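The cycle-length limit discussed above (at most 2^p states on a p-bit machine) can be illustrated with a toy linear congruential generator. This is a generic sketch, not the RNG of the reactor code; the parameters below satisfy the Hull-Dobell conditions, so the generator attains its maximal period m:

```python
def lcg(seed, a, c, m):
    """Toy linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    With m = 2**p, the state fits in p bits and the period cannot exceed 2**p."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def period(seed, a, c, m):
    """Steps until the sequence returns to its first value (the cycle length
    for a full-period generator); it is bounded above by m."""
    gen = lcg(seed, a, c, m)
    first = next(gen)
    count = 1
    for x in gen:
        if x == first:
            return count
        count += 1
```

For a = 5, c = 1 and m = 2^p the period is exactly 2^p for any seed; a transport problem needing more random numbers than that would see the sequence repeat, which is why long-period techniques matter.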
Tomášik, Boris
2009-09-01
within the statistical approach. Momentum spectra integrated over many events can be interpreted as produced from an expanding and locally thermalised fireball. The present Monte Carlo model unifies these approaches: the fireball decays into fragments of some characteristic size. The fragments recede from each other as given by the pre-existing expansion of the fireball. They subsequently emit stable and unstable hadrons with momenta generated according to a thermal distribution. Resonances then decay and their daughters acquire momenta as dictated by decay kinematics. Solution method: The Monte Carlo generator repeats a loop in which it generates individual events. First, the sizes of fragments are generated. Then the fragments are placed within the decaying fireball and their velocities are determined from the one-to-one correspondence between position and expansion velocity in the blast wave model. Since hadrons may be emitted from fragments as well as from the remaining bulk fireball, first those from the bulk are generated according to the blast wave model. Then, hadron production from the fragments is treated. Each hadron is generated in the rest frame of the fragment and then boosted to the global frame. Finally, after all directly produced hadrons are generated, resonance decay channels are chosen and the momenta and positions of final state hadrons are determined. Running time: Generation of 100 events can take anything from 2 hours to a couple of days. This depends mainly on the size and density of fragments. Simulations with small fragments may be very slow. At the beginning of a run there is a period of up to 1 hour in which the program calculates thermal weights due to the statistical model. This period is long if many species are included in the simulation.
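The event loop in the solution method can be sketched schematically. Every physics choice below is a simplified stand-in (exponential "thermal" momenta, a uniform fragment velocity, a single pion species, one spatial dimension), not the model's actual blast-wave distributions:

```python
import math
import random

def generate_event(n_fragments=3, temperature=0.12, rng=random):
    """Schematic version of the event loop: 1) draw fragment sizes,
    2) assign each fragment a collective velocity, 3) emit hadrons
    thermally in the fragment rest frame and boost them to the global
    frame. All distributions are illustrative placeholders."""
    hadrons = []
    for _ in range(n_fragments):
        size = rng.randint(1, 5)            # 1) fragment size
        beta = rng.uniform(0.0, 0.6)        # 2) collective fragment velocity
        gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
        for _ in range(size):
            # 3) thermal-like momentum in the fragment rest frame ...
            p = rng.expovariate(1.0 / temperature)
            m = 0.14                        # pion mass in GeV (illustrative)
            e = math.sqrt(p * p + m * m)
            # ... then a one-dimensional Lorentz boost to the global frame
            hadrons.append((gamma * (e + beta * p), gamma * (p + beta * e)))
    return hadrons
```

The boost leaves the invariant mass e^2 - p^2 = m^2 unchanged, which is a convenient consistency check on any generator structured this way.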
郑川; 徐瑚珊; 欧阳珍; 袁小华; 王建松
2011-01-01
Hadron Physics Lanzhou Spectrometer (HPLUS) is designed for the study of hadron production and decay from nucleon-nucleon interactions in the GeV region. The current configuration of HPLUS and the particle identification methods for three polar angle regions are discussed. The Pluto event generator is applied to simulate the primary reactions on HPLUS, addressing four issues: the agreement of the pp elastic scattering angular distribution between Pluto samples and experimental data; the acceptance of charged K mesons in the strangeness production channels for the forward region of HPLUS; the dependence of the maximum energy of photons and the minimum vertex angle of two photons on the polar angle; and the influence of different reconstruction methods on the mass spectrum of excited states of the nucleon with large resonant width. These studies show that the Pluto event generator satisfies the requirements of Monte Carlo simulation for HPLUS.
Automatic Monte-Carlo tuning for minimum bias events at the LHC
Kama, Sami
2010-06-22
The Large Hadron Collider near Geneva, Switzerland will ultimately collide protons at a center-of-mass energy of 14 TeV and a 40 MHz bunch crossing rate with a luminosity of L=10{sup 34} cm{sup -2}s{sup -1}. At each bunch crossing about 20 soft proton-proton interactions are expected to happen. In order to study new phenomena and improve our current knowledge of the physics, these events must be understood. However, the physics of soft interactions is not completely known at such high energies. Different phenomenological models, trying to explain these interactions, are implemented in several Monte-Carlo (MC) programs such as PYTHIA, PHOJET and EPOS. Some parameters in such MC programs can be tuned to improve the agreement with the data. In this thesis a new method for tuning the MC programs, based on Genetic Algorithms and distributed analysis techniques, has been presented. This method represents the first fully automated MC tuning technique that is based on true MC distributions. It is an alternative to parametrization-based automatic tuning. This new method is used in finding new tunes for PYTHIA 6 and 8. These tunes are compared to the tunes found by alternative methods, such as the PROFESSOR framework and manual tuning, and found to be equivalent or better. Charged particle multiplicity, dN{sub ch}/d{eta}, Lorentz-invariant yield, transverse momentum and mean transverse momentum distributions at various center-of-mass energies are generated using default tunes of EPOS, PHOJET and the Genetic Algorithm tunes of PYTHIA 6 and 8. These distributions are compared to measurements from UA5, CDF, CMS and ATLAS in order to investigate the best model available. Their predictions for the ATLAS detector at LHC energies have been investigated both with generator level and full detector simulation studies. Comparison with the data did not favor any model implemented in the generators, but EPOS is found to describe the investigated distributions better. New data from ATLAS and
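The genetic-algorithm tuning idea can be sketched in miniature: a population of parameter sets is evolved by selection, crossover, and mutation against a goodness-of-fit measure between "generator output" and reference data. The quadratic stand-in generator, the population size, and the mutation width below are all illustrative choices, not those of the thesis:

```python
import random

def chi2(params, data):
    """Toy goodness-of-fit: squared distance between a stand-in 'generator
    prediction' a*x^2 + b*x and reference data points (x, y)."""
    a, b = params
    return sum((a * x * x + b * x - y) ** 2 for x, y in data)

def ga_tune(data, pop_size=30, generations=60, seed=0):
    """Minimal elitist genetic algorithm: keep the better half of the
    population, refill it with mutated crossovers of surviving parents."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: chi2(p, data))
        survivors = pop[: pop_size // 2]              # selection
        children = []
        while len(survivors) + len(children) < pop_size:
            pa, pb = rng.sample(survivors, 2)
            child = (pa[0] + rng.gauss(0.0, 0.05),    # crossover of parent
                     pb[1] + rng.gauss(0.0, 0.05))    # coordinates + mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: chi2(p, data))
```

In the real tuning problem the chi2 is computed between generated MC distributions and measured ones, which makes each fitness evaluation expensive and motivates the distributed-analysis machinery described in the thesis.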
Automatic Monte-Carlo tuning for minimum bias events at the LHC
The Large Hadron Collider near Geneva, Switzerland will ultimately collide protons at a center-of-mass energy of 14 TeV and a 40 MHz bunch crossing rate with a luminosity of L = 10^34 cm^-2 s^-1. At each bunch crossing about 20 soft proton-proton interactions are expected to happen. In order to study new phenomena and improve our current knowledge of the physics, these events must be understood. However, the physics of soft interactions is not completely known at such high energies. Different phenomenological models, trying to explain these interactions, are implemented in several Monte-Carlo (MC) programs such as PYTHIA, PHOJET and EPOS. Some parameters in such MC programs can be tuned to improve the agreement with the data. In this thesis a new method for tuning the MC programs, based on Genetic Algorithms and distributed analysis techniques, has been presented. This method represents the first fully automated MC tuning technique that is based on true MC distributions. It is an alternative to parametrization-based automatic tuning. This new method is used in finding new tunes for PYTHIA 6 and 8. These tunes are compared to the tunes found by alternative methods, such as the PROFESSOR framework and manual tuning, and found to be equivalent or better. Charged particle multiplicity, dN_ch/dη, Lorentz-invariant yield, transverse momentum and mean transverse momentum distributions at various center-of-mass energies are generated using default tunes of EPOS, PHOJET and the Genetic Algorithm tunes of PYTHIA 6 and 8. These distributions are compared to measurements from UA5, CDF, CMS and ATLAS in order to investigate the best model available. Their predictions for the ATLAS detector at LHC energies have been investigated both with generator level and full detector simulation studies. Comparison with the data did not favor any model implemented in the generators, but EPOS is found to describe the investigated distributions better. New data from ATLAS and CMS show higher than
First applications of the HIPSE event generator
The predictions of an event generator, HIPSE (Heavy-Ion Phase-Space Exploration), dedicated to the description of nuclear collisions in the intermediate energy range, are compared with experimental data collected by the INDRA and INDRA-ALADIN collaborations. Special emphasis is put on the kinematical characteristics of fragments and light particles at all impact parameters for the system Xe+Sn between 25 and 80 MeV/u. Considering the kinematical characteristics of the fragments, we have shown that the collective motion finds its origin both in the intrinsic motion of the nucleons and in the relative momentum between the two partners of the reaction, suggesting a fragmentation process with a strong memory of the entrance channel. Moreover, the model gives information on the phase space explored during the collision, for example pre-equilibrium emission. It also allows direct access to the partition at freeze-out (in terms of excitation energy, angular momentum, impact parameter...) before secondary decay.
Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project
Abbrescia, M; Aiola, S; Antolini, R; Avanzini, C; Baldini Ferroli, R; Bencivenni, G; Bossini, E; Bressan, E; Chiavassa, A; Cicalò, C; Cifarelli, L; Coccia, E; De Gruttola, D; De Pasquale, S; Di Giovanni, A; D'Incecco, M; Dreucci, M; Fabbri, F L; Frolov, V; Garbini, M; Gemme, G; Gnesi, I; Gustavino, C; Hatzifotiadou, D; La Rocca, P; Li, S; Librizzi, F; Maggiora, A; Massai, M; Miozzi, S; Panareo, M; Paoletti, R; Perasso, L; Pilo, F; Piragino, G; Regano, A; Riggi, F; Righini, G C; Sartorelli, G; Scapparone, E; Scribano, A; Selvi, M; Serci, S; Siddi, E; Spandre, G; Squarcia, S; Taiuti, M; Tosello, F; Votano, L; Williams, M C S; Yánez, G; Zichichi, A; Zuyeuski, R
2014-01-01
The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called an EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far away coincidences over a total area of about 3 × 10^5 km^2. In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.
On the maximal use of Monte Carlo samples: re-weighting events at NLO accuracy
Mattelaer, Olivier
2016-01-01
Accurate Monte Carlo simulations for high-energy events at CERN's Large Hadron Collider are very expensive, both from the computing and storage points of view. We describe a method that allows one to consistently re-use parton-level samples accurate up to NLO in QCD under different theoretical hypotheses. We implement it in MadGraph5_aMC@NLO and validate it by applying it to several cases of practical interest for the search of new physics at the LHC.
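The core idea of re-using stored samples can be sketched at leading order: each event keeps its phase-space point, and a new theoretical hypothesis enters only through a matrix-element ratio applied to the event weight. The actual NLO procedure in MadGraph5_aMC@NLO must additionally track the separate contributions of the subtraction formalism; the skeleton below is a hypothetical LO illustration, not the paper's implementation:

```python
def reweight(events, new_me_sq, old_me_sq):
    """Reweight stored (phase_space_point, weight) pairs to a new theory
    hypothesis via the ratio of squared matrix elements at each point."""
    return [(x, w * new_me_sq(x) / old_me_sq(x)) for x, w in events]

# Hypothetical example: events generated under a flat |M|^2 = 1,
# re-used under a new hypothesis with |M|^2 = x.
events = [(0.5, 1.0), (1.5, 2.0)]
reweighted = reweight(events, lambda x: x, lambda x: 1.0)
```

Because only weights change, the expensive phase-space generation, parton showering, and storage are amortized across every hypothesis tested.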
Monte Carlo analysis of the MEGA microlensing events towards M31
Ingrosso, G; De Paolis, F; Jetzer, P; Nucita, A A; Strafella, F; Jetzer, Ph.
2005-01-01
We perform an analytical study and a Monte Carlo (MC) analysis of the main features for microlensing events in pixel lensing observations towards M31. Our main aim is to investigate the lens nature and location of the 14 candidate events found by the MEGA collaboration. Assuming a reference model for the mass distribution in M31 and the standard model for our galaxy, we estimate the MACHO-to-self lensing probability and the event time duration towards M31. Reproducing the MEGA observing conditions, as a result we get the MC event number density distribution as a function of the event full-width half-maximum duration $t_{1/2}$ and the magnitude at maximum $R_{\\mathrm {max}}$. For a MACHO mass of $0.5 M_{\\odot}$ we find typical values of $t_{1/2} \\simeq 20$ day and $R_{\\mathrm {max}} \\simeq 22$, for both MACHO-lensing and self-lensing events occurring beyond about 10 arcminutes from the M31 center. A comparison of the observed features ($t_{1/2}$ and $R_{\\mathrm {max}}$) with our MC results shows that for a MAC...
A Stateful Approach to Generate Synthetic Events from Kernel Traces
Naser Ezzati-Jivan; Michel R. Dagenais
2012-01-01
We propose a generic synthetic event generator from kernel trace events. The proposed method makes use of patterns of system states and environment-independent semantic events rather than platform-specific raw events. This method can be applied to different kernel and user level trace formats. We use a state model to store intermediate states and events. This stateful method supports partial trace abstraction and enables users to seek and navigate through the trace events and to abstract out ...
A Monte Carlo Generator for High Energy Nucleus- Nucleus Collision
Hassan, N. M.; El-Harby, N.; Hussein, M. T.
1999-01-01
A Monte Carlo simulator is presented to reproduce data of nucleus-nucleus interactions at high energies. The program is designed from a microscopic point of view, where the cascade approach is applied. Moreover, each nucleon from both the target and the projectile is followed on the time scale along the collision time. The effect of the mean field, which depends on the nuclear density, is considered. Elastic and inelastic scattering are allowed for the nucleon binary collisions during the casca...
Automated Testing with Targeted Event Sequence Generation
Jensen, Casper Svenning; Prasad, Mukul R.; Møller, Anders
2013-01-01
Automated software testing aims to detect errors by producing test inputs that cover as much of the application source code as possible. Applications for mobile devices are typically event-driven, which raises the challenge of automatically producing event sequences that result in high coverage. Some existing approaches use random or model-based testing that largely treats the application as a black box. Other approaches use symbolic execution, either starting from the entry points of the applications or on specific event sequences. A common limitation of the existing approaches is that they...
EKHARA Monte Carlo generator for e+e- to e+e-pi0 and e+e- to e+e- pi+pi- processes
Czyz, Henryk; Ivashyn, Sergiy
2010-01-01
We present the EKHARA Monte Carlo event generator for the reactions e+e- to e+e- pi0 and e+e- to e+e- pi+pi-. The newly added channel (e+e- to e+e- pi0) is important for gamma*gamma* physics and can be used for pion transition form factor studies at meson factories.
Brachytherapy structural shielding calculations using Monte Carlo generated, monoenergetic data
Purpose: To provide a method for calculating the transmission of any broad photon beam with a known energy spectrum in the range of 20–1090 keV, through concrete and lead, based on the superposition of corresponding monoenergetic data obtained from Monte Carlo simulation. Methods: MCNP5 was used to calculate broad photon beam transmission data through varying thickness of lead and concrete, for monoenergetic point sources of energy in the range pertinent to brachytherapy (20–1090 keV, in 10 keV intervals). The three parameter empirical model introduced by Archer et al. [“Diagnostic x-ray shielding design based on an empirical model of photon attenuation,” Health Phys. 44, 507–517 (1983)] was used to describe the transmission curve for each of the 216 energy-material combinations. These three parameters, and hence the transmission curve, for any polyenergetic spectrum can then be obtained by superposition along the lines of Kharrati et al. [“Monte Carlo simulation of x-ray buildup factors of lead and its applications in shielding of diagnostic x-ray facilities,” Med. Phys. 34, 1398–1404 (2007)]. A simple program, incorporating a graphical user interface, was developed to facilitate the superposition of monoenergetic data, the graphical and tabular display of broad photon beam transmission curves, and the calculation of material thickness required for a given transmission from these curves. Results: Polyenergetic broad photon beam transmission curves of this work, calculated from the superposition of monoenergetic data, are compared to corresponding results in the literature. A good agreement is observed with results in the literature obtained from Monte Carlo simulations for the photon spectra emitted from bare point sources of various radionuclides. Differences are observed with corresponding results in the literature for x-ray spectra at various tube potentials, mainly due to the different broad beam conditions or x-ray spectra assumed. Conclusions
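The Archer et al. model referred to above has the closed form B(x) = [(1 + β/α) e^{αγx} − β/α]^{−1/γ}, and it inverts analytically for the thickness needed to reach a given transmission. A minimal sketch; the parameter values used in the test are placeholders, since the fitted (α, β, γ) depend on the specific energy-material combination:

```python
import math

def archer_transmission(x, alpha, beta, gamma):
    """Broad-beam transmission B(x) through thickness x in the Archer et al.
    three-parameter model: B = [(1 + b/a)*exp(a*g*x) - b/a]**(-1/g)."""
    r = beta / alpha
    return ((1.0 + r) * math.exp(alpha * gamma * x) - r) ** (-1.0 / gamma)

def archer_thickness(B, alpha, beta, gamma):
    """Analytic inverse of the model: the thickness yielding transmission B."""
    r = beta / alpha
    return math.log((B ** -gamma + r) / (1.0 + r)) / (alpha * gamma)
```

The superposition step described in the abstract then amounts to a weighted sum of such monoenergetic transmission curves over the source spectrum, refitted to the same three-parameter form.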
Monte Carlo Simulation for Moderator of Compact D-T Neutron Generator
(no author listed)
2011-01-01
In order to study the neutron moderation of a D-T neutron generator, moderators with different materials and structures are evaluated by Monte Carlo simulations. The neutron generator is simplified as a cylinder of 20 cm diameter and 25 cm length. The target is very
Automatic Monte-Carlo Tuning for Minimum Bias Events at the LHC
Kama, Sami; Kolanoski, Hermann
The Large Hadron Collider near Geneva, Switzerland will ultimately collide protons at a center-of-mass energy of 14 TeV and a 40 MHz bunch crossing rate with a luminosity of 10{sup 34} cm{sup -2} s{sup -1}. At each bunch crossing about 20 soft proton-proton interactions are expected to happen. In order to study new phenomena and improve our current knowledge of the physics, these events must be understood. However, the physics of soft interactions is not completely known at such high energies. Different phenomenological models, trying to explain these interactions, are implemented in several Monte-Carlo (MC) programs such as PYTHIA, PHOJET and EPOS. Some parameters in such MC programs can be tuned to improve the agreement with the data. In this thesis a new method for tuning the MC programs, based on Genetic Algorithms and distributed analysis techniques, has been presented. This method represents the first fully automated MC tuning technique that is based on true MC distributions. It ...
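The genetic-algorithm loop at the heart of such a tuning can be illustrated with a toy stand-in: instead of MC distributions, a simple parametric model is fitted to pseudo-data by minimizing a chi-squared, using truncation selection, one-point crossover and Gaussian mutation. All names and settings here are illustrative, not the thesis' actual configuration:

```python
import random

def chi2(params, data, model):
    # Sum of squared deviations between model prediction and data bins.
    return sum((model(params, i) - d) ** 2 for i, d in enumerate(data))

def ga_tune(data, model, n_params, pop=40, gens=60, seed=1):
    rng = random.Random(seed)
    population = [[rng.uniform(0, 2) for _ in range(n_params)]
                  for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda p: chi2(p, data, model))
        parents = scored[: pop // 4]          # truncation selection (elitist)
        population = parents[:]
        while len(population) < pop:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(n_params)     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # Gaussian mutation
                j = rng.randrange(n_params)
                child[j] += rng.gauss(0, 0.05)
            population.append(child)
    return min(population, key=lambda p: chi2(p, data, model))
```

In the real tuning each fitness evaluation is an expensive MC production, which is why the thesis distributes the evaluations over a grid.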
Monte Carlo study for the dynamical fluctuations inside a single jet in 2-jet events
The dynamical fluctuations inside a single jet in the 2-jet events produced in e+e- collisions at 91.2 GeV have been studied using the Monte Carlo method. The results show that the anisotropy of dynamical fluctuations inside a single jet changes remarkably with the variation of the cut parameter ycut. A transition point (γpt = γψ ≠ γy) exists, where the dynamical fluctuations are anisotropic in the longitudinal-transverse plane and isotropic in the transverse planes. It indicates that the ycut corresponding to the transition point is a physically reasonable cut parameter for selecting jets and, meanwhile, the relative transverse momentum kt at the transition point is the scale for the determination of physical jets. This conclusion is in good agreement with the experimental fact that the third jet (gluon jet) was historically first discovered in the energy region 17-30 GeV in e+e- collisions
MadGraph/MadEvent v4: The New Web Generation
Alwall, Johan; de Visscher, Simon; Frederix, Rikkert; Herquet, Michel; Maltoni, Fabio; Plehn, Tilman; Rainwater, David L; Stelzer, Tim
2007-01-01
We present the latest developments of the MadGraph/MadEvent Monte Carlo event generator and several applications to hadron collider physics. In the current version events at the parton, hadron and detector level can be generated directly from a web interface, for arbitrary processes in the Standard Model and in several physics scenarios beyond it (HEFT, MSSM, 2HDM). The most important additions are: a new framework for implementing user-defined new physics models; a standalone running mode for creating and testing matrix elements; generation of events corresponding to different processes, such as signal(s) and backgrounds, in the same run; two platforms for data analysis, where events are accessible at the parton, hadron and detector level; and the generation of inclusive multi-jet samples by combining parton-level events with parton showers. To illustrate the new capabilities of the package some applications to hadron collider physics are presented: 1) Higgs search in pp → H → W^+W^-: signal and backgrou...
An Event-Driven Hybrid Molecular Dynamics and Direct Simulation Monte Carlo Algorithm
Donev, A; Garcia, A L; Alder, B J
2007-07-30
A novel algorithm is developed for the simulation of polymer chains suspended in a solvent. The polymers are represented as chains of hard spheres tethered by square wells and interact with the solvent particles with hard core potentials. The algorithm uses event-driven molecular dynamics (MD) for the simulation of the polymer chain and the interactions between the chain beads and the surrounding solvent particles. The interactions between the solvent particles themselves are not treated deterministically as in event-driven algorithms, rather, the momentum and energy exchange in the solvent is determined stochastically using the Direct Simulation Monte Carlo (DSMC) method. The coupling between the solvent and the solute is consistently represented at the particle level, however, unlike full MD simulations of both the solvent and the solute, the spatial structure of the solvent is ignored. The algorithm is described in detail and applied to the study of the dynamics of a polymer chain tethered to a hard wall subjected to uniform shear. The algorithm closely reproduces full MD simulations with two orders of magnitude greater efficiency. Results do not confirm the existence of periodic (cycling) motion of the polymer chain.
A Monte Carlo study of the acceptance to scattered events in a depth encoding PET camera
We present a Monte Carlo study of acceptance to scattered events in a Depth Encoding Large Aperture Camera (DELAC), a hypothetical PET scanner with the capacity to encode the depth-of-interaction (DOI) of incident γ-rays. The simulation is initially validated against the measured energy resolution and scatter fraction of the ECAT-953B scanner. It is then used to assess the response to scattered events in a PET camera made of position encoding blocks of the EXACT HR PLUS type, modified to have DOI resolution through a variation in the photopeak pulse height. The detection efficiency for 511 keV γ-rays, as well as for those that scattered in the object or left only part of their energy in the block, is studied for several combinations of DOI sensitivities and block thicknesses. The scatter fraction predicted by the simulation for DELACs of various ring radii is compared to that of the ECAT-953B as a function of the energy threshold. The results indicate that the poorer discrimination of object scatters with depth sensitive blocks does not lead to a dramatic increase of the scatter fraction. (author). 10 refs., 1 tab., 5 figs
Study of variants for Monte Carlo generators of τ → 3πν decays
Was, Zbigniew; Zaremba, Jakub [PAN, Institute of Nuclear Physics, Krakow (Poland)
2015-11-15
Low energy QCD (below 2 GeV) is a region of resonance dynamics, sometimes lacking a satisfactory description as compared to the precision of available experimental data. Hadronic τ decays offer a probe for such an energy regime. In general, the predictions for decays are model dependent, with parameters fitted to experimental results. The parameterizations differ by the amount of assumptions and theoretical requirements taken into account. Both model distributions and acquired data samples used for the fits are the results of a complex effort. In this paper, we investigate the main parameterizations of τ decays. The differences in analytical forms of the currents and resulting distributions used for comparison with the experimental data are studied. We use invariant mass spectra of all possible pion pairs and the whole three-pion system. Also three-dimensional histograms spanned over all distinct squared invariant masses are used to represent the results of models and experimental data. We present distributions from TAUOLA Monte Carlo generation and a semi-analytical calculation. These are necessary steps in the development for fitting in as model-independent a way as possible, and to explore multi-million event experimental data samples. This includes the response of distributions to model variants, and/or numerical values of the parameters. The interference effects of the currents' parts are also studied. For technical purposes, weighted events are introduced. Even though we focus on 3πν{sub τ} modes, technical aspects of our study are relevant for all τ decay modes into three hadrons. (orig.)
Study of variants for Monte Carlo generators of τ→3πν decays
Wąs, Zbigniew; Zaremba, Jakub, E-mail: jakub.zaremba@ifj.edu.pl [Institute of Nuclear Physics, PAN, ul. Radzikowskiego 152, Kraków (Poland)
2015-11-28
Low energy QCD (below 2 GeV) is a region of resonance dynamics, sometimes lacking a satisfactory description as compared to the precision of available experimental data. Hadronic τ decays offer a probe for such an energy regime. In general, the predictions for decays are model dependent, with parameters fitted to experimental results. The parameterizations differ by the amount of assumptions and theoretical requirements taken into account. Both model distributions and acquired data samples used for the fits are the results of a complex effort. In this paper, we investigate the main parameterizations of τ decay matrix elements for the one- and three-prong channels of three-pion τ decays. The differences in analytical forms of the currents and resulting distributions used for comparison with the experimental data are studied. We use invariant mass spectra of all possible pion pairs and the whole three-pion system. Also three-dimensional histograms spanned over all distinct squared invariant masses are used to represent the results of models and experimental data. We present distributions from TAUOLA Monte Carlo generation and a semi-analytical calculation. These are necessary steps in the development for fitting in as model-independent a way as possible, and to explore multi-million event experimental data samples. This includes the response of distributions to model variants, and/or numerical values of the parameters. The interference effects of the currents' parts are also studied. For technical purposes, weighted events are introduced. Even though we focus on 3πν{sub τ} modes, technical aspects of our study are relevant for all τ decay modes into three hadrons.
Simulation and event generation in high-energy physics
A basic introduction to the physics modeling, the event generation and the detector simulation as designed for the upcoming high-energy physics experiments is presented. Requirements on software developments and computing performances are stressed. (author)
Radiative corrections and Monte Carlo generators for physics at flavor factories
Montagna Guido
2016-01-01
I review the state of the art of precision calculations and related Monte Carlo generators used in physics at flavor factories. The review describes the tools relevant for the measurement of the hadron production cross section (via radiative return, energy scan and γγ scattering), luminosity monitoring, searches for new physics, and the physics of the τ lepton.
Risk analysis and Monte Carlo simulation applied to the generation of drilling AFE estimates
This paper presents a method for developing an authorization-for-expenditure (AFE)-generating model and illustrates the technique with a specific offshore field development case study. The model combines Monte Carlo simulation and statistical analysis of historical drilling data to generate more accurate, risked, AFE estimates. In addition to the general method, two examples of making AFE time estimates for North Sea wells with the presented techniques are given
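The core of such an AFE model can be sketched as follows. The phase statistics, day rate and truncated-normal sampling below are hypothetical stand-ins for the statistical models the paper fits to historical drilling data:

```python
import random

def simulate_afe(phases, day_rate, n=20000, seed=7):
    """Monte Carlo AFE time/cost estimate. `phases` is a list of
    (mean_days, sd_days) pairs -- hypothetical per-phase statistics
    derived from historical wells; each phase duration is sampled as a
    normal deviate truncated at zero, and the totals are ranked to read
    off risked percentiles (P10/P50/P90)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        days = sum(max(0.0, rng.gauss(m, s)) for m, s in phases)
        totals.append(days)
    totals.sort()
    pct = lambda q: totals[int(q * (n - 1))]
    return {"P10_days": pct(0.10), "P50_days": pct(0.50),
            "P90_days": pct(0.90), "P50_cost": pct(0.50) * day_rate}
```

A real model would also sample non-productive-time events and correlate phases, but the percentile read-out is the same.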
Introduction to GR@PPA event generators for pp/ppbar collisions
Tsuno, S
2005-01-01
We have developed an extended framework, named GR@PPA, of the GRACE system for hadron collisions. The GRACE system is an automatic Feynman diagram calculation system and an event generator based on this diagram calculation. While the original GRACE system assumes that both the initial and final states are well-defined, the GR@PPA framework allows the initial- and final-state parton configurations to be treated in the Feynman diagram calculation at the same time by introducing one more integration variable. As a result, some subprocesses with the same coupling order in hadron-hadron collisions can share an identical "GRACE output code" and can be treated as a single subprocess. This technique simplifies the program code and greatly reduces the computing time. The constructed event generators are suitable for large-scale Monte Carlo production at hadron colliders. In this paper, we discuss this technique, and present some results and performance figures.
Monte Carlo simulations of a D-T neutron generator shielding for landmine detection
Reda, A.M., E-mail: amreda2005@yahoo.com [College of Science, Shaqra University, Al-Dawadme, P.O. Box 1040 (Saudi Arabia)
2011-10-15
Shielding for a D-T sealed neutron generator has been designed using the MCNP5 Monte Carlo radiation transport code. The neutron generator will be used in the field for the detection of explosives, landmines, drugs and other 'threat' materials. The optimization of the detection of buried objects was started by studying the signal-to-noise ratio for different geometric conditions. - Highlights: > A landmine detection system based on fast/slow neutron analysis has been designed. > Shielding for a D-T sealed neutron generator tube has been designed using a Monte Carlo radiation transport code. > Detection of buried objects was started by studying the signal-to-noise ratio for different geometric conditions. > The signal-to-background ratio was optimized at one position for all depths.
Efficient Monte Carlo simulations using a shuffled nested Weyl sequence random number generator.
Tretiakov, K V; Wojciechowski, K W
1999-12-01
The pseudorandom number generator proposed recently by Holian et al. [B. L. Holian, O. E. Percus, T. T. Warnock, and P. A. Whitlock, Phys. Rev. E 50, 1607 (1994)] is tested via Monte Carlo computation of the free energy difference between the defectless hcp and fcc hard sphere crystals by the Frenkel-Ladd method [D. Frenkel and A. J. C. Ladd, J. Chem. Phys. 81, 3188 (1984)]. It is shown that this generator, which is fast and convenient for parallel computing, gives results in good agreement with results obtained by other generators. An estimate of high accuracy is obtained for the hcp-fcc free energy difference near melting. PMID:11970727
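The nested Weyl sequence underlying this generator is x_n = frac(n · frac(n·α)) for an irrational seed α in (0, 1). The sketch below implements that sequence; the final scrambling step shown is an illustrative stand-in, not Holian et al.'s exact "shuffle" prescription, for which the original paper should be consulted:

```python
from math import floor

def nested_weyl(n, alpha):
    """Nested Weyl sequence: x_n = frac(n * frac(n * alpha)),
    with alpha an irrational seed in (0, 1). Parallel-friendly:
    the n-th value is computed directly, with no stored state."""
    frac = lambda t: t - floor(t)
    return frac(n * frac(n * alpha))

def shuffled(n, alpha, m=1234567):
    """Illustrative scrambling of the nested Weyl value (NOT the
    exact Holian et al. shuffle): multiply by a large constant and
    take the fractional part again."""
    frac = lambda t: t - floor(t)
    return frac(m * nested_weyl(n, alpha))
```

The statelessness is what makes the sequence convenient for parallel Monte Carlo: each worker evaluates its own index range independently.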
Event generation for next to leading order chargino production at the international linear collider
At the International Linear Collider (ILC), parameters of supersymmetry (SUSY) can be determined with an experimental accuracy matching the precision of next-to-leading order (NLO) and higher-order theoretical predictions. Therefore, these contributions need to be included in the analysis of the parameters. We present a Monte-Carlo event generator for simulating chargino pair production at the ILC at next-to-leading order in the electroweak couplings. We consider two approaches of including photon radiation. A strict fixed-order approach allows for comparison and consistency checks with published semianalytic results in the literature. A version with soft- and hard-collinear resummation of photon radiation, which combines photon resummation with the inclusion of the NLO matrix element for the production process, avoids negative event weights, so the program can simulate physical (unweighted) event samples. Photons are explicitly generated throughout the range where they can be experimentally resolved. In addition, it includes further higher-order corrections unaccounted for by the fixed-order method. Inspecting the dependence on the cutoffs separating the soft and collinear regions, we evaluate the systematic errors due to soft and collinear approximations for NLO and higher-order contributions. In the resummation approach, the residual uncertainty can be brought down to the per-mil level, coinciding with the expected statistical uncertainty at the ILC. We closely investigate the two-photon phase space for the resummation method. We present results for cross sections and event generation for both approaches. (orig.)
Event generation for next to leading order chargino production at the international linear collider
Robens, T.
2006-10-15
At the International Linear Collider (ILC), parameters of supersymmetry (SUSY) can be determined with an experimental accuracy matching the precision of next-to-leading order (NLO) and higher-order theoretical predictions. Therefore, these contributions need to be included in the analysis of the parameters. We present a Monte-Carlo event generator for simulating chargino pair production at the ILC at next-to-leading order in the electroweak couplings. We consider two approaches of including photon radiation. A strict fixed-order approach allows for comparison and consistency checks with published semianalytic results in the literature. A version with soft- and hard-collinear resummation of photon radiation, which combines photon resummation with the inclusion of the NLO matrix element for the production process, avoids negative event weights, so the program can simulate physical (unweighted) event samples. Photons are explicitly generated throughout the range where they can be experimentally resolved. In addition, it includes further higher-order corrections unaccounted for by the fixed-order method. Inspecting the dependence on the cutoffs separating the soft and collinear regions, we evaluate the systematic errors due to soft and collinear approximations for NLO and higher-order contributions. In the resummation approach, the residual uncertainty can be brought down to the per-mil level, coinciding with the expected statistical uncertainty at the ILC. We closely investigate the two-photon phase space for the resummation method. We present results for cross sections and event generation for both approaches. (orig.)
Dorval, Eric
2016-01-01
Neutron transport calculations by Monte Carlo methods are finding increased application in nuclear reactor simulations. In particular, a versatile approach entails the use of a 2-step procedure, with Monte Carlo as a few-group cross section data generator at lattice level, followed by deterministic multi-group diffusion calculations at core level. In this thesis, the Serpent 2 Monte Carlo reactor physics burnup calculation code is used in order to test a set of diffusion coefficient model...
Analysing the statistics of group constants generated by Serpent 2 Monte Carlo code
An important topic in Monte Carlo neutron transport calculations is to verify that the statistics of the calculated estimates are correct. Undersampling, non-converged fission source distribution and inter-cycle correlations may result in inaccurate results. In this paper, we study the effect of the number of neutron histories on the distributions of homogenized group constants and assembly discontinuity factors generated using Serpent 2 Monte Carlo code. We apply two normality tests and a so-called “drift-in-mean” test to the batch-wise distributions of selected parameters generated for two assembly types taken from the MIT BEAVRS benchmark. The results imply that in the tested cases the batch-wise estimates of the studied group constants can be regarded as normally distributed. We also show that undersampling is an issue with the calculated assembly discontinuity factors when the number of neutron histories is small. (author)
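A "drift-in-mean" check of the kind described can be sketched in pure Python: compare the first- and second-half means of the batch-wise estimates with a two-sample t-like statistic, flagging non-stationarity. The threshold and the exact statistic here are illustrative choices, not the paper's specific test:

```python
import statistics

def drift_in_mean(batches, threshold=2.0):
    """Split batch-wise estimates into halves and compare their means
    with a two-sample t-like statistic; |t| > threshold suggests a
    non-stationary estimate (e.g. a fission source distribution that
    has not yet converged, or inter-cycle correlation effects)."""
    h = len(batches) // 2
    a, b = batches[:h], batches[h:]
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    t = (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)
    return abs(t) > threshold, t
```

Normality of the batch distributions would be checked separately (e.g. with a Shapiro-Wilk or Anderson-Darling test) before trusting confidence intervals built from them.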
Monte Carlo studies on the hadronic calibration of the H1 calorimeter with HERA events
Two different methods to calibrate the H1 calorimeter with hadrons from HERA events are investigated. For these studies the LEPTO/JETSET event generator and the fast H1 detector simulation program P.S.I. were used. Isolated particles, measured and reconstructed with the track chambers, may cause isolated showers within the calorimeter. The measured momenta of hadrons (up to about 20 GeV/c) can be compared with the measured energy in the calorimeter. The influence of neutral particles and of neighbouring showers on the energy deposition is discussed. It is shown that a calibration is possible by comparing the transverse momentum of the scattered electron and of secondary hadrons. Disturbing effects on this measurement (e.g. energy losses in the beamhole) are presented. In both cases the number of events with Q2>10 GeV2 corresponding to 1 pb-1 is found to be sufficient to apply the mentioned methods for a global calibration. (orig./HSI)
Jirauschek, Christian; Okeil, Hesham; Lugli, Paolo
2015-01-26
Based on self-consistent ensemble Monte Carlo simulations coupled to the optical field dynamics, we investigate the giant nonlinear susceptibility giving rise to terahertz difference frequency generation in quantum cascade laser structures. Specifically, the dependence on temperature, bias voltage and frequency is considered. It is shown that the optical nonlinearity is temperature insensitive and covers a broad spectral range, as required for widely tunable room temperature terahertz sources. The obtained results are consistent with available experimental data. PMID:25835923
Herwig: The Evolution of a Monte Carlo Simulation
CERN. Geneva
2015-01-01
Monte Carlo event generation has seen significant developments in the last 10 years, starting with preparation for the LHC and then during the first LHC run. I will discuss the basic ideas behind Monte Carlo event generators and then go on to discuss these developments, focusing on those in the Herwig(++) event generator. I will conclude by presenting the current status of event generation together with some results from the forthcoming new version of Herwig, Herwig 7.
Robles Pimentel, Edgar [Instituto de Investigaciones Electricas, Cuernavaca (Mexico); Garcia Hernandez, Javier [Comision Federal de Electricidad, Mexico, D. F. (Mexico)
1997-12-31
In November 1995, the Unit 2 generator at the Ingeniero Carlos Ramirez Ulloa (El Caracol) hydroelectric power station failed. The accident forced its overhaul to be carried out. Here the technical problems faced during the overhaul of the generator are presented and the implemented solutions are analyzed.
Absolute GPS Time Event Generation and Capture for Remote Locations
HIRES Collaboration
The HiRes experiment operates fixed-location and portable lasers at remote desert locations to generate calibration events. One physics goal of HiRes is to search for unusual showers. These may appear similar to upward or horizontally pointing laser tracks used for atmospheric calibration. It is therefore necessary to remove all of these calibration events from the HiRes detector data stream in a physics-blind manner. A robust and convenient "tagging" method is to generate the calibration events at precisely known times. To facilitate this tagging method we have developed the GPSY (Global Positioning System YAG) module. It uses a GPS receiver, an embedded processor and additional timing logic to generate laser triggers at arbitrary programmed times and frequencies with better than 100 ns accuracy. The GPSY module has two trigger outputs (one microsecond resolution) to trigger the laser flash-lamp and Q-switch and one event capture input (25 ns resolution). The GPSY module can be programmed either by a front panel menu based interface or by a host computer via an RS232 serial interface. The latter also allows for computer logging of generated and captured event times. Details of the design and the implementation of these devices will be presented. 1 Motivation: Air showers represent a small fraction, much less than a percent, of the total High Resolution Fly's Eye data sample. The bulk of the sample is calibration data. Most of this calibration data is generated by two types of systems that use lasers. One type sends light directly to the detectors via optical fibers to monitor detector gains (Girard 2001). The other sends a beam of light into the sky and the scattered light that reaches the detectors is used to monitor atmospheric effects (Wiencke 1998). It is important that these calibration events be cleanly separated from the rest of the sample both to provide a complete set of monitoring information, and more
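The time-based tagging itself reduces to matching detector event timestamps against the programmed laser fire times within the trigger accuracy. A minimal sketch (function and variable names are illustrative, not HiRes software):

```python
from bisect import bisect_left

def tag_calibration_events(event_times_ns, laser_times_ns, tol_ns=100):
    """Flag detector events whose timestamps coincide with programmed
    laser fire times to within tol_ns, so calibration tracks can be
    removed from the physics stream in a blind manner.
    laser_times_ns must be sorted ascending."""
    tagged = []
    for t in event_times_ns:
        i = bisect_left(laser_times_ns, t)
        # Only the two neighbouring fire times can possibly match.
        near = [laser_times_ns[j] for j in (i - 1, i)
                if 0 <= j < len(laser_times_ns)]
        tagged.append(any(abs(t - fire) <= tol_ns for fire in near))
    return tagged
```

With the GPSY accuracy of better than 100 ns, a tolerance of that order cleanly separates laser shots from physics triggers.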
EVENT GENERATOR FOR RHIC SPIN PHYSICS-VOLUME 11
SAITO,N.; SCHAEFER,A.
1998-12-01
This volume contains the report of the RIKEN BNL Research Center workshop on ''Event Generator for RHIC Spin Physics'' held on September 21-23, 1998 at Brookhaven National Laboratory. A major objective of the workshop was to establish a firm collaboration to develop suitable event generators for the spin physics program at RHIC. With the completion of the Relativistic Heavy Ion Collider (RHIC) as a polarized collider, a completely new domain of high-energy spin physics will be opened. The planned studies address the spin structure of the nucleon, tests of the standard model, and transverse spin effects in initial and final states. RHIC offers the unique opportunity to pursue these studies because of its high and variable energy, 50 {le} {radical}s {le} 500 GeV, high polarization, 70%, and high luminosity, 2 x 10{sup 32} cm{sup -2} sec{sup -1} or more at 500 GeV. To maximize the output from the spin program at RHIC, the understanding of both experimental and theoretical systematic errors is crucial. It will require full-fledged event generators to simulate the processes of interest in great detail. The history of event generators shows that their development and improvement are ongoing processes taking place in parallel to the physics analysis by various experimental groups. The number of processes included in the generators has been increasing and the precision of their predictions has been improved continuously. Our workshop aims at getting this process well under way for the spin physics program at RHIC, based on the first development in this direction, SPHINX. The scope of the work includes: (1) update of the currently existing event generator by including the most recent parton parameterizations as a library and reflecting recent progress made for spin-independent generators, (2) implementation of new processes, especially parity violating effects in high energy pp collisions, (3) test of the currently available event generator by
Bottom-quark fragmentation: Comparing results from tuned event generators and resummed calculations
We study bottom-quark fragmentation in e+e- annihilation, top and Higgs decay H->bb-bar , using Monte Carlo event generators, as well as calculations, based on the formalism of perturbative fragmentation functions, which resum soft- and collinear-radiation effects in the next-to-leading logarithmic approximation. We consider the PYTHIA and HERWIG generators, and implement matrix-element corrections to the parton shower simulation of the H->bb-bar process in HERWIG. We tune the Kartvelishvili, string and cluster models to B-hadron data from LEP and SLD, and present results in both xB and moment spaces. The B-hadron spectra yielded by HERWIG, PYTHIA and resummed calculations show small discrepancies, which are due to the different approaches and models employed and to the quality of the fits to the e+e- data
Bottom-quark fragmentation: Comparing results from tuned event generators and resummed calculations
Corcella, G. [Department of Physics, CERN, Theory Division, CH-1211 Geneva 23 (Switzerland)]. E-mail: gennaro.corcella@cern.ch; Drollinger, V. [Dipartimento di Fisica Galileo Galilei, Universita di Padova, and INFN, Sezione di Padova, Via Marzolo 8, I-35131 Padova (Italy)]. E-mail: volker.drollinger@cern.ch
2005-12-05
We study bottom-quark fragmentation in e{sup +}e{sup -} annihilation, top and Higgs decay H->bb-bar , using Monte Carlo event generators, as well as calculations, based on the formalism of perturbative fragmentation functions, which resum soft- and collinear-radiation effects in the next-to-leading logarithmic approximation. We consider the PYTHIA and HERWIG generators, and implement matrix-element corrections to the parton shower simulation of the H->bb-bar process in HERWIG. We tune the Kartvelishvili, string and cluster models to B-hadron data from LEP and SLD, and present results in both x{sub B} and moment spaces. The B-hadron spectra yielded by HERWIG, PYTHIA and resummed calculations show small discrepancies, which are due to the different approaches and models employed and to the quality of the fits to the e{sup +}e{sup -} data.
Monte Carlo simulation of gamma-ray transport for the purpose of performing elemental analysis of bulk samples requires the tracking of gamma rays in the sample and also in the detector(s) used. Detector response functions (DRF's) are an efficient and accurate variance reduction technique that greatly decreases the simulation time by substituting the tracking of gamma rays inside the detector by predefined single energy gamma-ray spectra. These spectra correspond to the average response of the detector for incident gamma rays. DRF's are generated by Monte Carlo methods and are benchmarked with experimental data. In this work, prompt gamma-gamma coincidence measurements are presented as a way to validate DRF's for high-energy gamma rays
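The variance-reduction idea described above amounts to replacing in-detector tracking with a lookup: each incident gamma ray contributes a precomputed pulse-height spectrum. A minimal sketch, with purely illustrative DRF data:

```python
def fold_with_drf(incident_energies, drf, n_bins):
    """Replace in-detector gamma-ray tracking with a detector response
    function lookup: each incident energy contributes its precomputed
    pulse-height spectrum (a length-n_bins probability list).
    `drf` maps incident-energy keys to such spectra (hypothetical data);
    a real implementation would interpolate between tabulated energies."""
    spectrum = [0.0] * n_bins
    for e in incident_energies:
        response = drf[e]   # nearest-energy lookup omitted for brevity
        for b, p in enumerate(response):
            spectrum[b] += p
    return spectrum
```

The time saving comes from the per-photon cost dropping from a full transport history inside the crystal to a single table addition.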
Campuzano Martinez, Ignacio Roberto; Gonzalez Vazquez, Alejandro Esteban; Robles Pimentel, Edgar Guillermo; Esparza Saucedo, Marcos; Garcia Martinez, Javier; Sanchez Flores, Ernesto; Martinez Romero, Jose Luis [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)
1998-12-31
The Ing. Carlos Ramirez Ulloa hydroelectric power station has three 200 MW electric generators. The station began commercial operation in 1985. The electric generators had design problems that were properly corrected in an overhaul program initiated in 1996, with the Unit 2 electric generator, and completed in 1998 with the Unit 1 electric generator. This paper presents the relevant aspects of the experience accumulated in the project.
Blom, H.A.P.; Krystul, J.; Bakker, G.J.
2006-01-01
We study the problem of estimating small reachability probabilities for large scale stochastic hybrid processes through Sequential Monte Carlo (SMC) simulation. Recently, [Cerou et al., 2002, 2005] developed an SMC approach for diffusion processes, and referred to the resulting SMC algorithm as an I
Effective Spectral Function for Neutrino Quasielastic Scattering Event Generators
Coopersmith, Brian; Bodek, Arie; Christy, M. Eric
2014-03-01
The spectral functions that are used in the modeling of quasielastic scattering in neutrino event generators such as GENIE, NEUT, NUANCE and NUWRO include the (global) Fermi gas, local Fermi gas, Bodek-Ritchie Fermi gas with a high-momentum tail, and the Benhar-Fantoni spectral function. We find that these spectral functions do not agree with the prediction of ψ' superscaling functions that are extracted from electron quasielastic scattering data on nuclear targets. It is known that spectral functions do not fully describe quasielastic scattering because they only model the initial state. Final state interactions distort the shape of the quasielastic peak, reduce the cross section at the peak and increase the cross section at the tail of the distribution for large energy transfer to final state nucleons. We show that an ``effective spectral function'' can be constructed to reliably reproduce the kinematic distributions predicted by the ψ' superscaling formalism.
Wilson, J. A.; Richardson, J. A.
2015-12-01
Traditional methods used to calculate recurrence rate of volcanism, such as linear regression, maximum likelihood and Weibull-Poisson distributions, are effective at estimating recurrence rate and confidence level, but these methods are unable to estimate uncertainty in recurrence rate through time. We propose a new model for estimating recurrence rate and uncertainty, the Volcanic Event Recurrence Rate Model (VERRM). VERRM is an algorithm that incorporates radiometric ages, volcanic stratigraphy and paleomagnetic data into a Monte Carlo simulation, generating acceptable ages for each event. Each model run is used to calculate recurrence rate using a moving average window. These rates are binned into discrete time intervals and plotted using the 5th, 50th and 95th percentiles. We present recurrence rates from Cima Volcanic Field (CA), Yucca Mountain (NV) and Arsia Mons (Mars). Results from Cima Volcanic Field illustrate how several K-Ar ages with large uncertainties obscure three well documented volcanic episodes. Yucca Mountain results are similar to published rates and illustrate the use of the same radiometric age for multiple events in a spatially defined cluster. Arsia Mons results show a clear waxing/waning of volcanism through time. VERRM output may be used for a spatio-temporal model or to plot uncertainty in quantifiable parameters such as eruption volume or geochemistry. Alternatively, the algorithm may be reworked to constrain geomagnetic chrons. VERRM is implemented in Python 2.7 and takes advantage of NumPy, SciPy and matplotlib libraries for optimization and quality plotting presentation. A typical Monte Carlo simulation of 40 volcanic events takes a few minutes to a couple of hours to complete, depending on the bin size used to assign ages.
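The Monte Carlo core of such a recurrence-rate model can be sketched as follows. This is a simplified stand-in (normal age sampling only, no stratigraphic or paleomagnetic constraints), with all inputs hypothetical:

```python
import random
import statistics

def recurrence_rate_percentiles(events, window, t_grid, n_runs=2000, seed=3):
    """VERRM-style sketch: `events` is a list of (age, sigma) radiometric
    determinations (hypothetical values). Each run draws an age for every
    event, then counts events inside a moving window centred on each grid
    time; the 5th/50th/95th percentiles over runs give the recurrence
    rate with its uncertainty through time."""
    rng = random.Random(seed)
    rates = {t: [] for t in t_grid}
    for _ in range(n_runs):
        ages = [rng.gauss(a, s) for a, s in events]
        for t in t_grid:
            k = sum(1 for a in ages if abs(a - t) <= window / 2)
            rates[t].append(k / window)        # events per unit time
    out = {}
    for t in t_grid:
        qs = statistics.quantiles(rates[t], n=20)  # cut points in 5% steps
        out[t] = (qs[0], qs[9], qs[18])            # ~P5, P50, P95
    return out
```

Stratigraphic ordering would be imposed by rejecting (or re-sorting) sampled age vectors that violate the known event sequence.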
HYDRO + JETS (HYDJET++) event generator for Pb+Pb collisions at LHC
Bravina, L; Crkovská, J; Eyyubova, G; Korotkikh, V; Lokhtin, I; Malinina, L; Nazarova, E; Petrushanko, S; Snigirev, A; Zabrodin, E
2016-01-01
The Monte Carlo event generator HYDJET++ is one of the few generators designed for calculations of heavy-ion collisions at ultrarelativistic energies that combine the treatment of soft hydro-like processes with a description of jets traversing the hot and dense partonic medium. The model is employed to study azimuthal anisotropy phenomena, dihadron angular correlations and event-by-event (EbyE) fluctuations of the anisotropic flow in Pb+Pb collisions at $\sqrt{s_{NN}} = 2.76$ TeV. The interplay of soft and hard processes describes the violation of the mass hierarchy of meson and baryon elliptic and triangular flows at p_T > 2 GeV/c, the fall-off of the flow harmonics at intermediate transverse momenta, and the worsening of the number-of-constituent-quark (NCQ) scaling of elliptic/triangular flow at LHC compared to RHIC energies. The cross-talk of v_2 and v_3 leads to the emergence of higher order harmonics in the model and to the appearance of the ridge structure in dihadron angular correlations in a broad p...
Event trees and dynamic event trees: Applications to steam generator tube rupture accidents
The dynamic event tree analysis method (DETAM) is a simulation based approach that models the integrated, dynamic response of the plant/operating crew system to an accident. It extends the conventional event tree/fault tree methodology for accident sequence analysis in two ways. First, it allows for tree branchings at discrete points in time. Second, the tree sequences explicitly track changes in the operating crew state, as well as changes in the plant hardware state. Process variable calculations and operating procedures are used in linking the crew and hardware behaviour. - The paper compares the conventional event tree/fault tree methodology for accident sequence analysis with the dynamic event tree method in the analysis of a pressurized water reactor steam generator tube rupture. Two previous PSA analyses are used for the comparison. The first employs the ''event tree with boundary conditions'' approach and uses fairly detailed top event headings. The second employs the ''linked fault tree'' approach and uses a relatively small event tree. - A quantitative comparison of the results of the three analyses shows that, in this particular case study, the DETAM results appear to be less conservative. This is due, in part, to DETAM's treatment of recovery actions embedded in the emergency operating procedures. The quantitative results, however, should be viewed with some caution, since: (a) the three analyses have different scopes and employ different assumptions, and (b) a number of the parameters used in the DETAM analysis are highly uncertain. - A qualitative comparison of results shows that the dominant sequences predicted by each methodology are similar. However, the DETAM scenario descriptions are more detailed and allow better definition of steps to reduce risk. Further, the DETAM models deal with the variety of human error forms and their consequences; this provides a better capability of identifying and quantifying complex accident scenarios that may not
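The discrete-time branching idea behind dynamic event trees can be sketched as a breadth-first expansion with probability pruning. The branch outcomes and probabilities below are invented for illustration and are not from the DETAM study.

```python
from dataclasses import dataclass

@dataclass
class Sequence:
    prob: float
    history: tuple  # (hardware_state, crew_state) at each branch time

# Illustrative per-step branching probabilities (not from the paper).
HW_FAIL_P = 0.05
CREW_ERR_P = 0.10

def expand(n_steps, prune=1e-4):
    """Breadth-first dynamic-event-tree expansion over discrete branch
    times, jointly tracking hardware and crew state, with pruning of
    very unlikely sequences."""
    frontier = [Sequence(1.0, ())]
    for _ in range(n_steps):
        nxt = []
        for seq in frontier:
            for hw, p_hw in (("ok", 1 - HW_FAIL_P), ("failed", HW_FAIL_P)):
                for crew, p_cr in (("procedure", 1 - CREW_ERR_P),
                                   ("error", CREW_ERR_P)):
                    p = seq.prob * p_hw * p_cr
                    if p >= prune:  # discard negligible sequences
                        nxt.append(Sequence(p, seq.history + ((hw, crew),)))
        frontier = nxt
    return frontier

seqs = expand(3)
total = sum(s.prob for s in seqs)
```

In a real DETAM analysis the branch probabilities would come from process-variable calculations and operating procedures rather than fixed constants, and sequence end states would be classified into accident outcomes.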
Tests on novel pseudo-potentials generated from diffusion Monte Carlo data.
Reboredo, Fernando; Hood, Randolph; Bajdich, Michal
2012-02-01
Since Dmitri Mendeleev developed a table in 1869 to illustrate recurring ("periodic") trends of the elements, it has been understood that most chemical and physical properties can be described by taking into account the outermost electrons of the atoms. These valence electrons are mainly responsible for the chemical bond. In many ab-initio approaches only valence electrons are taken into account and a pseudopotential is used to mimic the response of the core electrons. Typically an all-electron calculation is used to generate a pseudopotential that is used either within density functional theory or quantum chemistry approaches. In this talk we explain and demonstrate a new method to generate pseudopotentials directly from all-electron many-body diffusion Monte Carlo (DMC) calculations and discuss the transferability of these pseudopotentials. The advantages of incorporating the exchange and correlation directly from DMC into the pseudopotential are also discussed.
Tool for Generating Realistic Residential Hot Water Event Schedules: Preprint
Hendron, B.; Burch, J.; Barker, G.
2010-08-01
The installed energy savings for advanced residential hot water systems can depend greatly on detailed occupant use patterns. Quantifying these patterns is essential for analyzing measures such as tankless water heaters, solar hot water systems with demand-side heat exchangers, distribution system improvements, and recirculation loops. This paper describes the development of an advanced spreadsheet tool that can generate a series of year-long hot water event schedules consistent with realistic probability distributions of start time, duration and flow rate variability, clustering, fixture assignment, vacation periods, and seasonality. This paper also presents the application of the hot water event schedules in the context of an integral-collector-storage solar water heating system in a moderate climate.
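A minimal sketch of the kind of stochastic event-schedule generation described above, with invented event types and stand-in distributions (the actual tool uses empirically fitted probability distributions and also handles clustering, fixture assignment, vacation periods, and seasonality):

```python
import random

rng = random.Random(9)

# Illustrative event types: (events/day, mean duration min, flow gpm).
EVENT_TYPES = {
    "shower":  (2.0, 8.0, 2.0),
    "sink":    (12.0, 0.5, 1.0),
    "clothes": (0.8, 30.0, 1.5),
}

def daily_schedule():
    """Draw one day's hot-water events: Poisson counts per event type,
    uniform start times, exponential durations."""
    events = []
    for name, (rate, mean_dur, flow) in EVENT_TYPES.items():
        # Poisson count via summed exponential inter-arrival gaps.
        n, acc = 0, rng.expovariate(rate)
        while acc < 1.0:
            n += 1
            acc += rng.expovariate(rate)
        for _ in range(n):
            start_h = rng.uniform(0.0, 24.0)
            events.append((start_h, name, rng.expovariate(1.0 / mean_dur), flow))
    return sorted(events)  # chronological order within the day

day = daily_schedule()
```

Repeating this day by day for a year, with seasonal and weekday adjustments, yields the kind of year-long schedule the spreadsheet tool produces.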
Event-driven Monte Carlo: Exact dynamics at all time scales for discrete-variable models
Mendoza-Coto, Alejandro; Díaz-Méndez, Rogelio; Pupillo, Guido
2016-06-01
We present an algorithm for the simulation of the exact real-time dynamics of classical many-body systems with discrete energy levels. In the same spirit as kinetic Monte Carlo methods, a stochastic solution of the master equation is found, with no need to define any other phase-space construction. However, unlike existing methods, the present algorithm does not assume any particular statistical distribution to perform moves or to advance the time, and thus is a unique tool for the numerical exploration of fast and ultra-fast dynamical regimes. By decomposing the problem into a set of two-level subsystems, we find a natural variable step size that is well defined by the normalization condition of the transition probabilities between the levels. We successfully test the algorithm with known exact solutions for non-equilibrium dynamics and equilibrium thermodynamical properties of Ising-spin models in one and two dimensions, and compare to standard implementations of kinetic Monte Carlo methods. The present algorithm is directly applicable to the study of the real-time dynamics of a large class of classical Markovian chains, and particularly to short-time situations where the exact evolution is relevant.
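For contrast with the event-driven method described above, the kinetic Monte Carlo baseline it is compared against, a stochastic solution of the master equation via exponential waiting times, can be sketched as follows. This is a generic Gillespie-style sampler, not the paper's algorithm.

```python
import random

def gillespie(rates, state0, t_max, rng=random.Random(1)):
    """Continuous-time stochastic simulation of a discrete-state master
    equation (standard kinetic MC baseline, not the paper's algorithm).
    `rates[s]` maps a state to a dict {next_state: transition_rate}."""
    t, s = 0.0, state0
    traj = [(t, s)]
    while True:
        out = rates[s]
        total = sum(out.values())
        if total == 0.0:
            break  # absorbing state: no outgoing transitions
        t += rng.expovariate(total)   # exact exponential waiting time
        if t > t_max:
            break
        # Choose the destination state proportionally to its rate.
        r = rng.random() * total
        acc = 0.0
        for nxt, k in out.items():
            acc += k
            if r <= acc:
                s = nxt
                break
        traj.append((t, s))
    return traj

# Two-level system: 0 <-> 1 with asymmetric rates.
rates = {0: {1: 2.0}, 1: {0: 1.0}}
traj = gillespie(rates, 0, t_max=50.0)
```

The paper's contribution is to obtain the step size from the normalization of two-level transition probabilities rather than assuming exponential statistics, which this baseline does assume.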
mFOAM-1.02: A Compact Version of the Cellular Event Generator FOAM
Jadach, S
2007-01-01
The general-purpose self-adapting Monte Carlo (MC) event generator/simulator mFOAM (standing for mini-FOAM) is a new compact version of the FOAM program, with a slightly limited functionality with respect to its parent version. On the other hand, mFOAM is easier to use for the average user. This new version is fully integrated with the ROOT package, the C++ utility library used widely in the particle physics community. The internal structure of the code is simplified and the very valuable feature of the persistency of the objects of the mFOAM class is improved. With the persistency at hand, it is possible to record very easily the complete state of a MC simulator object based on mFOAM and ROOT into a disk-file at any stage of its use: just after object allocation, after full initialization (exploration of the distribution), or at any time during the generation of the long series of MC events. Later on the MC simulator object can be easily restored from the disk-file in the ``ready to go'' state. Objects of TF...
GENEVE: a Monte Carlo generator for neutrino interactions in the intermediate energy range
GENEVE is a Monte Carlo code developed during the last few years inside the ICARUS Collaboration. It describes neutrino interactions on nuclear targets in the 'intermediate energy range' and therefore is well suited for simulation of atmospheric neutrino scattering. We provide here a few indications about the models adopted for the simulation of quasi-elastic interactions and of scattering processes proceeding via nucleon resonance excitation and decay. The code has been tested against available data and overall agreement is achieved. A gradual upgrade of the code is indeed necessary, according to many indications, reviewed during this Workshop, from more recent theoretical developments and experimental hints. More generally, the definitive assessment of a canonical Monte Carlo code for neutrino physics (in the intermediate energy range) has been identified as one of the most urgent tasks for a fully comprehensive understanding of the neutrino oscillation phenomenon. We believe that the only way to proceed relies on the forthcoming results of present and future generations of experiments, performed with the best suited available technologies, aiming at precise neutrino cross section measurements
In discrete detector PET, natural pixels are image basis functions calculated from the response of detector pairs. By using reconstruction with natural pixel basis functions, the discretization of the object into a predefined grid can be avoided. Here, we propose to use generalized natural pixel reconstruction. Using this approach, the basis functions are not the detector sensitivity functions as in the natural pixel case but uniform parallel strips. The backprojection of the strip coefficients results in the reconstructed image. This paper proposes an easy and efficient way to generate the matrix M directly by Monte Carlo simulation. Elements of the generalized natural pixel system matrix are formed by calculating the intersection of a parallel strip with the detector sensitivity function. These generalized natural pixels are easier to use than conventional natural pixels because the final step from solution to a square pixel representation is done by simple backprojection. Due to rotational symmetry in the PET scanner, the matrix M is block circulant and only the first block row needs to be stored. Data were generated using a fast ray-tracing Monte Carlo simulator. The proposed method was compared to a list-mode MLEM algorithm, which used ray tracing for forward and backprojection. Comparison of the algorithms with different phantoms showed that an improved resolution can be obtained using generalized natural pixel reconstruction with accurate system modelling. In addition, it was noted that for the same resolution a lower noise level is present in this reconstruction. A numerical observer study showed the proposed method exhibited increased performance compared to a standard list-mode EM algorithm. In another study, more realistic data were generated using the GATE Monte Carlo simulator. For these data, a more uniform contrast recovery and a better contrast-to-noise performance were observed. It was observed that major improvements in contrast
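The storage saving from the (block-)circulant structure can be illustrated for a plain circulant matrix, whose matrix-vector product needs only the first row plus an FFT. This is a generic numerical sketch, not the paper's code.

```python
import numpy as np

def circulant_matvec(first_row, x):
    """y = C x for a circulant matrix C defined by its first row only.
    Uses the FFT diagonalization of circulant matrices, so storage is
    O(n) instead of O(n^2), the same symmetry the block-circulant PET
    system matrix exploits (there, block-wise)."""
    c_col = np.roll(first_row[::-1], 1)  # first column of C
    return np.real(np.fft.ifft(np.fft.fft(c_col) * np.fft.fft(x)))

n = 8
first_row = np.arange(1.0, n + 1.0)
x = np.random.default_rng(2).normal(size=n)

# Dense reference matrix, built only to verify the fast product.
C = np.empty((n, n))
for i in range(n):
    C[i] = np.roll(first_row, i)
```

For a block-circulant matrix the same trick applies block-wise: only the first block row is stored, and each block product can be carried out with FFTs along the angular index.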
Generation reliability assessment in oligopoly power market using Monte Carlo simulation
This paper addressed issues regarding power generation reliability assessment (HLI) in deregulated power pool markets. Most HLI reliability evaluation methods are based on the loss of load (LOLE) approach which is among the most suitable indices to describe the level of generation reliability. LOLE refers to the time in which load is greater than the amount of available generation. While most reliability assessments deal only with power system constraints, this study considered HLI reliability assessment in an oligopoly power market using Monte Carlo simulation (MCS). It evaluated the sensitivity of the reliability index to different reserve margins and future margins. The reliability index was determined by intersecting the offer and demand curves of power plants and comparing them to other parameters. The paper described the fundamentals of an oligopoly power pool market and proposed an algorithm for HLI reliability assessment for such a market. The proposed method was assessed on the IEEE-Reliability Test System with satisfactory results. In all cases, generation reliability indices were evaluated with different reserve margins and various load levels. 19 refs., 7 figs., 1 appendix
Generation reliability assessment in oligopoly power market using Monte Carlo simulation
Haroonabadi, H. [Islamic Azad Univ., Islamshahr (Iran, Islamic Republic of). Dept. of Electrical Engineering; Haghifam, M.R. [Tarbiat Modares Univ., Tehran (Iran, Islamic Republic of). Dept. of Electrical and Computer Engineering
2007-07-01
This paper addressed issues regarding power generation reliability assessment (HLI) in deregulated power pool markets. Most HLI reliability evaluation methods are based on the loss of load (LOLE) approach which is among the most suitable indices to describe the level of generation reliability. LOLE refers to the time in which load is greater than the amount of available generation. While most reliability assessments deal only with power system constraints, this study considered HLI reliability assessment in an oligopoly power market using Monte Carlo simulation (MCS). It evaluated the sensitivity of the reliability index to different reserve margins and future margins. The reliability index was determined by intersecting the offer and demand curves of power plants and comparing them to other parameters. The paper described the fundamentals of an oligopoly power pool market and proposed an algorithm for HLI reliability assessment for such a market. The proposed method was assessed on the IEEE-Reliability Test System with satisfactory results. In all cases, generation reliability indices were evaluated with different reserve margins and various load levels. 19 refs., 7 figs., 1 appendix.
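The core of a Monte Carlo loss-of-load estimate can be sketched as sampling unit-availability states and counting capacity shortfalls. The unit data below are invented, and the sketch omits the market layer (offer and demand curves) that the paper adds on top.

```python
import random

# Hypothetical system: (unit capacity in MW, forced outage rate).
UNITS = [(200, 0.04), (150, 0.05), (150, 0.05), (100, 0.08), (50, 0.10)]

def lole_mcs(load_mw, n_samples=100_000, rng=random.Random(7)):
    """Monte Carlo loss-of-load estimate: the fraction of sampled system
    states in which available generation falls short of the load."""
    shortfalls = 0
    for _ in range(n_samples):
        available = sum(cap for cap, q in UNITS if rng.random() >= q)
        if available < load_mw:
            shortfalls += 1
    return shortfalls / n_samples

# Sensitivity of the reliability index to the load level (and hence
# to the reserve margin; installed capacity here is 650 MW).
risk = lole_mcs(500)
```

Repeating the estimate over a range of load levels and reserve margins gives the sensitivity curves the study reports.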
On the use of SERPENT Monte Carlo code to generate few group diffusion constants
Piovezan, Pamela, E-mail: pamela.piovezan@ctmsp.mar.mil.b [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Sao Paulo, SP (Brazil); Carluccio, Thiago; Domingos, Douglas Borges; Rossi, Pedro Russo; Mura, Luiz Felipe, E-mail: fermium@cietec.org.b, E-mail: thiagoc@ipen.b [Fermium Tecnologia Nuclear, Sao Paulo, SP (Brazil); Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)
2011-07-01
The accuracy of diffusion reactor codes strongly depends on the quality of the group constants processing. For many years, the generation of such constants was based on 1-D infinite-cell transport calculations. Some developments using collision probability methods or the method of characteristics nowadays allow 2-D assembly group constant calculations. However, these 1-D and 2-D codes show some limitations, for example, on complex geometries and in the neighborhood of heavy absorbers. On the other hand, since Monte Carlo (MC) codes provide accurate neutron flux distributions, the possibility of using these solutions to provide group constants to full-core reactor diffusion simulators has recently been investigated, especially for cases in which the geometry and reactor type are beyond the capability of conventional deterministic lattice codes. The two greatest difficulties in the use of MC codes for group constant generation are the computational cost and the methodological incompatibility between analog MC particle transport simulation and deterministic transport methods based on several approximations. The SERPENT code is a 3-D continuous-energy MC transport code with built-in burnup capability that was specially optimized to generate these group constants. In this work, we present preliminary results of using the SERPENT MC code to generate 3-D two-group diffusion constants for a PWR-like assembly. These constants were used in the CITATION diffusion code to investigate the effects of the MC group constant determination on the diffusion estimate of the neutron multiplication factor. (author)
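The flux-weighted collapse that turns fine-group data into few-group constants can be sketched as follows; the cross sections and fluxes are invented illustrative numbers, not SERPENT output.

```python
import numpy as np

def collapse(sigma_fine, flux_fine, boundaries):
    """Flux-weighted collapse of fine-group cross sections into few
    groups: Sigma_G = sum_i(Sigma_i * phi_i) / sum_i(phi_i) over the
    fine groups i belonging to coarse group G.  `boundaries` lists the
    fine-group index that starts each coarse group."""
    edges = list(boundaries) + [len(sigma_fine)]
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        phi = flux_fine[lo:hi]
        out.append(np.dot(sigma_fine[lo:hi], phi) / phi.sum())
    return np.array(out)

# Illustrative 6-fine-group data collapsed to 2 groups (fast/thermal).
sigma = np.array([1.0, 1.2, 1.5, 8.0, 12.0, 20.0])   # barns, invented
flux = np.array([5.0, 4.0, 3.0, 2.0, 1.0, 0.5])      # arbitrary units
two_group = collapse(sigma, flux, boundaries=[0, 3])
```

In an MC code the weighting flux comes from tallies over the assembly, which is what makes the collapsed constants accurate where deterministic lattice approximations break down.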
Combining Stochastics and Analytics for a Fast Monte Carlo Decay Chain Generator
Kazkaz, Kareem
2011-01-01
Various Monte Carlo programs, developed either by small groups or widely available, have been used to calculate the effects of decays of radioactive chains, from the original parent nucleus to the final stable isotopes. These chains include uranium, thorium, radon, and others, and generally have long-lived parent nuclei. Generating decays within these chains requires a certain amount of computing overhead related to simulating unnecessary decays, time-ordering the final results in post-processing, or both. We present a combination analytic/stochastic algorithm for creating a time-ordered set of decays with position and time correlations, and starting with an arbitrary source age. Thus the simulation costs are greatly reduced, while at the same time avoiding chronological post-processing. We discuss optimization methods within the approach to minimize calculation time.
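One way to obtain time-ordered, correlated decays without a chronological post-processing pass is to drive the chain from a priority queue, as sketched below with an invented three-member chain. This illustrates only the time-ordering idea, not the paper's analytic treatment of source age.

```python
import heapq
import random

rng = random.Random(11)

# Illustrative chain A -> B -> C -> stable; decay constants in 1/s
# (hypothetical values, not a real radioactive series).
CHAIN = [("A", 1e-3), ("B", 5e-2), ("C", 2e-1)]

def decay_chain_events(n_atoms, t_max):
    """Generate time-ordered decay events using a min-heap keyed on
    decay time, so events emerge already in chronological order."""
    heap = []
    for atom in range(n_atoms):
        t = rng.expovariate(CHAIN[0][1])  # first decay of this atom
        if t < t_max:
            heapq.heappush(heap, (t, atom, 0))
    events = []
    while heap:
        t, atom, step = heapq.heappop(heap)
        events.append((t, atom, CHAIN[step][0]))
        if step + 1 < len(CHAIN):
            # Daughter decays after an exponential delay; position
            # correlations would be attached to `atom` here.
            t_next = t + rng.expovariate(CHAIN[step + 1][1])
            if t_next < t_max:
                heapq.heappush(heap, (t_next, atom, step + 1))
    return events

events = decay_chain_events(n_atoms=100, t_max=5000.0)
```

The heap keeps only one pending decay per atom, so the memory cost stays proportional to the number of active atoms rather than the number of events.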
Les Houches guidebook to Monte Carlo generators for hadron collider physics
Recently the collider physics community has seen significant advances in the formalisms and implementations of event generators. This review is a primer of the methods commonly used for the simulation of high energy physics events at particle colliders. We provide brief descriptions, references, and links to the specific computer codes which implement the methods. The aim is to provide an overview of the available tools, allowing the reader to ascertain which tool is best for a particular application, but also making clear the limitations of each tool
Les Houches Guidebook to Monte Carlo generators for hadron collider physics
Dobbs, M.A
2004-08-24
Recently the collider physics community has seen significant advances in the formalisms and implementations of event generators. This review is a primer of the methods commonly used for the simulation of high energy physics events at particle colliders. We provide brief descriptions, references, and links to the specific computer codes which implement the methods. The aim is to provide an overview of the available tools, allowing the reader to ascertain which tool is best for a particular application, but also making clear the limitations of each tool.
Les Houches guidebook to Monte Carlo generators for hadron collider physics
Dobbs, Matt A.; Frixione, Stefano; Laenen, Eric; Tollefson, Kirsten
2004-03-01
Recently the collider physics community has seen significant advances in the formalisms and implementations of event generators. This review is a primer of the methods commonly used for the simulation of high energy physics events at particle colliders. We provide brief descriptions, references, and links to the specific computer codes which implement the methods. The aim is to provide an overview of the available tools, allowing the reader to ascertain which tool is best for a particular application, but also making clear the limitations of each tool.
On the use of the Serpent Monte Carlo code for few-group cross section generation
Research highlights: → B1 methodology was used for generation of leakage-corrected few-group cross sections in the Serpent Monte Carlo code. → Few-group constants generated by Serpent were compared with those calculated by the Helios deterministic lattice transport code. → 3D analysis of a PWR core was performed by the nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. → An excellent agreement in the results of 3D core calculations obtained with Helios and Serpent generated cross-section libraries was observed. - Abstract: Serpent is a recently developed 3D continuous-energy Monte Carlo (MC) reactor physics burnup calculation code. Serpent is specifically designed for lattice physics applications including generation of homogenized few-group constants for full-core simulators. Currently in Serpent, the few-group constants are obtained from the infinite-lattice calculations with zero neutron current at the outer boundary. In this study, in order to account for the non-physical infinite-lattice approximation, B1 methodology, routinely used by deterministic lattice transport codes, was considered for generation of leakage-corrected few-group cross sections in the Serpent code. A preliminary assessment of the applicability of the B1 methodology for generation of few-group constants in the Serpent code was carried out according to the following steps. Initially, the two-group constants generated by Serpent were compared with those calculated by the Helios deterministic lattice transport code. Then, a 3D analysis of a Pressurized Water Reactor (PWR) core was performed by the nodal diffusion code DYN3D employing two-group cross section sets generated by Serpent and Helios. At this stage thermal-hydraulic (T-H) feedback was neglected. The DYN3D results were compared with those obtained from the 3D full core Serpent MC calculations. Finally, the full core DYN3D calculations were repeated taking into account T-H feedback and
Next-generation navigational infrastructure and the ATLAS event store
The ATLAS event store employs a persistence framework with extensive navigational capabilities. These include real-time back navigation to upstream processing stages, externalizable data object references, navigation from any data object to any other both within a single file and across files, and more. The 2013-2014 shutdown of the Large Hadron Collider provides an opportunity to enhance this infrastructure in several ways that both extend these capabilities and allow the collaboration to better exploit emerging computing platforms. Enhancements include redesign with efficient file merging in mind, content-based indices in optimized reference types, and support for forward references. The latter provide the potential to construct valid references to data before those data are written, a capability that is useful in a variety of multithreading, multiprocessing, distributed processing, and deferred processing scenarios. This paper describes the architecture and design of the next generation of ATLAS navigational infrastructure.
Next-Generation Navigational Infrastructure and the ATLAS Event Store
van Gemmeren, P; The ATLAS collaboration; Nowak, M
2014-01-01
The ATLAS event store employs a persistence framework with extensive navigational capabilities. These include real-time back navigation to upstream processing stages, externalizable data object references, navigation from any data object to any other both within a single file and across files, and more. The 2013-2014 shutdown of the Large Hadron Collider provides an opportunity to enhance this infrastructure in several ways that both extend these capabilities and allow the collaboration to better exploit emerging computing platforms. Enhancements include redesign with efficient file merging in mind, content-based indices in optimized reference types, and support for forward references. The latter provide the potential to construct valid references to data before those data are written, a capability that is useful in a variety of multithreading, multiprocessing, distributed processing, and deferred processing scenarios. This paper describes the architecture and design of the next generation of ATLAS navigation...
Next-Generation Navigational Infrastructure and the ATLAS Event Store
van Gemmeren, P; The ATLAS collaboration; Nowak, M
2013-01-01
The ATLAS event store employs a persistence framework with extensive navigational capabilities. These include real-time back navigation to upstream processing stages, externalizable data object references, navigation from any data object to any other both within a single file and across files, and more. The 2013-2014 shutdown of the Large Hadron Collider provides an opportunity to enhance this infrastructure in several ways that both extend these capabilities and allow the collaboration to better exploit emerging computing platforms. Enhancements include redesign with efficient file merging in mind, content-based indices in optimized reference types, and support for forward references. The latter provide the potential to construct valid references to data before those data are written, a capability that is useful in a variety of multithreading, multiprocessing, distributed processing, and deferred processing scenarios. This paper describes the architecture and design of the next generation of ATLAS navigation...
Automated Monte Carlo biasing for photon-generated electrons near surfaces.
Franke, Brian Claude; Crawford, Martin James; Kensek, Ronald Patrick
2009-09-01
This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
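The weight-window mechanics underlying the biasing described above can be sketched as splitting above the window and Russian roulette below it. The survival-weight choice here (the window's upper bound) is one common convention, assumed for illustration; real codes tune the window bounds per spatial/energy cell from the adjoint flux.

```python
import random

rng = random.Random(3)

def apply_weight_window(particles, w_low, w_high):
    """Weight-window check: split particles above the window, play
    Russian roulette on particles below it.  Splitting conserves weight
    exactly; roulette conserves it in expectation."""
    out = []
    for weight in particles:
        if weight > w_high:
            # Split into n copies of equal weight inside the window.
            n = int(weight / w_high) + 1
            out.extend([weight / n] * n)
        elif weight < w_low:
            # Roulette: survive with probability weight / w_high,
            # and carry the boosted survival weight w_high.
            if rng.random() < weight / w_high:
                out.append(w_high)
        else:
            out.append(weight)  # already inside the window
    return out

survivors = apply_weight_window([2.5, 0.5, 0.05], w_low=0.25, w_high=1.0)
```

Applied after each collision or boundary crossing, this keeps the particle population's weights within a narrow band, which is what makes the adjoint-derived window settings effective.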
A method for tuning parameters in Monte Carlo generators is described and applied to a specific case. The method works in the following way: each observable is generated several times using different values of the parameters to be tuned. The output is then approximated by some analytic form to describe the dependence of the observables on the parameters. This approximation is used to find the values of the parameter that give the best description of the experimental data. This results in significantly faster fitting compared to an approach in which the generator is called iteratively. As an application, we employ this method to fit the parameters of the unintegrated gluon density used in the Cascade Monte Carlo generator, using inclusive deep inelastic data measured by the H1 Collaboration. We discuss the results of the fit, its limitations, and its strong points. (orig.)
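The two-step strategy above, first parameterize each observable's dependence on the tuning parameters from a handful of generator runs, then fit the parameterization to data, can be sketched with a one-parameter toy generator. All functions and numbers below are invented for illustration.

```python
import numpy as np

# Toy stand-in for generator output: each "bin" of the observable
# depends smoothly on a single tuning parameter p, plus MC noise.
def toy_generator(p, rng):
    truth = np.array([1.0 + 0.5 * p, 2.0 - 0.3 * p, 0.2 * p * p])
    return truth + rng.normal(scale=0.01, size=3)

rng = np.random.default_rng(5)
p_samples = np.array([-1.0, 0.0, 1.0, 2.0, 3.0])
runs = np.array([toy_generator(p, rng) for p in p_samples])

# Step 1: approximate each bin by a quadratic in p (least squares).
coeffs = [np.polyfit(p_samples, runs[:, b], 2) for b in range(runs.shape[1])]

# Step 2: minimize the chi2 of the analytic approximation against the
# "data" on a fine grid (a real tune would use a proper minimizer).
data = np.array([1.60, 1.64, 0.29])     # pseudo-data near p = 1.2
errors = np.array([0.05, 0.05, 0.05])
grid = np.linspace(-1.0, 3.0, 4001)
pred = np.array([np.polyval(c, grid) for c in coeffs])  # (bins, grid)
chi2 = (((pred - data[:, None]) / errors[:, None]) ** 2).sum(axis=0)
p_best = grid[np.argmin(chi2)]
```

Because the generator is only run at the sample points, the fit itself is nearly free, which is the speed-up over calling the generator inside the minimization loop.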
Flux Transfer Events: 1. generation mechanism for strong southward IMF
J. Raeder
2006-03-01
We use a global numerical model of the interaction of the solar wind and the interplanetary magnetic field with Earth's magnetosphere to study the formation process of Flux Transfer Events (FTEs) during strong southward IMF. We find that: (i) the model produces essentially all observational features expected for FTEs, in particular the bipolar signature of the magnetic field B_{N} component; the correct polarity, duration, and intermittency of that bipolar signature; strong core fields and enhanced core pressure; and flow enhancements; (ii) FTEs only develop for large dipole tilt, whereas in the case of no dipole tilt steady magnetic reconnection occurs at the dayside magnetopause; (iii) the basic process by which FTEs are produced is the sequential generation of new X-lines, which makes dayside reconnection inherently time dependent and leads to a modified form of dual or multiple X-line reconnection; (iv) the FTE generation process in this model does not depend on specific assumptions about microscopic processes; (v) the average period of FTEs can be explained by simple geometric arguments involving magnetosheath convection; (vi) FTEs do not develop in the model if the numerical resolution is too coarse, leading to too much numerical diffusion; and (vii) FTEs for nearly southward IMF and large dipole tilt, i.e., near solstice, should only develop in the winter hemisphere, which provides a testable prediction of seasonal modulation. The semiannual modulation of intermittent FTE reconnection versus steady reconnection is also expected to modulate magnetospheric and ionospheric convection and may thus contribute to the semiannual variation of geomagnetic activity.
Event generator analysis for singly strange particle production data
We have used the event generator LUCIAE to analyze the data of singly strange particle production in pp, pA, and AA reactions at 200A GeV and compared them with the corresponding experimental data and theoretical results of HIJING, VENUS, and RQMD. The results indicate that reproducing the NA35 data of AA reactions requires a somewhat larger s quark suppression factor (s=0.3) and shorter formation time (τ=1 fm/c) of produced particles than reproducing the experimental data of pp and pA (s=0.2 and τ=1.5 fm/c). This might be understood intuitively from the relationship between the effective string tension and the violence of the collision. However, the NA36 data of the negative multiplicity dependence of Λ in S + Pb reactions at 200A GeV could not be well reproduced by LUCIAE, VENUS, or RQMD, and the NA36 data prefer model parameters of s=0.2 and τ=1.5 fm/c in particular; this seems hard to understand and needs further study. Comparing the NA35 data of the rapidity distribution of Λ in S + Ag reactions to the corresponding results of LUCIAE shows that the effect of varying the s quark suppression factor from 0.2 to 0.3 is smaller than the effect of rescattering with respect to strangeness production. copyright 1997 The American Physical Society
A new version of the event generator Sibyll
Riehn, Felix; Fedynitch, Anatoli; Gaisser, Thomas K; Stanev, Todor
2015-01-01
The event generator Sibyll can be used for the simulation of hadronic multiparticle production up to the highest cosmic ray energies. It is optimized for providing an economic description of those aspects of the expected hadronic final states that are needed for the calculation of air showers and atmospheric lepton fluxes. New measurements from fixed target and collider experiments, in particular those at LHC, allow us to test the predictive power of the model version 2.1, which was released more than 10 years ago, and also to identify shortcomings. Based on a detailed comparison of the model predictions with the new data we revisit model assumptions and approximations to obtain an improved version of the interaction model. In addition a phenomenological model for the production of charm particles is implemented as needed for the calculation of prompt lepton fluxes in the energy range of the astrophysical neutrinos recently discovered by IceCube. After giving an overview of the new ideas implemented in Sibyll...
Molecular Characterization of Transgenic Events Using Next Generation Sequencing Approach
Mammadov, Jafar; Ye, Liang; Soe, Khaing; Richey, Kimberly; Cruse, James; Zhuang, Meibao; Gao, Zhifang; Evans, Clive; Rounsley, Steve; Kumpatla, Siva P.
2016-01-01
Demand for the commercial use of genetically modified (GM) crops has been increasing in light of the projected growth of world population to nine billion by 2050. A prerequisite of paramount importance for regulatory submissions is the rigorous safety assessment of GM crops. One of the components of safety assessment is molecular characterization at DNA level which helps to determine the copy number, integrity and stability of a transgene; characterize the integration site within a host genome; and confirm the absence of vector DNA. Historically, molecular characterization has been carried out using Southern blot analysis coupled with Sanger sequencing. While this is a robust approach to characterize the transgenic crops, it is both time- and resource-consuming. The emergence of next-generation sequencing (NGS) technologies has provided highly sensitive and cost- and labor-effective alternative for molecular characterization compared to traditional Southern blot analysis. Herein, we have demonstrated the successful application of both whole genome sequencing and target capture sequencing approaches for the characterization of single and stacked transgenic events and compared the results and inferences with traditional method with respect to key criteria required for regulatory submissions. PMID:26908260
The neutron generation time Λ plays an important role in reactor kinetics. However, its calculation is neither straightforward nor standard in most continuous-energy Monte Carlo codes, which are able to calculate the prompt neutron lifetime lp directly. The difference between Λ and lp is sometimes very apparent. Since very few delayed neutrons are produced in the reactor, they have little influence on Λ. Thus, on the assumption that no delayed neutrons are produced in the system, prompt kinetics equations for a critical system and for a subcritical system with an external source are proposed. The equations are then applied to calculating Λ with the pulsed neutron technique using Monte Carlo. Only one fission neutron source is simulated with Monte Carlo in the critical system, while two neutron sources, a fission source and an external source, are simulated for the subcritical system. Calculations are performed on both critical benchmarks and a subcritical system with an external source, and the results are consistent with the reference values. (author)
CODAC, Multigroup Cross-Sections Generation from ENDF/B for Monte-Carlo Program TIMOC
1 - Nature of physical problem solved: CODAC-2 is a nuclear data processing program. It converts ENDF/B version 2 data into group-averaged cross sections in the form needed by Monte Carlo codes. CODAC generates the mean values of sigma-c, sigma-el, sigma-inel, sigma-f and nu for any group structure by using specified weighting spectra. In the case of anisotropic elastic scattering, either the average cosine or the angular distribution of the cross section is calculated for each energy group. The inelastic scattering is described by a transfer matrix which can also include (n,2n) reactions. 2 - Method of solution: Averaging is done using a weighting spectrum given as input. Group-averaged cross sections are calculated by summing the smooth contributions and the contributions of the resolved and unresolved resonances, using the method of the ETOG-ETOM program and of MC2 at zero temperature. The anisotropic elastic secondary angular distribution is calculated optionally as the average mu in the laboratory system, as a Legendre expansion, or point by point along the mu-axis. The inelastic secondary energy distribution is computed as a transfer matrix. 3 - Restrictions on the complexity of the problem: The program handles any number of ENDF/B materials in one run. The number of energy groups is limited to 50. The output formats of CODAC-2 correspond to the input formats of TIMOC. The UNIVAC 1106 version was received from Junta de Energia Nuclear, Madrid, Spain
Monte Carlo efficiency calibration of a neutron generator-based total-body irradiator
Many body composition measurement systems are calibrated against a single-sized reference phantom. Prompt-gamma neutron activation (PGNA) provides the only direct measure of total body nitrogen (TBN), an index of the body's lean tissue mass. In PGNA systems, body size influences neutron flux attenuation, induced gamma signal distribution, and counting efficiency. Thus, calibration based on a single-sized phantom could result in inaccurate TBN values. We used Monte Carlo simulations (MCNP-5; Los Alamos National Laboratory) to map a system's response to the range of body weights (65-160 kg) and body fat distributions (25-60%) in obese humans. Calibration curves were constructed to derive body-size correction factors relative to a standard reference phantom, providing customized adjustments to account for differences in body habitus of obese adults. The use of MCNP-generated calibration curves should allow for a better estimate of the true changes in lean tissue mass that may occur during intervention programs focused only on weight loss. (author)
Corcella, Gennaro; Marchesini, G; Moretti, S; Odagiri, K; Richardson, Peter; Seymour, Michael H; Webber, Bryan R
2001-01-01
HERWIG is a general-purpose Monte Carlo event generator, which includes the simulation of hard lepton-lepton, lepton-hadron and hadron-hadron scattering and soft hadron-hadron collisions in one package. It uses the parton-shower approach for initial- and final-state QCD radiation, including colour coherence effects and azimuthal correlations both within and between jets. This article updates the description of HERWIG published in 1992, emphasising the new features incorporated since then. These include, in particular, the matching of first-order matrix elements with parton showers, a more correct treatment of heavy quark decays, and a wide range of new processes, including many predicted by the Minimal Supersymmetric Standard Model, with the option of R-parity violation. At the same time we offer a brief review of the physics underlying HERWIG together with details of the input and control parameters and the output data, to provide a self-contained guide for prospective users of the program.
Generation of scintigraphic images in a virtual dosimetry trial based on Monte Carlo modelling
Full text of publication follows. Aim: the purpose of dosimetry calculations in therapeutic nuclear medicine is to maximize the tumour absorbed dose while minimizing normal tissue toxicities. However, a wide heterogeneity of dosimetric approaches is observed: there is no standardized dosimetric protocol to date. The DosiTest project (www.dositest.com) intends to identify critical steps in the dosimetry chain by implementing clinical dosimetry in different Nuclear Medicine departments, on scintigraphic images generated by Monte Carlo simulation from the same virtual patient. This study presents the different steps contributing to image generation, following the imaging protocol of a given participating centre, Milan's European Institute of Oncology (IEO). Material and methods: the chosen clinical application is that of 111In-pentetreotide (OctreoscanTM). Pharmacokinetic data from the literature are used to derive a compartmental model. The kinetic rates between 6 compartments (liver, spleen, kidneys, blood, urine, remainder of body) were obtained from WinSaam [3]: the activity in each compartment is known at any time point. The TestDose [1] software (the computing architecture of DosiTest) implements the NURBS-based phantom NCAT-WB [2] to generate anatomical data for the virtual patient. The IEO gamma-camera was modelled with GATE [4] v6.2. Scintigraphic images were simulated for each compartment, and the resulting projections were weighted by the respective pharmacokinetics of each compartment. The final step consisted of aggregating the compartments to generate the resulting image. Results: following IEO's imaging protocol, planar and tomographic image simulations were generated at various time points. Computation times (on a 480-virtual-core computing cluster) for 'step and shoot' whole body simulations (5 steps/time point) with acceptable statistics were: 10 days for extra-vascular fluid, 28 h for blood, 12 h for liver, 7 h for kidneys, and 1-2 h for
Analysis of the Steam Generator Tubes Rupture Initiating Event
In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the time allowed for operator actions, and does not play a central role in ET validation. The simulation of the sequence evolution can instead be performed using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods construct the ET automatically by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. The sequences with and without each failure are then followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. As in the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)
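The branch-and-follow procedure described above (open a success branch and a failure branch at every system demand, and let the simulator follow each until the next demand) can be sketched in a few lines. This is an illustrative skeleton, not DYLAM itself; `simulate`, the system names, and the toy two-system plant are hypothetical stand-ins:

```python
def dynamic_event_tree(simulate, state, path=()):
    """Expand a Dynamic Event Tree. `simulate(state, path)` advances the
    plant until the next system demand and returns (demand, new_state);
    demand is None once the sequence reaches an end state. Every demand
    opens a success branch and a failure branch, and both are followed."""
    demand, state = simulate(state, path)
    if demand is None:
        return [path + (("END", state),)]          # completed sequence
    sequences = []
    for outcome in ("success", "failure"):
        sequences += dynamic_event_tree(simulate, state,
                                        path + ((demand, outcome),))
    return sequences

def toy_simulate(state, path):
    """Hypothetical plant: two safety systems are demanded in turn;
    the sequence ends well only if both respond."""
    systems = ("HPI", "AFW")                       # hypothetical system names
    if len(path) < len(systems):
        return systems[len(path)], state
    ok = all(outcome == "success" for _, outcome in path)
    return None, "long-term cooling" if ok else "degraded"

seqs = dynamic_event_tree(toy_simulate, state=None)
# two binary branchings -> 4 sequences
```

With two demanded systems the tree holds four sequences, exactly one of which (success on both demands) reaches long-term cooling.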
Top quark event modelling and generators in CMS
Bilin, Bugra
2016-01-01
State-of-the-art theoretical predictions, accurate to next-to-leading order in QCD and interfaced with PYTHIA and HERWIG, are tested by comparison with the unfolded $t\bar{t}$ differential data collected with the CMS detector at 8 TeV and 13 TeV. These predictions are also compared with measurements of the underlying event activity distributions accompanying $t\bar{t}$ events. Furthermore, predictions of beyond-NLO accuracy in QCD are compared with the data.
The use of an inbuilt importance generator for acceleration of the Monte Carlo code MCBEND
Monte Carlo is currently the most accurate method for the analysis of neutron and gamma-ray transport. However, its application, especially to deep penetration studies, is costly in terms of the man-days needed to set up the calculation and in terms of computer usage. The MAGIC module, developed at the Winfrith Technology Centre, addresses both of these problems. It employs an automated procedure based upon the established technique of splitting/Russian roulette, with an importance function derived from the solution of the adjoint diffusion equation. Examples are given of the application of the module with the Monte Carlo code MCBEND
Handling of the Generation of Primary Events in Gauss, the LHCb Simulation Framework
Corti, G; Brambach, T; Brook, N H; Gauvin, N; Harrison, K; Harrison, P; He, J; Ilten, P J; Jones, C R; Lieng, M H; Manca, G; Miglioranzi, S; Robbe, P; Vagnoni, V; Whitehead, M; Wishahi, J
2010-01-01
The LHCb simulation application, Gauss, consists of two independent phases: the generation of the primary event and the tracking of the particles produced in the experimental setup. For the LHCb experimental programme it is particularly important to model B meson decays: the EvtGen code developed in CLEO and BaBar has been chosen and customized for the non-coherent B production occurring in pp collisions at the LHC. The initial proton-proton collision is provided by a different generator engine, currently Pythia 6, for massive production of signal and generic pp collision events. Beam-gas events, background events originating from the proton halo, cosmics and calibration events for different detectors can be generated in addition to pp collisions. Different generator packages, available in the physics community or specifically developed in LHCb, are used for these different purposes. Running conditions affecting the generated events, such as the size of the luminous region and the number of collisions occurring in a bunc...
Monte Carlo primer for health physicists
The basic ideas and principles of Monte Carlo calculations are presented in the form of a primer for health physicists. A simple integral with a known answer is evaluated by two different Monte Carlo approaches. Random numbers, which underlie Monte Carlo work, are discussed, and a sample table of random numbers generated by a hand calculator is presented. Monte Carlo calculations of dose and linear energy transfer (LET) from 100-keV neutrons incident on a tissue slab are discussed. The random-number table is used in a hand calculation of the initial sequence of events for a 100-keV neutron entering the slab. Some pitfalls in Monte Carlo work are described. While this primer addresses mainly the bare bones of Monte Carlo, a final section briefly describes some of the more sophisticated techniques used in practice to reduce variance and computing time
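The two textbook ways of evaluating a simple integral by Monte Carlo can be sketched directly; this is a generic illustration (the integrand, bounds, and sample sizes are arbitrary choices, not the primer's own worked example):

```python
import random

def mc_sample_mean(f, a, b, n, rng=random.Random(0)):
    """Sample-mean (crude) Monte Carlo: (b - a) times the average of f
    evaluated at n uniformly random points in [a, b]."""
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

def mc_hit_or_miss(f, a, b, fmax, n, rng=random.Random(1)):
    """Hit-or-miss Monte Carlo: the fraction of random points in the box
    [a, b] x [0, fmax] falling under the curve, times the box area."""
    hits = sum(1 for _ in range(n)
               if rng.uniform(0.0, fmax) <= f(rng.uniform(a, b)))
    return (b - a) * fmax * hits / n

# Example integral with a known answer: integral of x^2 on [0, 1] = 1/3.
f = lambda x: x * x
est1 = mc_sample_mean(f, 0.0, 1.0, 100_000)
est2 = mc_hit_or_miss(f, 0.0, 1.0, 1.0, 100_000)
```

Both estimators converge to 1/3 as n grows; the sample-mean form typically has the smaller variance of the two, which is one of the primer's central points about variance reduction.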
Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng
2015-05-01
Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead) were generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and a proper use of the data set were demonstrated through a practical case study, in which shielding analysis on a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates for the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to a quick shielding design and dose evaluation for proton therapy accelerators. PMID:25811254
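The point-source line-of-sight approximation that such a data set feeds has the generic form dose = (source term) x exp(-slant thickness / attenuation length) / r^2. A minimal sketch follows, with purely illustrative numbers rather than the paper's fitted source terms or attenuation lengths:

```python
import math

def transmitted_dose(h0, attenuation_length, slant_thickness, distance):
    """Point-source line-of-sight shielding estimate (generic form):
    dose = source term * exp(-d / lambda) / r^2, where d is the slant
    thickness of shielding along the line of sight and r is the
    source-to-scoring-point distance."""
    return h0 * math.exp(-slant_thickness / attenuation_length) / distance**2

# Illustrative numbers only: h0 is a source term at the angle of interest,
# with 2.0 m of concrete (0.5 m attenuation length) and a point 5.0 m away.
dose = transmitted_dose(h0=1.0e-12, attenuation_length=0.5,
                        slant_thickness=2.0, distance=5.0)
```

The attraction of the model, as the abstract notes, is exactly this simplicity: once source terms and attenuation lengths are tabulated per target and shield material, a dose estimate is a single line of arithmetic.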
Srinivasan, P.; Priya, S.; Patel, Tarun; Gopalakrishnan, R. K.; Sharma, D. N.
2015-01-01
DD/DT fusion neutron generators are used as sources of 2.5 MeV/14.1 MeV neutrons in experimental laboratories for various applications. Detailed knowledge of the radiation dose rates around the neutron generators is essential for ensuring the radiological protection of the personnel involved in their operation. This work describes the experimental and Monte Carlo studies carried out in the Purnima Neutron Generator facility of the Bhabha Atomic Research Centre (BARC), Mumbai. Verification and validation of the shielding adequacy was carried out by measuring the neutron and gamma dose rates at various locations inside and outside the neutron generator hall under different operational conditions, for both 2.5-MeV and 14.1-MeV neutrons, and comparing them with theoretical simulations. The calculated and experimental dose rates were found to agree within a maximum deviation of 20% at certain locations. This study has served to benchmark the Monte Carlo simulation methods adopted for the shield design of such facilities. It has also helped in augmenting the existing shield thickness to reduce the neutron and associated gamma dose rates for the radiological protection of personnel during operation of the generators at higher source neutron yields, up to 1 × 10^10 n/s.
A Monte Carlo simulation code, SEALER, was developed for neutron-induced single event upset of semiconductor devices at ground level, in which composite-material effects are fully simulated. Any size and structure of the eight composite materials Si, SiO2, Si3N4, Ta2O5, WSi2, Cu, Al and TiN can be included in the analysis of nuclear spallation reactions and charge collection to storage nodes. Some preliminary implications of composite-material effects are demonstrated, including an apparent contribution of elastic scattering to single event upset in the lower energy region, down to 2 MeV or even lower. (author)
Jin, Shengye; Tamura, Masayuki
2013-10-01
The Monte Carlo Ray Tracing (MCRT) method is a versatile tool for simulating the radiative transfer regime of the Solar-Atmosphere-Landscape system. Moreover, it can be used to compute the radiation distribution over a complex landscape configuration, for example a forest area. Because of its robustness to alterations of a complex 3-D scene, the MCRT method is also employed to simulate the canopy radiative transfer regime as a validation source for other radiative transfer models. In MCRT modelling within vegetation, one basic step is setting up the canopy scene. 3-D scanning has been used to represent canopy structure as accurately as possible, but it is time consuming. A botanical growth function can model the growth of a single tree but cannot express the interaction among trees. The L-System is also a function-controlled tree growth simulation model, but it requires a large amount of computing memory. Additionally, it only models the current tree pattern rather than tree growth while the radiative transfer regime is being simulated. It is therefore much more practical to use regular solids such as ellipsoids, cones and cylinders to represent single canopies. Considering the allelopathy phenomenon seen in some open-forest optical images, each tree repels other trees within its own `domain'. Based on this assumption, a stochastic circle packing algorithm is developed in this study to generate the 3-D canopy scene. The canopy coverage (%) and the tree count (N) of the 3-D scene are declared first, matching a random open-forest image. Accordingly, we randomly generate each canopy radius (rc). We then set the circle centre coordinates on the XY-plane while keeping the circles separated from each other with the circle packing algorithm. To model an individual tree, we employ Ishikawa's regressive tree growth model to set the tree parameters, including DBH (dt) and tree height (H). However, the relationship between canopy height (Hc) and trunk height (Ht) is
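The circle packing step described in this abstract can be sketched with simple rejection sampling. This is a simplified reconstruction under stated assumptions (uniform radii, hard non-overlap between circles, a bounded number of retries), not the authors' code:

```python
import math
import random

def pack_circles(n_trees, scene_size, r_min, r_max,
                 max_tries=10_000, seed=42):
    """Stochastic circle packing by rejection sampling: place n_trees
    non-overlapping canopy circles, each with a random radius in
    [r_min, r_max], on a scene_size x scene_size plot. A candidate circle
    is kept only if it stays inside the scene and does not intersect any
    previously placed circle (the 'each tree repels others' assumption)."""
    rng = random.Random(seed)
    circles = []  # list of (x, y, r)
    tries = 0
    while len(circles) < n_trees and tries < max_tries:
        tries += 1
        r = rng.uniform(r_min, r_max)
        x = rng.uniform(r, scene_size - r)
        y = rng.uniform(r, scene_size - r)
        if all(math.hypot(x - cx, y - cy) >= r + cr
               for cx, cy, cr in circles):
            circles.append((x, y, r))
    return circles

# 20 canopies of radius 2-5 m on a 100 m x 100 m plot (low coverage,
# so rejection sampling succeeds easily).
canopies = pack_circles(n_trees=20, scene_size=100.0, r_min=2.0, r_max=5.0)
```

For low canopy coverage this naive scheme is adequate; at high coverage the rejection rate climbs steeply and a more structured packing strategy would be needed.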
Event-by-event generation of vorticity in heavy-ion collisions
Deng, Wei-Tian
2016-01-01
In a noncentral heavy-ion collision, the two colliding nuclei have finite angular momentum in the direction perpendicular to the reaction plane. After the collision, a fraction of the total angular momentum is retained in the produced hot quark-gluon matter and is manifested in the form of fluid shear. Such fluid shear creates finite flow vorticity. We study some features of such generated vorticity, including its strength, beam energy dependence, centrality dependence, and spatial distribution.
A Monte Carlo Generator for Full Simulation of e+e- to hadrons Cross section Scan Experiment
Zhang, D; Chen, J C
2006-01-01
A generator with $\alpha^2$-order radiative correction effects, including both initial state radiation corrections and photon vacuum polarization corrections, is built for the full simulation of $e^+e^- \to hadrons$ in cross section scan experiments in the quarkonium energy region. The contributions from the hadron production structures, including the resonances of the $1^{--}$ quarkonium families, the light hadron spectrum below 2 GeV, and the QED continuum hadron spectrum, are all taken into account in the event generation. It was employed successfully to determine the detection efficiency for the selection of $e^+e^- \to hadrons$ events from the data taken in the energy region from 3.650 GeV to 3.872 GeV, covering both the $\psi(3686)$ and $\psi(3770)$ resonances, in the BES experiment. The generator reproduces the properties of hadronic event production and inclusive decays well.
Physics cross sections and event generation of e+e- annihilations at the CEPC
Mo, Xin; Li, Gang; Ruan, Man-Qi; Lou, Xin-Chou
2016-03-01
The cross sections of the Higgs production and the corresponding backgrounds of e+e- annihilations at the CEPC (Circular Electron and Positron Collider) are calculated by a Monte-Carlo method, and the beamstrahlung effect at the CEPC is carefully investigated. The numerical results and the expected number of events for the CEPC are provided. Supported by CAS/SAFEA International Partnership Program for Creative Research Teams, and funding from CAS and IHEP for the Thousand Talent and Hundred Talent programs, as well as grants from the State Key Laboratory of Nuclear Electronics and Particle Detectors
Moss, Gregory D.; Pasko, Victor P.; Liu, Ningyu; Veronis, Georgios
2006-02-01
Streamers are thin filamentary plasmas that can initiate spark discharges in relatively short (several centimeters) gaps at near ground pressures and are also known to act as the building blocks of streamer zones of lightning leaders. These streamers at ground pressure, after 1/N scaling with atmospheric air density N, appear to be fully analogous to those documented using telescopic imagers in transient luminous events (TLEs) termed sprites, which occur in the altitude range 40-90 km in the Earth's atmosphere above thunderstorms. It is also believed that the filamentary plasma structures observed in some other types of TLEs, which emanate from the tops of thunderclouds and are termed blue jets and gigantic jets, are directly linked to the processes in streamer zones of lightning leaders. Acceleration, expansion, and branching of streamers are commonly observed for a wide range of applied electric fields. Recent analysis of photoionization effects on the propagation of streamers indicates that very high electric field magnitudes ~10 Ek, where Ek is the conventional breakdown threshold field defined by the equality of the ionization and dissociative attachment coefficients in air, are generated around the tips of streamers at the stage immediately preceding their branching. This paper describes the formulation of a Monte Carlo model, which is capable of describing electron dynamics in air, including the thermal runaway phenomena, under the influence of an external electric field of an arbitrary strength. Monte Carlo modeling results indicate that the ~10 Ek fields are able to accelerate a fraction of low-energy (several eV) streamer tip electrons to energies of ~2-8 keV. With total potential differences on the order of tens of MV available in streamer zones of lightning leaders, it is proposed that during a highly transient negative corona flash stage of the development of negative stepped leader, electrons with energies 2-8 keV ejected from streamer tips near
This paper presents a novel decision-support tool for assessing future generation portfolios in an increasingly uncertain electricity industry. The tool combines optimal generation mix concepts with Monte Carlo simulation and portfolio analysis techniques to determine expected overall industry costs, associated cost uncertainty, and expected CO2 emissions for different generation portfolio mixes. The tool can incorporate complex and correlated probability distributions for estimated future fossil-fuel costs, carbon prices, plant investment costs, and demand, including price elasticity impacts. The intent of this tool is to facilitate risk-weighted generation investment and associated policy decision-making given uncertainties facing the electricity industry. Applications of this tool are demonstrated through a case study of an electricity industry with coal, CCGT, and OCGT facing future uncertainties. Results highlight some significant generation investment challenges, including the impacts of uncertain and correlated carbon and fossil-fuel prices, the role of future demand changes in response to electricity prices, and the impact of construction cost uncertainties on capital intensive generation. The tool can incorporate virtually any type of input probability distribution, and support sophisticated risk assessments of different portfolios, including downside economic risks. It can also assess portfolios against multi-criterion objectives such as greenhouse emissions as well as overall industry costs. - Highlights: ► Present a decision support tool to assist generation investment and policy making under uncertainty. ► Generation portfolios are assessed based on their expected costs, risks, and CO2 emissions. ► There is tradeoff among expected cost, risks, and CO2 emissions of generation portfolios. ► Investment challenges include economic impact of uncertainties and the effect of price elasticity. ► CO2 emissions reduction depends on the mix of
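The core loop of such a tool, drawing cost drivers from probability distributions and scoring a portfolio by its expected cost and cost spread, can be sketched as follows. Every number below is an illustrative placeholder rather than data from the study, and the simple truncated-Gaussian price models (with the carbon price acting as a shared, hence correlating, driver across technologies) are assumptions for the sketch:

```python
import random
import statistics

def simulate_portfolio(shares, n_draws=20_000, seed=7):
    """Monte Carlo sketch of expected cost and cost risk for a generation
    mix. `shares` maps technology -> fraction of energy served. Per-MWh
    cost = fixed cost + heat rate * fuel price + emissions intensity *
    carbon price; the single carbon-price draw per scenario correlates
    the technologies' costs. All parameters are illustrative placeholders."""
    rng = random.Random(seed)
    # technology -> (fixed $/MWh, fuel use GJ/MWh,
    #                fuel price mean/sd $/GJ, tCO2/MWh)
    tech = {
        "coal": (15.0, 10.0, 3.0, 0.8, 0.95),
        "ccgt": (10.0, 7.0, 8.0, 2.0, 0.40),
        "ocgt": (8.0, 11.0, 8.0, 2.0, 0.65),
    }
    costs = []
    for _ in range(n_draws):
        carbon = max(0.0, rng.gauss(25.0, 10.0))  # $/tCO2, shared driver
        total = 0.0
        for name, frac in shares.items():
            fixed, heat, fmean, fsd, co2 = tech[name]
            fuel = max(0.0, rng.gauss(fmean, fsd))  # truncated at zero
            total += frac * (fixed + heat * fuel + co2 * carbon)
        costs.append(total)
    return statistics.mean(costs), statistics.stdev(costs)

mean_cost, cost_risk = simulate_portfolio({"coal": 0.5, "ccgt": 0.4,
                                           "ocgt": 0.1})
```

Comparing (mean_cost, cost_risk) pairs across candidate mixes exposes the cost/risk trade-off the abstract describes; adding a per-scenario emissions tally alongside the cost would extend the sketch to the multi-criterion CO2 assessment.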
Mercurio, D; Podofillini, L; Zio, E; Dang, V N
2009-11-01
This paper illustrates a method to identify and classify scenarios generated in a dynamic event tree (DET) analysis. Identification and classification are carried out by means of an evolutionary possibilistic fuzzy C-means clustering algorithm which takes into account not only the final system states but also the timing of the events and the process evolution. An application is considered with regards to the scenarios generated following a steam generator tube rupture in a nuclear power plant. The scenarios are generated by the accident dynamic simulator (ADS), coupled to a RELAP code that simulates the thermo-hydraulic behavior of the plant and to an operators' crew model, which simulates their cognitive and procedures-guided responses. A set of 60 scenarios has been generated by the ADS DET tool. The classification approach has grouped the 60 scenarios into 4 classes of dominant scenarios, one of which was not anticipated a priori but was "discovered" by the classifier. The proposed approach may be considered as a first effort towards the application of identification and classification approaches to scenarios post-processing for real-scale dynamic safety assessments. PMID:19819366
Detecting complex events in user-generated video using concept classifiers
Guo, Jinlin; Scott, David; Hopfgartner, Frank; Gurrin, Cathal
2012-01-01
Automatic detection of complex events in user-generated videos (UGV) is a challenging task due to its new characteristics differing from broadcast video. In this work, we firstly summarize the new characteristics of UGV, and then explore how to utilize concept classifiers to recognize complex events in UGV content. The method starts from manually selecting a variety of relevant concepts, followed byconstructing classifiers for these concepts. Finally, complex event detectors are learned by...
Effects of self-generated versus experimenter-provided cues on the representation of future events.
Neroni, Maria Adriana; Gamboz, Nadia; de Vito, Stefania; Brandimonte, Maria Antonella
2016-01-01
Most experimental studies of prospection focused on episodic forms of future events prompted by means of verbal cues. However, there is evidence suggesting that future events differ considerably according to whether they are produced in response to external, experimenter-provided verbal cues or they are self-generated. In the present study, we compared the quality, the phenomenal characteristics, the temporal distribution, and the content of imagined events prompted by experimenter-provided cues (i.e., cue-words and short verbal sentences) or elicited by means of verbal cues that were self-generated in an autobiographical fluency task. The results showed that future events prompted by means of self-generated cues contained fewer event-specific details than future events prompted by experimenter-provided cues. However, future events elicited by means of self-generated and by experimenter-provided cues did not differ with respect to their phenomenal characteristics. The temporal distribution and the thematic content of future representations were also affected by the type of cue used to elicit prospection. These results offer a holistic view of the properties of future thinking and suggest that the content and the characteristics of envisioned future events may be affected by the method used to elicit prospection. PMID:26444043
Event-by-event cluster analysis of final states from heavy ion collisions
Fialkowski, K.; Wit, R.
1999-01-01
We present an event-by-event analysis of the cluster structure of final multihadron states resulting from heavy ion collisions. A comparison of experimental data with the states obtained from Monte Carlo generators is shown. The analysis of the first available experimental events suggests that the method is suitable for selecting some different types of events.
Generation of organic scintillators response function for fast neutrons using the Monte Carlo method
A computer program (DALP), written in Fortran-4-G, has been developed using the Monte Carlo method to simulate the experimental techniques leading to the distribution of pulse heights produced by monoenergetic neutrons reaching an organic scintillator. The pulse height distribution has been calculated for two different systems: 1) monoenergetic neutrons from a point source reaching the flat face of a cylindrical organic scintillator; 2) environmental monoenergetic neutrons randomly reaching either the flat or the curved face of the cylindrical organic scintillator. The program was developed for the NE-213 liquid organic scintillator but can easily be adapted to any other organic scintillator. With this program one can determine the pulse height distribution for neutron energies ranging from 15 keV to 10 MeV. (Author)
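A toy version of such a response-function calculation can be sketched under strong simplifying assumptions: a single n-p elastic scatter, whose recoil-proton energy is roughly uniform between zero and the neutron energy, followed by Gaussian resolution smearing. Carbon interactions, multiple scattering, light-output nonlinearity and detector geometry (all of which DALP treats) are deliberately ignored here:

```python
import random

def response_function(e_n, n_events=50_000, resolution=0.10, seed=3):
    """Toy pulse-height response of a hydrogenous scintillator to
    monoenergetic neutrons of energy e_n (MeV). Each history samples one
    n-p elastic scatter (recoil proton energy roughly uniform on [0, e_n])
    and smears it with a Gaussian of relative width `resolution`."""
    rng = random.Random(seed)
    pulses = []
    for _ in range(n_events):
        e_p = rng.uniform(0.0, e_n)        # recoil proton energy
        sigma = resolution * e_p           # crude energy-resolution model
        pulses.append(max(0.0, rng.gauss(e_p, sigma)))
    return pulses

pulses = response_function(e_n=2.5)        # e.g. 2.5 MeV neutrons
```

Histogramming `pulses` yields the characteristic smeared rectangular spectrum with its edge near the incident neutron energy, the basic shape a full code like DALP refines with realistic physics.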
Anna C Phillips; Carroll, Douglas; Van, Geoffrey
2015-01-01
Background and Objectives: Stressful life events are known to contribute to development of depression, however, it is possible this link is bi-directional. The present study examined whether such stress generation effects are greater than the effects of stressful life events on depression, and whether stress generation is also evident with anxiety. Design: Participants were two large age cohorts (N = 732 aged 44 years; N = 705 aged 63 years) from the West of Scotland Twenty-07 study. Methods:...
Single-event upset (SEU) is triggered when the amount of electric charge induced by an energetic ion incidence exceeds a value known as the critical charge within a very short time period. Therefore, accurate evaluation of the electric charge and understanding of the basic mechanism of SEU are necessary for improving the SEU tolerance of electronic devices. In this paper, the collected charges for single-event transient currents induced in semiconductors by heavy-ion microbeams, and the application of microbeams to single-event studies, are presented. (author)
In conventional PET systems, the parallax error degrades image resolution and causes image distortion. To remedy this, the PET ring diameter has to be much larger than the required size of the field of view (FOV), and therefore the cost goes up. Measurement of depth-of-interaction (DOI) information is effective for reducing the parallax error and improving image quality. This study is aimed at developing a practical method to incorporate DOI information in PET sinogram generation and image reconstruction processes and to evaluate its efficacy through Monte Carlo simulation. An animal PET system with 30-mm-long LSO crystals and 2-mm DOI measurement accuracy was simulated and list-mode PET data were collected. A sinogram generation method was proposed to bin each coincidence event to the correct LOR location according to both the incident crystal indices and the DOI positions of the two annihilation photons. The sinograms were reconstructed with an iterative OSMAPEM (ordered subset maximum a posteriori expectation maximization) algorithm. Two phantoms (a rod source phantom and a Derenzo phantom) were simulated, and the benefits of DOI were investigated in terms of reconstructed source diameter (FWHM) and source positioning accuracy. The results demonstrate that the proposed method works well to incorporate DOI information in data processing, which not only overcomes the image distortion problem but also significantly improves image resolution and resolution uniformity and results in satisfactory image quality. (authors)
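The binning step described above — computing the LOR from DOI-corrected interaction points and mapping it to sinogram indices — can be sketched as follows. The geometry, bin counts, and function names are illustrative assumptions, not the paper's implementation.

```python
import math

def interaction_point(crystal_angle, ring_radius, doi_depth):
    """DOI-corrected interaction point (x, y): the measured depth shifts the
    point along the crystal axis instead of assuming the crystal front face."""
    r = ring_radius + doi_depth
    return (r * math.cos(crystal_angle), r * math.sin(crystal_angle))

def bin_event(p1, p2, n_r=64, n_phi=64, fov=40.0):
    """Bin one coincidence into sinogram indices (r_bin, phi_bin) from the
    two photon interaction points p1, p2 given as (x, y)."""
    x1, y1 = p1
    x2, y2 = p2
    phi = math.atan2(y2 - y1, x2 - x1) % math.pi      # LOR direction in [0, pi)
    s = -x1 * math.sin(phi) + y1 * math.cos(phi)      # signed distance of LOR from origin
    r_bin = min(max(int((s / fov + 0.5) * n_r), 0), n_r - 1)
    phi_bin = min(int(phi / math.pi * n_phi), n_phi - 1)
    return r_bin, phi_bin
```

Without DOI, `interaction_point` would be called with `doi_depth=0`, which for oblique LORs assigns the event to a wrong radial bin — precisely the parallax error the DOI measurement removes.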
Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, 31062 Toulouse (France); McKay, Erin [St George Hospital, Gray Street, Kogarah, New South Wales 2217 (Australia); Ferrer, Ludovic [ICO René Gauducheau, Boulevard Jacques Monod, St Herblain 44805 (France); Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila [European Institute of Oncology, Via Ripamonti 435, Milano 20141 (Italy); Bardiès, Manuel [UMR 1037 INSERM/UPS, CRCT, 133 Route de Narbonne, Toulouse 31062 (France)
2015-12-15
Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software handles the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. The resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and the relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times post-injection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry ...
The OPERA event generator and the data tuning of nuclear re-interactions
The OPERA event generator 'NEGN' is based on an adapted version of the event generator developed for the NOMAD experiment [M. Veltri, presentation at the NUINT01 conference]. It includes all the features implemented in the NOMAD generator which were cross-checked with a large sample of neutrino interactions, finely reconstructed at the level of single particles. This sample allowed for a study of the hadronic system fragmentation and to tune the inclusion of nuclear effects, like the intra-nuclear cascades. Many of these effects are also relevant, for different reasons, in the OPERA simulation
Donadel, Clainer Bravin; Fardin, Jussara Farias; Encarnação, Lucas Frizera
2015-10-01
In the literature, several papers propose new methodologies to determine the optimal placement/sizing of medium-size Distributed Generation units (DGs), using heuristic algorithms like the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). However, in all these methodologies the optimal placement solution is strongly dependent on the network topology, so a specific solution is valid only for a particular topology. Furthermore, such methodologies do not consider the presence of small DGs, whose connection point cannot be defined by Distribution Network Operators (DNOs). This paper proposes a new methodology to determine the optimal location of medium-size DGs in a distribution system with uncertain topologies, considering the particular behavior of small DGs, using Monte Carlo simulation.
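The core idea — scoring each candidate placement by its Monte Carlo average loss over randomly sampled topologies, rather than optimizing for one fixed topology — can be sketched generically. All names and the loss interface are illustrative assumptions, not the paper's algorithm.

```python
import random

def best_placement(candidates, topology_sampler, loss, n_mc=200, seed=0):
    """Pick the DG placement with the lowest Monte Carlo expected loss over
    randomly sampled network topologies. `topology_sampler(rng)` draws one
    topology; `loss(candidate, topology)` scores a placement on it."""
    rng = random.Random(seed)

    def expected_loss(candidate):
        return sum(loss(candidate, topology_sampler(rng))
                   for _ in range(n_mc)) / n_mc

    return min(candidates, key=expected_loss)
```

In a real study, `topology_sampler` would draw switch configurations of the distribution feeder and `loss` would come from a power-flow evaluation; here the point is only that the chosen placement minimizes an average over topologies instead of a single-topology objective.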
Time-domain electronic-noise generation for a Monte Carlo study of signal processing
A general technique to generate electronic noise was developed in a time-domain approach with the aid of a deconvolution technique in a discrete-time sampling system. We found that the technique is applicable even for a system in which the electronic-noise charge is not well defined. The generated noise train was reconstructed in terms of a sample-correlated function and the frequency spectrum. (orig.)
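As a minimal illustration of time-domain electronic-noise generation in a discrete-time sampling system, the sketch below filters white Gaussian samples with a single-pole (RC-like) response to obtain a correlated noise train. This is a simplification for illustration only, with assumed names and parameters; the paper's technique is deconvolution-based and recovers the correct sample correlation even when the noise charge is not well defined.

```python
import math
import random

def correlated_noise(n, tau=5.0, sigma=1.0, seed=0):
    """Generate n correlated noise samples by passing white Gaussian noise
    through a single-pole filter with time constant tau (in sample units)."""
    rng = random.Random(seed)
    a = math.exp(-1.0 / tau)          # pole location; lag-1 autocorrelation
    y, out = 0.0, []
    for _ in range(n):
        y = a * y + (1 - a) * rng.gauss(0.0, sigma)
        out.append(y)
    return out
```

The resulting train has an exponential sample-correlation function with lag-1 correlation exp(-1/tau), which is the kind of correlated behavior the Monte Carlo signal-processing study needs to reproduce.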
DJpsiFDC: an event generator for the process gg → J/ψJ/ψ at the LHC
QIAO Cong-Feng; WANG Jian; ZHENG Yang-Heng
2011-01-01
DJpsiFDC is an event generator package for the process gg → J/ψJ/ψ. It generates events for the primary leading-order 2 → 2 process. The package can generate a Les Houches Event (LHE) document, which can easily be embedded into detector simulation software frameworks. The package is written in Fortran.
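A minimal sketch of what such an LHE document looks like is given below. The helper name and the placeholder numbers in the init and event headers are illustrative assumptions, not DJpsiFDC's actual output (a compliant LHEF init block carries real beam IDs, PDF sets, and cross sections).

```python
def write_lhe(events, beam_energy=7000.0):
    """Serialize events into a minimal Les Houches Event (LHE) skeleton.
    Each event is a list of particles (pdg_id, status, px, py, pz, e, m);
    mother/colour fields and header numbers are placeholders."""
    lines = ['<LesHouchesEvents version="1.0">', "<init>",
             f"2212 2212 {beam_energy:.1f} {beam_energy:.1f} 0 0 0 0 3 1",
             "1.0 0.0 1.0 9999", "</init>"]
    for ev in events:
        lines.append("<event>")
        lines.append(f"{len(ev)} 9999 1.0 0.0 0.0 0.0")
        for pdg, status, px, py, pz, e, m in ev:
            lines.append(f"{pdg} {status} 0 0 0 0 "
                         f"{px:.6e} {py:.6e} {pz:.6e} {e:.6e} {m:.6e} 0.0 9.0")
        lines.append("</event>")
    lines.append("</LesHouchesEvents>")
    return "\n".join(lines)
```

Because the format is plain XML-like text, a generator that emits it (as DJpsiFDC does) can hand events to any detector simulation framework that reads LHE files.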
HELAC-Onia 2.0: an upgraded matrix-element and event generator for heavy quarkonium physics
Shao, Hua-Sheng
2015-01-01
We present an upgraded version (denoted as version 2.0) of the program HELAC-Onia for the automated computation of heavy-quarkonium helicity amplitudes within the non-relativistic QCD framework. The new code has been designed to include many new and useful features for practical phenomenological simulations. It is designed for job submissions under a cluster environment for parallel computations via Python scripts. We have interfaced HELAC-Onia to the parton shower Monte Carlo programs Pythia 8 and QEDPS to take into account parton-shower effects. Moreover, the decay module guarantees that the program can perform the spin-entangled (cascade-)decay of heavy quarkonium after its generation. We have also implemented a reweighting method to automatically estimate the uncertainties from renormalization and/or factorization scales as well as parton-distribution functions for weighted or unweighted events. A further update is the possibility to generate one-dimensional or two-dimensional plots encoded in the analysis file...
HELAC-Onia 2.0: An upgraded matrix-element and event generator for heavy quarkonium physics
Shao, Hua-Sheng
2016-01-01
We present an upgraded version (denoted as version 2.0) of the program HELAC-ONIA for the automated computation of heavy-quarkonium helicity amplitudes within non-relativistic QCD framework. The new code has been designed to include many new and useful features for practical phenomenological simulations. It is designed for job submissions under cluster environment for parallel computations via PYTHON scripts. We have interfaced HELAC-ONIA to the parton shower Monte Carlo programs PYTHIA 8 and QEDPS to take into account the parton-shower effects. Moreover, the decay module guarantees that the program can perform the spin-entangled (cascade-)decay of heavy quarkonium after its generation. We have also implemented a reweighting method to automatically estimate the uncertainties from renormalization and/or factorization scales as well as parton-distribution functions to weighted or unweighted events. A further update is the possibility to generate one-dimensional or two-dimensional plots encoded in the analysis files on the fly. Some dedicated examples are given at the end of the writeup.
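The reweighting idea mentioned in both versions of the abstract — rescaling each event weight by the ratio of an alternative to the nominal per-event factor, then taking the envelope of the summed weights over all variations — can be sketched generically. This is the generic method, not HELAC-Onia's internals; names are assumptions.

```python
def reweight(weights, nominal, alternative):
    """Per-event weights for an alternative scale/PDF choice: each weight is
    scaled by the ratio of the alternative to the nominal factor, avoiding a
    full regeneration of the sample."""
    return [w * alt / nom for w, nom, alt in zip(weights, nominal, alternative)]

def uncertainty_band(weights, nominal, variations):
    """Envelope (min, max) of the total sum of weights over a list of
    variation factor sets, as a cross-section-like uncertainty estimate."""
    totals = [sum(reweight(weights, nominal, var)) for var in variations]
    return min(totals), max(totals)
```

The design point is that the per-event factors (matrix-element prefactors for each scale or PDF choice) are computed once during generation, so uncertainty bands cost only arithmetic afterwards.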
tt-bar production was investigated using MC events with one charged lepton (electron), a neutrino and jets from pp-bar collisions at a center-of-mass energy of 1.96 TeV. The aim of this work was to compare the rate of events in the central (|η|<1.1) and plug (1.1<|η|<2.8) regions. (author)
Monte Carlo simulation of single and two-dosimeter approaches in a steam generator channel head.
Kim, C H; Reece, W D
2002-08-01
In a steam generator channel head, it was not unusual to see radiation workers wearing as many as twelve dosimeters over the surface of the body to avoid a possible underestimation of effective dose equivalent (H(E)) or effective dose (E). This study shows that only one or two dosimeters can be used to estimate H(E) and E without a significant underestimation. MCNP and a point-kernel approach were used to model various exposure situations in a steam generator channel head. The single-dosimeter approach (on the chest) was found to underestimate H(E) and E significantly for a few exposure situations, i.e., when the major portion of radiation source is located in the backside of a radiation worker. In this case, the photons from the source pass through the body and are attenuated before reaching the dosimeter on the chest. To assure that a single dosimeter provides a good estimate of worker dose, these few exposure situations cannot dominate a worker's exposure. On the other hand, the two-dosimeter approach (on the chest and back) predicts H(E) and E very well, hardly ever underestimating these quantities by more than 4% considering all worker positions and contamination situations in a steam generator channel head. This study shows that two dosimeters are adequate for an accurate estimation of H(E) and E in a steam generator channel head. PMID:12132712
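The point-kernel picture behind the single-dosimeter underestimate can be illustrated numerically: a chest dosimeter sees an inverse-square fluence attenuated by whatever tissue lies between it and the source. The attenuation coefficient and geometry below are rough illustrative values, not the paper's MCNP model.

```python
import math

def point_kernel_reading(r_cm, tissue_cm, mu=0.07):
    """Relative dosimeter reading from a point source: inverse-square
    fall-off times exponential attenuation through tissue. mu ~ 0.07 /cm is
    a rough value for ~1 MeV photons in tissue, assumed for illustration."""
    return math.exp(-mu * tissue_cm) / r_cm ** 2

# Source 100 cm in front of the worker vs. the same source behind a 30-cm torso:
chest_source_front = point_kernel_reading(100, 0)    # chest dosimeter, source in front
chest_source_back = point_kernel_reading(130, 30)    # chest dosimeter, source behind
```

With the source behind the worker the chest reading drops by an order of magnitude, which is exactly the situation where the single-dosimeter approach underestimates H(E); averaging a chest and a back dosimeter keeps one detector facing the source in either geometry.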
Allowing Monte Carlo (MC) codes to perform fuel cycle calculations requires coupling to a point depletion solver. In order to perform depletion calculations, one-group (1-g) cross sections must be provided in advance. This paper focuses on generating accurate 1-g cross section values that are necessary for the evaluation of nuclide densities as a function of burnup. The proposed method is an alternative to the conventional direct reaction rate tally approach, which requires extensive computational effort. The method presented here is based on the multi-group (MG) approach, in which pre-generated MG sets are collapsed with the MC-calculated flux. In our previous studies, we showed that generating accurate 1-g cross sections requires their tabulation against the background cross section (σ0) to account for the self-shielding effect. However, in previous studies, the model that was used to calculate σ0 was simplified by fixing the Bell and Dancoff factors. This work demonstrates that 1-g values calculated under the previous simplified model may not agree with the tallied values. Therefore, the original background cross section model was extended by implicitly accounting for the Dancoff and Bell factors. The method developed here reconstructs the correct value of σ0 by utilizing statistical data generated within the MC transport calculation by default. The proposed method was implemented into the BGCore code system. The 1-g cross section values generated by BGCore were compared with those tallied directly from the MCNP code. Very good agreement (<0.05%) in the 1-g cross section values was observed. The method does not carry any additional computational burden and is universally applicable to the analysis of thermal as well as fast reactor systems. (author)
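The collapse step itself is flux-weighting, with an interpolation step to evaluate the self-shielded table at the reconstructed background cross section σ0. The function names and the use of linear interpolation are illustrative assumptions (production codes typically interpolate in log σ0).

```python
def collapse_to_one_group(sigma_mg, flux_mg):
    """Flux-weighted one-group value:
    sigma_1g = sum_g(sigma_g * phi_g) / sum_g(phi_g)."""
    return sum(s * f for s, f in zip(sigma_mg, flux_mg)) / sum(flux_mg)

def sigma_at_background(sigma0_grid, sigma_table, sigma0):
    """Linearly interpolate a self-shielded cross-section table, tabulated
    against the background cross section, at the reconstructed sigma0."""
    for (x0, y0), (x1, y1) in zip(zip(sigma0_grid, sigma_table),
                                  zip(sigma0_grid[1:], sigma_table[1:])):
        if x0 <= sigma0 <= x1:
            return y0 + (y1 - y0) * (sigma0 - x0) / (x1 - x0)
    raise ValueError("sigma0 outside tabulated range")
```

The paper's contribution is in getting σ0 right (implicit Dancoff and Bell factors from the MC tallies); once σ0 is known, the collapse above is cheap, which is why the method adds no tally burden to the transport calculation.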
Validation of proton ionization cross section generators for Monte Carlo particle transport
Batic, Matej; Saracco, Paolo
2011-01-01
Three software systems, ERCS08, ISICS 2011 and Šmit's code, that implement theoretical calculations of inner-shell ionization cross sections by proton impact, are validated with respect to experimental data. The accuracy of the cross sections they generate is quantitatively estimated and inter-compared through statistical methods. Updates and extensions of a cross section data library relevant to PIXE simulation with Geant4 are discussed.
Monte Carlo Few-Group Constant Generation for CANDU 6 Core Analysis
Seung Yeol Yoo; Hyung Jin Shim; Chang Hyo Kim
2015-01-01
The current neutronics design methodology of CANDU-PHWRs based on the two-step calculations requires determining not only homogenized two-group constants for ordinary fuel bundle lattice cells by the WIMS-AECL lattice cell code but also incremental two-group constants arising from the penetration of control devices into the fuel bundle cells by a supercell analysis code like MULTICELL or DRAGON. As an alternative way to generate the two-group constants necessary for the CANDU-PHWR core analys...
An explosive detection system based on a Deuterium–Deuterium (D–D) neutron generator has been simulated using the Monte Carlo N-Particle Transport Code (MCNP5). Nuclear-based explosive detection methods can detect explosives by identifying their elemental components, especially nitrogen. Thermal neutron capture reactions have been used for detecting prompt gamma emission (10.82 MeV) following radiative neutron capture by 14N nuclei. The explosive detection system was built based on a fully high-voltage-shielded, axial D–D neutron generator with a radio frequency (RF) driven ion source and nominal yield of about 10^10 fast neutrons per second (E = 2.5 MeV). Polyethylene and paraffin were used as moderators with borated polyethylene and lead as neutron and gamma ray shielding, respectively. The shape and the thickness of the moderators and shields are optimized to produce the highest thermal neutron flux at the position of the explosive and the minimum total dose at the outer surfaces of the explosive detection system walls. In addition, simulation of the response functions of NaI, BGO, and LaBr3-based γ-ray detectors to different explosives is described. - Highlights: • Explosive detection system based on Deuterium–Deuterium neutron generator has been designed. • Shielding for a D–D neutron generator has been designed using MCNP code. • The special shield must be designed for each detector and neutron source. • Thermal neutron capture reactions have been used for detecting 10.82 MeV line from 14N nuclei. • Simulation of the response functions of NaI, BGO, and LaBr3 detectors
Following Wilson's suggestion of electron acceleration by the electric fields in thunderclouds, a number of experiments were attempted to investigate whether or not energetic electrons and bremsstrahlung X-rays were generated by thunderstorm electric fields or lightning discharge processes. In recent years, enhanced radiation at high altitude has been detected in experiments using scintillation detectors on a jet and an artificial satellite, demonstrating that radiation is indeed associated with lightning activities. However there are few experimental reports of detection near the ground since Whitmire's investigation using thermoluminescent dosimeters (TLDs) in 1979. In winter, many thunderstorms occur on the west coast of Japan, and it has been suggested that gamma-ray dose may increase occasionally during winter thunderstorms. Recently, a gamma-ray dose enhancement which might be caused by the lightning activity was measured by TLDs and environmental radiation monitors around the site of the fast breeder reactor 'Monju', a nuclear power plant facing the Japan Sea. (author)
Assessing hail risk for a building portfolio by generating stochastic events
Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie
2015-04-01
Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events were reported these last years, among which the July 2011 event, which cost around 125 million EUR to the Aargauer public insurance company (North-western Switzerland). This study presents the new developments in a stochastic model which aims at evaluating the risk for a building portfolio. Thanks to insurance and meteorological radar data of the 2011 Aargauer event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with 6 random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general events parameters (shape, size, …) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps, namely the one of the aforementioned event, and a second from an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data. The simulation parameters are then
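The stochastic-event construction described above — summing Gaussian cells with six random parameters each — can be sketched directly. The parameter ranges below are illustrative placeholders, not the values calibrated from the 2009 and 2011 radar maps.

```python
import math
import random

def make_event(n_cells, seed=0, xmax=100.0, ymax=100.0):
    """Draw the 6 random parameters of each Gaussian hail cell: location
    (x, y), peak hailstone size, standard deviation, eccentricity, and
    orientation. Ranges are illustrative assumptions."""
    rng = random.Random(seed)
    return [dict(x=rng.uniform(0, xmax), y=rng.uniform(0, ymax),
                 peak=rng.uniform(1, 5),          # max hailstone size (cm)
                 sd=rng.uniform(2, 10),           # cell width (km)
                 ecc=rng.uniform(0.3, 1.0),       # eccentricity of the cell
                 theta=rng.uniform(0, math.pi))   # orientation
            for _ in range(n_cells)]

def intensity(event, x, y):
    """Hail intensity at (x, y): sum of the event's Gaussian cells."""
    total = 0.0
    for c in event:
        dx, dy = x - c["x"], y - c["y"]
        # rotate the offset into the cell's principal axes
        u = dx * math.cos(c["theta"]) + dy * math.sin(c["theta"])
        v = -dx * math.sin(c["theta"]) + dy * math.cos(c["theta"])
        total += c["peak"] * math.exp(-(u**2 + (v / c["ecc"])**2)
                                      / (2 * c["sd"]**2))
    return total
```

Evaluating `intensity` at each building of the portfolio, then applying the two-step vulnerability (probability of a claim, then a damage rate drawn for that intensity class), yields one simulated event cost; repeating over many generated events gives the cost distribution the risk assessment needs.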
Highlights: •Monte Carlo modelling of the thermonuclear fusion. •Energy–angle distributions of the d–t and d–d fusion products calculated by the use of SIMNRA code. •Spatial distributions of the fusion products in the chamber of the IGN-14 obtained by use of MC code. -- Abstract: A fast neutron generator with a tritium target can be used to generate d–d and d–t reaction products corresponding to thermonuclear reactions in tokamaks or stellarators. In this way, convenient laboratory conditions for tests of spectrometric detectors – prior to their installation at the big fusion devices – can be achieved. Distributions of the alpha particles, protons, deuterons, and tritons generated by the fast neutron generator operating at the Institute of Nuclear Physics PAN in Cracow, Poland, were calculated by means of the Monte Carlo (MC) codes. Results of this MC modelling are presented
NiMax: a new approach to develop hadronic event generators in HEP
The NiMax framework is a new approach to develop, assemble and use hadronic event generators in HEP. There are several important concepts of the NiMax architecture: the components, the data file, the application domain module, the control system and the project. Here we describe these concepts stressing their functionality
TEMITOPE RAPHAEL AYODELE
2016-04-01
Monte Carlo simulation using the Simple Random Sampling (SRS) technique is popularly known for its ability to handle complex uncertainty problems. However, to produce a reasonable result, it requires a huge sample size. This makes it computationally expensive, time consuming and unfit for online power system applications. In this article, the performance of the Latin Hypercube Sampling (LHS) technique is explored and compared with SRS in terms of accuracy, robustness and speed for small signal stability application in a wind generator-connected power system. The analysis is performed using probabilistic techniques via eigenvalue analysis on two standard networks (Single Machine Infinite Bus and the IEEE 16-machine, 68-bus test system). The accuracy of the two sampling techniques is determined by comparing their different sample sizes with the IDEAL (conventional) result. The robustness is determined based on a significant variance reduction when the experiment is repeated 100 times with different sample sizes using the two sampling techniques in turn. Some of the results show that sample sizes generated from LHS for small signal stability application produce the same result as the IDEAL values starting from a sample size of 100. This shows that about 100 samples of a random variable generated using the LHS method are good enough to produce reasonable results for practical purposes in small signal stability application. It is also revealed that LHS has the least variance when the experiment is repeated 100 times, compared to the SRS technique. This signifies the robustness of LHS over SRS. A 100-point LHS sample produces the same result as the conventional method with a sample size of 50000. The reduced sample size required by LHS gives it a computational speed advantage (about six times) over the conventional method.
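The variance advantage reported above can be reproduced in one dimension in a few lines: LHS places exactly one draw in each of n equal-probability strata, so the spread of a repeated mean estimate collapses compared with SRS. This is a one-variable sketch; a real small-signal study stratifies every uncertain input of the eigenvalue problem.

```python
import random
import statistics

def lhs_uniform(n, seed=0):
    """Latin Hypercube Sample of one uniform(0,1) input: one draw from each
    of n equal-probability strata, in shuffled order."""
    rng = random.Random(seed)
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)
    return samples

def srs_uniform(n, seed=0):
    """Simple Random Sample of the same input."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

# Repeat a 100-sample experiment 100 times, as in the robustness test:
lhs_means = [mean(lhs_uniform(100, seed=s)) for s in range(100)]
srs_means = [mean(srs_uniform(100, seed=s)) for s in range(100)]
```

The spread (`statistics.pstdev`) of the LHS estimates is roughly two orders of magnitude smaller than the SRS spread at the same sample size, which is the mechanism behind the six-fold speed advantage the article reports.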
The start of the Large Hadron Collider provides an unprecedented opportunity for the exploration of physics at the TeV scale. It is expected to perform precise tests of the structure of the Standard Model and to hint at the structure of the physical laws at a more fundamental level. The first part of this work describes a tune of the initial- and final-state radiation parameters in the Pythia8 Monte Carlo generator, using ATLAS measurements of ttbar production at √s = 7 TeV. The results are compared to previous tunes to the Z boson transverse momentum at the LHC, and to the LEP event shapes in Z boson hadronic decays, testing the universality of the parton shower model. The tune of Pythia8 to the ttbar measurements is applied to the next-to-leading-order generators MadGraph5_aMC@NLO and Powheg, and additional parameters of these generators are tuned to the ttbar data. For the first time in the context of Monte Carlo tuning, the correlation of the experimental ...
De Beer, R.; Van Ormondt, D.
2014-01-01
Work in the context of the European Union TRANSACT project. We have developed a Java/JNI/C/Fortran based software application, called MonteCarlo, with which users can carry out Monte Carlo studies in the field of in vivo MRS. The application is supposed to be used as a tool for supporting the ...
Cooldown strategies for a steam generator tube rupture event with failure of main steam safety valve
This paper provides an evaluation of the thermal-hydraulic response of a pressurized water reactor (PWR) during a steam generator tube rupture (SGTR) event with the failure of a main steam safety valve (MSSV). Operator actions to successfully mitigate the consequences of this SGTR event are proposed. The desired actions are those which provide for control of the affected steam generator water level and minimize radiological doses to the environment. Specifically, the purpose of this paper is to demonstrate the results of differences in operator actions to cool down the power plant in terms of: (1) dose releases to the environment, (2) control of the affected steam generator level, and (3) optimal reactor coolant system cooldown and depressurization
Recent developments in neutron generator technology suggest that compact instruments with high neutron yield can be used for NAA and PGNAA in combination with high-count-rate spectrometers. For laboratories far away from Research Reactors (RRs), such devices could serve as an alternative for training students in radioanalytical and nuclear chemistry and for certain specialized applications. As neutron activation analysis is a well-established technique with a long history of documented applications, it could be made available to countries where no research reactors or other neutron irradiation facilities exist by using the proposed approach. Prompt gamma neutron activation analysis (PGNAA) is a versatile analytical tool with many applications unique to the technique. As PGNAA is generally performed at RR external neutron guides with relatively low neutron flux, the proposed instrument has the potential to supplement existing PGNAA facilities far away from RRs. Neutron generators, particularly DD-NGs, are a cost-effective, easy to operate and particularly safe alternative to other neutron sources, e.g. isotopic neutron sources like Cf-252 or Am/Be. The idea to combine new developments in DD-NGs with moderators/shielding and detectors for fast gamma counting emerged from a recent IAEA Coordinated Research Project (CRP) on New Developments in PGNAA, and an IAEA technical meeting on Neutron Generators for Activation Analysis Purposes is currently under preparation. We report on the design and optimization of a Neutron Activation Analysis (NAA) and a Prompt Gamma Neutron Activation Analysis (PGNAA) chamber associated with a D-D neutron generator. The nominal yield of the generator is about 10^10 fast neutrons per second (E = 2.5 MeV). The MCNP Monte Carlo N-Particle transport simulation code and analytical equations are used to optimize the setup with respect to thermal flux and radiation protection. Many moderators such as Graphite (G), Polyethylene (Poly), Heavy water (HW), Light water
Wiącek, U.; Dankowski, J.
2015-04-01
A fast neutron generator with a tritium target can be used to generate d-d and d-t reaction products corresponding to thermonuclear reactions in tokamaks or stellarators. In this way, convenient laboratory conditions for tests of spectrometric detectors - prior to their installation at the big fusion devices - can be achieved. Distributions of the alpha particles, protons, deuterons, and tritons generated by the fast neutron generator operating at the Institute of Nuclear Physics PAN in Cracow, Poland, were calculated by means of the Monte Carlo (MC) codes. Results of this MC modelling are presented.
Life Cycle Assessment (LCA) is a rather common tool for reducing environmental impacts while striving for cleaner processes. This method yields reliable information when input data are sufficient; however, in uncertain systems Monte Carlo (MC) simulation is used as a means to compensate for insufficient data. The MC optimization model was constructed from environmental emissions, process parameters and operation constraints. The results of MC optimization allow for the prediction of environmental performance and the opportunity for environmental improvement. The case study presented here focuses on acidification improvement with regard to uncertain emissions and the available operation of Taiwan's power plants. The boundary definitions of the LCA were established for generation, fuel refining and mining. The model was constructed according to objective-function minimization of acidification potential, subject to base loading, fuel cost and generation mix constraints. Scenario simulations were run for different variations of fuel cost ratios for Taiwan. The simulation results indicate that fuel cost was the most important parameter influencing the acidification potential for the seven types of fired power. Owing to its low operational loading, coal-fired power is the best alternative for improving acidification. The optimal scenario for acidification improvement occurred at 15% of the fuel cost. The impact decreased from 1.39 to 1.24 kg SO2-eq./MWh. This reduction benefit was about 10.5% lower than the reference year. Regarding eco-efficiency at an optimum scenario level of 5%, the eco-efficiency value was -12.4 US$/kg SO2-eq. Considering the environmental and economic impacts, the results indicated that the ratio of coal-fired steam turbines should be reduced. (author)
Lampos, Vasileios
2012-01-01
A vast amount of textual web streams is influenced by events or phenomena emerging in the real world. The social web forms an excellent modern paradigm, where unstructured user generated content is published on a regular basis and in most occasions is freely distributed. The present Ph.D. Thesis deals with the problem of inferring information - or patterns in general - about events emerging in real life based on the contents of this textual stream. We show that it is possible to extract valua...
Event generator for RHIC spin physics. Proceedings of RIKEN BNL Research Center workshop: Volume 11
A major objective of the workshop was to establish a firm collaboration to develop suitable event generators for the spin physics program at RHIC. With the completion of the Relativistic Heavy Ion Collider (RHIC) as a polarized collider, a completely new domain of high-energy spin physics will be opened. The planned studies address the spin structure of the nucleon, tests of the standard model, and transverse spin effects in initial and final states. RHIC offers the unique opportunity to pursue these studies because of its high and variable energy, 50 ≤ √s ≤ 500 GeV, high polarization, 70%, and high luminosity, 2 × 10^32 cm^-2 s^-1 or more at 500 GeV. To maximize the output from the spin program at RHIC, the understanding of both experimental and theoretical systematic errors is crucial. It will require full-fledged event generators to simulate the processes of interest in great detail. The history of event generators shows that their development and improvement are ongoing processes taking place in parallel to the physics analysis by various experimental groups. The number of processes included in the generators has been increasing and the precision of their predictions is being improved continuously. This workshop aims at getting this process well under way for the spin physics program at RHIC, based on the first development in this direction, SPHINX
HIGH QUALITY IMPLEMENTATION FOR AUTOMATIC GENERATION C# CODE BY EVENT-B PATTERN
Eman K Elsayed
2014-01-01
In this paper we propose a logically sound path for implementing any algorithm or model automatically in verified C# code. Our proposal depends on using Event-B as a formal method, which makes it a suitable solution for users who are inexperienced in programming but proficient in mathematical modeling. The proposal also integrates requirements, code and verification in the system development life cycle, and we further suggest using Event-B patterns. The approach is classified into two cases: the algorithm case and the model case. The benefits of our proposal are reducing the proof effort, enabling reusability, increasing the degree of automation and generating high-quality code. We apply and discuss the three phases of the automatic code generation philosophy on two case studies: the first is a "minimum algorithm" and the second is a model for an ATM.
Study on the regulatory approach of KNGR multiple steam generator tube rupture events
Chang, Keun Sun; Kweon, Y. C.; Lee, S. J.; Lee, Y. S.; Cheong, D. Y.; Park, T. J.; Lee, M. G.; Cheon, Y. H. [Sunmoon Univ., Asan (Korea, Republic of); Cheong, J. H. [Baekseok College of Cultural Studies, Cheonan (Korea, Republic of)
2001-10-15
The scope and contents of this project are as follows. First, a review of the structure and contents of domestic and foreign regulatory requirements, together with an analysis of design features related to safety improvement and containment bypass during multiple steam generator tube failure in advanced reactors of domestic and foreign countries. Second, analyses of the state of the art in the development of domestic and foreign regulatory requirements, research trends, design features and safety goals of advanced reactors, especially for technical issues related to containment bypass during an MSGTR event. Third, analyses of an MSGTR event for the KNGR using MAS 1.4, the best-estimate system code developed by the Korea Atomic Energy Research Institute; errors in the input decks established last year were corrected during this analysis. Fourth, assessment of the effects of several parameters on the consequences following an MSGTR event: tube rupture location, selection of the affected steam generator, tube modeling method and discharge coefficient (C_D) are examined. Fifth, establishment of a regulatory direction for technical issues related to containment bypass during an MSGTR event.
On the E-W asymmetry and the generation of ESP events
Observations of energetic-ion intensity enhancements (E ≤ 290 keV) associated with solar-flare-generated shock waves (solar flare ESP events), obtained over nearly a decade by the APL/JHU instruments on board the Earth orbiters IMP-7 and 8, are used in this work to examine the role of the heliolongitude-dependent large-scale shock morphology, in relation to the upstream interplanetary magnetic field, in the formation of these ESP events. A clear east-west solar hemisphere asymmetry is shown to be present in the distribution of the ESP relative intensity enhancements with respect to the heliolongitudes of the shock wave source-flare sites. Large ion-intensity enhancements superimposed on the ambient solar flare ion population are preferentially associated with solar flare sites located to the east of the spacecraft meridian, whereas on average only weak ESP events are associated with flare sites to the west of it. The observed asymmetry and its implications for the dominant processes generating solar flare ESP events are discussed on the basis of the presented extensive survey. (orig.)
Material control study: a directed graph and fault tree procedure for adversary event set generation
In work for the United States Nuclear Regulatory Commission, Lawrence Livermore Laboratory is developing an assessment procedure to evaluate the effectiveness of a potential nuclear facility licensee's material control (MC) system. The purpose of an MC system is to prevent the theft of special nuclear material such as plutonium and highly enriched uranium. The key in the assessment procedure is the generation and analysis of the adversary event sets by a directed graph and fault-tree methodology
Catfish: A Monte Carlo simulator for black holes at the LHC
Cavaglià, M; Cremaldi, L; Summers, D
2006-01-01
We present a new Fortran Monte Carlo generator to simulate black hole events at CERN's Large Hadron Collider. The generator interfaces to the PYTHIA Monte Carlo fragmentation code. The physics of the BH generator includes, but is not limited to, inelasticity effects, exact field emissivities, corrections to semiclassical black hole evaporation, and gravitational energy loss at formation. These features are essential to realistically reconstruct the detector response and test different models of black hole formation and decay at the LHC.
Catfish: A Monte Carlo simulator for black holes at the LHC
Cavaglià, M.; Godang, R.; Cremaldi, L.; Summers, D.
2007-09-01
We present a new Fortran Monte Carlo generator to simulate black hole events at CERN's Large Hadron Collider. The generator interfaces to the PYTHIA Monte Carlo fragmentation code. The physics of the BH generator includes, but is not limited to, inelasticity effects, exact field emissivities, corrections to semiclassical black hole evaporation, and gravitational energy loss at formation. These features are essential to realistically reconstruct the detector response and test different models of black hole formation and decay at the LHC.
EVENT GENERATION OF STANDARD MODEL HIGGS DECAY TO DIMUON PAIRS USING PYTHIA SOFTWARE
Yusof, Adib
2015-01-01
My project for the CERN Summer Student Programme 2015 is on event generation of Standard Model Higgs decay to dimuon pairs using the Pythia software. Briefly, Pythia (specifically, Pythia 8.1) is a program for the generation of high-energy physics events, able to describe collisions at any given energy between elementary particles such as electrons, positrons, protons and antiprotons. It contains theory and models for a number of physics aspects, including hard and soft interactions, parton distributions, initial-state and final-state parton showers, multiparton interactions, fragmentation and decay. All programming code for this version is written in C++ (the previous version used FORTRAN) and can be linked to the ROOT software for displaying output in the form of histograms. For my project, I need to generate events for Standard Model Higgs boson decay into muon-antimuon pairs (H → μ+μ−) to study the expected significance for this particular process at a centre-of-mass energy of 13 TeV...
Igor V. Karyakin
2016-02-01
The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus, «The Montagu's Harrier in Europe. Status. Threats. Protection»), organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.
BEEC: An event generator for simulating the Bc meson production at an e+e- collider
Yang, Zhi; Wu, Xing-Gang; Wang, Xian-You
2013-12-01
The Bc meson is a doubly heavy quark-antiquark bound state that carries flavors explicitly, which provides a fruitful laboratory for testing potential models and understanding the weak decay mechanisms of heavy flavors. In view of the prospects for Bc physics at hadronic colliders such as the Tevatron and the LHC, Bc physics is attracting more and more attention. It has been shown that a high-luminosity e+e- collider running around the Z0 peak is also helpful for studying the properties of the Bc meson and has its own advantages. For this purpose, we present an event generator, BEEC, for simulating Bc meson production through e+e- annihilation according to the relevant publications. BEEC can generate the color-singlet S-wave and P-wave (cb̄)-quarkonium states together with the color-octet S-wave (cb̄)-quarkonium states. It can also be adopted to generate the similar charmonium and bottomonium states via the semi-exclusive channels e+ + e- → |(QQ̄)[n]> + Q + Q̄ with Q = b and c, respectively. To increase the simulation efficiency, we make the amplitude as compact as possible by using improved trace technology. BEEC is a Fortran program written in a PYTHIA-compatible format with a modular structure; one may apply it to various situations or experimental environments conveniently by using GNU make. A method to improve the efficiency of generating unweighted events within the PYTHIA environment is proposed. Moreover, BEEC generates a standard Les Houches Event data file that contains useful information on the meson and its accompanying partons, which can be conveniently imported into PYTHIA for further hadronization and decay simulation. Catalogue identifier: AEQC_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQC_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in
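Unweighted event generation of the kind mentioned in the abstract is conventionally done with the hit-or-miss (acceptance-rejection) algorithm: trial points are drawn flat in phase space and accepted with probability proportional to their weight. A minimal sketch follows, using a toy one-dimensional weight function as a stand-in for a real matrix-element weight (this is not the BEEC amplitude or its actual method):

```python
import random

# Hit-or-miss unweighting: accepted points occur with frequency ~ weight(x),
# so every surviving event can be treated with unit weight downstream.

random.seed(7)

def weight(x):
    """Toy differential cross section on [0, 1] (assumed shape)."""
    return (1.0 - x) ** 2 + 0.1

W_MAX = weight(0.0)  # known upper bound on the weight over [0, 1]

def generate_unweighted(n):
    """Return n accepted points distributed proportionally to weight(x)."""
    events = []
    while len(events) < n:
        x = random.random()                    # flat phase-space trial
        if random.random() * W_MAX <= weight(x):
            events.append(x)                   # accept with prob. w(x)/W_MAX
    return events

events = generate_unweighted(5000)
print(f"{len(events)} unweighted events; mean x = {sum(events)/len(events):.3f}")
```

The efficiency of this loop is the ratio of the average weight to W_MAX, which is why generators work to keep the weight function as flat (compact) as possible.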
Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four extensive operating experience databases screened. • Important insights and conclusions delineated on the basis of the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as all supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures, the attributes that contributed to each failure and the failure modes (potential or real), discuss risk relevance, summarize important lessons learned, and provide recommendations. The study is closely tied to a statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure to function on demand (i.e., failure to start or failure to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Further analyses delineate the causes, root causes, contributing factors and consequences. A statistical analysis was performed on the chronology of events, the types of failures, the operational circumstances of failure detection and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing
Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)
2014-07-01
The brightest events with enhancements of the intensity of the soft component of secondary cosmic rays observed during thunderstorms in the Baksan Valley are analyzed. The experimental data were obtained during the thunderstorm seasons of 2003-2008. Assuming bremsstrahlung photons from cascades of runaway electrons to be the main source of the enhancements, the height of the generation level is estimated for every event. It is shown that for half of all events the region of particle generation is located in the stratosphere.
Monte Carlo applications to radiation shielding problems
Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. The basic concepts of MC are simple and straightforward and can be learned using a personal computer. Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which are far quicker to use than the tables of random numbers previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event in which the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCSs) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdfs) of the random variables that characterize a track: 1) the free path between successive interaction events, 2) the type of interaction taking place, and 3) the energy loss and angular deflection in a particular event (and the initial state of any emitted secondary particles). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
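The first two sampling steps named above can be sketched for a single history step in a homogeneous medium: the free path is drawn from an exponential pdf with the total macroscopic cross section as its rate, and the interaction type is chosen in proportion to the partial cross sections. The channel names and cross-section values below are assumed toy numbers, not data for any real material:

```python
import math
import random

random.seed(1)

# Macroscopic cross sections (1/cm) per interaction channel -- assumed values
SIGMA = {"photoelectric": 0.02, "compton": 0.15, "pair": 0.01}
SIGMA_TOTAL = sum(SIGMA.values())

def sample_free_path():
    """Distance to the next interaction: p(s) = Sigma_t * exp(-Sigma_t * s)."""
    # 1 - random() lies in (0, 1], so the log argument is never zero
    return -math.log(1.0 - random.random()) / SIGMA_TOTAL

def sample_interaction():
    """Pick an interaction channel with probability Sigma_i / Sigma_t."""
    r = random.random() * SIGMA_TOTAL
    for name, sigma in SIGMA.items():
        r -= sigma
        if r <= 0.0:
            return name
    return name  # guard against floating-point round-off

paths = [sample_free_path() for _ in range(20000)]
mean_path = sum(paths) / len(paths)
print(f"mean free path ~ {mean_path:.2f} cm (expected {1/SIGMA_TOTAL:.2f} cm)")
```

A full history loop would alternate these two draws with energy-loss and deflection sampling from the DCS-derived pdfs, terminating the track when the particle escapes or falls below a cutoff energy.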
For the investigation of the behaviour of distributed parameter systems that can be described by second-order partial differential equations, often the complete solution profiles are not of interest but only some points in the space-time domain. For this, the Monte Carlo method suggests itself as a solution algorithm. Starting from the definitions and properties of the Monte Carlo method, this paper investigates under which conditions this solution approach can be applied advantageously on a hybrid computer. Examples show that high requirements must be placed on the properties of the random generator used. This, together with the high computation time necessary for the solution, limits the applicability of the method presented, even on modern computer systems. (orig.)
Monte Carlo simulation of virtual Compton scattering at MAMI
The Monte Carlo simulation developed specifically for the VCS experiments taking place at MAMI is fully described. This simulation can generate events according to the Bethe-Heitler + Born cross section behaviour and takes into account resolution-deteriorating effects. It is used to determine solid angles for the various experimental settings. (authors)
Risk-based generation dispatch in the power grid for resilience against extreme weather events
Javanbakht, Pirooz
Natural disasters have been considered as one of the main causes of the largest blackouts in North America. When it comes to power grid resiliency against natural hazards, different solutions exist that are mainly categorized based on the time-frame of analysis. At the design stage, robustness and resiliency may be improved through redundant designs and inclusion of advanced measurement, monitoring, control and protection systems. However, since massive destructive energy may be released during the course of a natural disaster (such as a hurricane) causing large-scale and widespread disturbances, design-stage remedies may not be sufficient for ensuring power grid robustness. As a result, to limit the consequent impacts on the operation of the power grid, the system operator may be forced to take immediate remedial actions in real-time. To effectively manage the disturbances caused by severe weather events, weather forecast information should be incorporated into the operational model of the power grid in order to predict imminent contingencies. In this work, a weather-driven generation dispatch model is developed based on stochastic programming to provide a proactive solution for power grid resiliency against imminent large-scale disturbances. Hurricanes and ice storms are studied as example disaster events to provide numerical results. In this approach, the statistics of the natural disaster event are taken into account along with the expected impact on various power grid components in order to determine the availability of the grid. Then, a generation dispatch strategy is devised that helps operate the grid subject to weather-driven operational constraints.
A Monte Carlo method for multiple scattered coherent light carrying information on shear wave propagation in scattering media is presented. The Monte Carlo algorithm mainly concerns the optical phase variations due to the displacements of light scatterers induced by acoustic-radiation-force shear waves. Both the distributions and the temporal behaviors of the optical phase increments at probe locations are obtained, and the shear wave speed is evaluated quantitatively from them. The phase increments exactly track the propagation of shear waves induced by the focused-ultrasound radiation force. In addition, attenuation of the shear waves is demonstrated in the simulation results. Using linear regression, the shear wave speed, which is set to 2.1 m/s in the simulation, is estimated to be 2.18 m/s and 2.35 m/s at time sampling intervals of 0.2 ms and 0.5 ms, respectively
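The regression step described above can be sketched as follows: fit a line to wave-front arrival time versus lateral probe position, and take the inverse slope as the speed. The synthetic arrival times below assume a 2.1 m/s wave plus invented timing jitter, mirroring the simulation setup but not reproducing the paper's data:

```python
# Estimate shear wave speed from arrival time vs. position by least squares.
# Units: mm / (m/s) equals ms, so positions in mm and times in ms stay
# consistent and the inverse slope comes out directly in m/s.

positions_mm = [2.0, 4.0, 6.0, 8.0, 10.0]      # lateral probe locations
TRUE_SPEED = 2.1                                # m/s, as set in the simulation
jitter_ms = [0.02, -0.03, 0.01, -0.01, 0.02]    # assumed timing noise
arrival_ms = [x / TRUE_SPEED + j for x, j in zip(positions_mm, jitter_ms)]

def fit_speed(x_mm, t_ms):
    """Least-squares slope of t(x); speed is the inverse slope (mm/ms = m/s)."""
    n = len(x_mm)
    mx = sum(x_mm) / n
    mt = sum(t_ms) / n
    sxx = sum((x - mx) ** 2 for x in x_mm)
    sxt = sum((x - mx) * (t - mt) for x, t in zip(x_mm, t_ms))
    return sxx / sxt  # inverse of the fitted slope dt/dx

speed = fit_speed(positions_mm, arrival_ms)
print(f"estimated shear wave speed: {speed:.2f} m/s")
```

Coarser time sampling degrades the arrival-time estimates, which is consistent with the larger bias the abstract reports at the 0.5 ms interval compared with 0.2 ms.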
Up to 2009, the author and a colleague conducted trend analyses of problem events related to main generators, emergency diesel generators, breakers, motors and transformers, which are more likely to cause problems than other electrical components in nuclear power plants. For emergency diesel generators, among the electrical components with the highest frequency of defect occurrence, several years had passed since the last analyses. These are very important components, needed to shut down a nuclear reactor safely and to cool it down when the external power supply is lost. Trend analyses were therefore conducted a second time, covering 80 problem events with emergency diesel generators that occurred in U.S. nuclear power plants in the five years from 2005 through 2009, among events reported in the Licensee Event Reports (LERs: event reports submitted to the NRC by U.S. nuclear power plants) registered in the nuclear information database of the Institute of Nuclear Safety System, Inc. (INSS), as well as 40 events registered in the Nuclear Information Archives (NUCIA) that occurred in Japanese nuclear power plants in the same period. The trend analyses of the problem events with emergency diesel generators showed that the frequency of defect occurrence is high in both Japanese and U.S. plants during plant operation and functional tests (that is, defects can be discovered effectively in advance), so that implementation of periodic functional tests during plant operation is an important task for the future. (author)
Generation of whistler mode emissions in the inner magnetosphere: An event study
Schriver, D.; Ashour-Abdalla, M.; Coroniti, F. V.; LeBoeuf, J. N.; Decyk, V.; Travnicek, P.; Santolík, O.; Winningham, D.; Pickett, J. S.; Goldstein, M. L.; Fazakerley, A. N.
2010-08-01
On July 24, 2003, when the Cluster 4 satellite crossed the magnetic equator at about 4.5 R_E radial distance on the dusk side (~15 MLT), whistler wave emissions were observed below the local electron gyrofrequency (fce) in two bands, one band above one-half the gyrofrequency (0.5fce) and the other band below 0.5fce. A careful analysis of the wave emissions for this event has shown that Cluster 4 passed through the wave source region. Simultaneous electron particle data from the PEACE instrument in the generation region indicated the presence of a mid-energy electron population (~100s of eV) that had a highly anisotropic temperature distribution, with the perpendicular temperature 10 times the parallel temperature. To understand this somewhat rare event, in which the satellite passed directly through the wave generation region and in which a free energy source (i.e., temperature anisotropy) was readily identified, a linear theory and particle-in-cell simulation study has been carried out to elucidate the physics of the wave generation, wave-particle interactions, and energy redistribution. The theoretical results show that for this event the anisotropic electron distribution can linearly excite obliquely propagating whistler mode waves in the upper frequency band, i.e., above 0.5fce. Simulation results show that in addition to the upper band emissions, nonlinear wave-wave coupling excites waves in the lower frequency band, i.e., below 0.5fce. The instability saturates primarily by a decrease in the temperature anisotropy of the mid-energy electrons, but also by heating of the cold electron population. The resulting wave-particle interactions lead to the formation of a high-energy plateau on the parallel component of the warm electron velocity distribution. The theoretical results for the saturation time scale indicate that the observed anisotropic electron distribution must be refreshed in less than 0.1 s, allowing the anisotropy to be detected by the electron
Tuning and validation of hadronic event generator for $R$ value measurements in the tau-charm region
Ping, Rong-Gang; Xia, Lei; Gao, Zhen; Li, Ying-Tian; Zhou, Xing-Yu; Zhang, Bing-Xin; Yan, Bo; Zheng, Wen-Biao; Hu, Hai-Ming; Huang, Guang-Shun
2016-01-01
To measure the $R$ value in an energy scan experiment with $e^+e^-$ collisions, precise calculation of initial state radiation is required in event generators. We present an event generator for this purpose, which incorporates initial state radiation effects up to second-order accuracy, with the radiative correction factor calculated using the totally hadronic Born cross section. The measured exclusive processes are generated according to their cross sections, while the unknown processes are generated using the LUND Area Law model, whose parameters are tuned with data collected at $\sqrt s = 3.65$ GeV. The optimized values are validated with data in the range $\sqrt s = 2.2324$-$3.671$ GeV. These optimized parameters are universally valid for event generation below the $D\bar D$ threshold.
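Generating "measured exclusive processes according to their cross sections" amounts to sampling a discrete distribution: build a cumulative cross-section table once, then draw a channel per event. A minimal sketch with placeholder channel names and cross sections (invented for illustration, not the measured values used by the generator):

```python
import bisect
import random

random.seed(3)

# (final state, relative cross section) -- placeholder values
channels = [("pi+pi-", 12.0), ("K+K-", 4.0), ("pi+pi-pi0", 9.0), ("other", 25.0)]

# Cumulative table for inverse-CDF sampling of the discrete distribution
cumulative = []
running = 0.0
for _, xsec in channels:
    running += xsec
    cumulative.append(running)

def draw_channel():
    """Pick a final state with probability proportional to its cross section."""
    r = random.random() * cumulative[-1]
    return channels[bisect.bisect_left(cumulative, r)][0]

counts = {name: 0 for name, _ in channels}
for _ in range(50000):
    counts[draw_channel()] += 1
print(counts)
```

The same cumulative-table lookup scales to hundreds of exclusive channels, since each draw costs only a binary search.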
A. J. Gerrard
2011-05-01
Observations of in-situ generated atmospheric gravity waves associated with a stratospheric temperature enhancement (STE) are presented. Two sets of gravity waves were observed by molecular-aerosol lidar in conjunction with the early December 2000 STE event above Sondrestrom, Greenland. The first set of gravity waves shows downward phase progression with a vertical wavelength of ~8 km, while the second set shows upward phase progression with a vertical wavelength of ~9 km. With estimates of the background wind fields from synoptic analyses, the various intrinsic gravity wave parameters of these two wave structures are found. The observed waves compare well to numerical modeling predictions, though the potential observation of a downward-propagating wave would be unexpected.
A study on thermodynamical properties of hot and dense hadron gas using the event generator
Sasaki, N
2001-01-01
We investigate the equilibration and the equation of state of a hot hadron gas at finite baryon density using an event generator that satisfies detailed balance at the temperatures and baryon densities of present interest (80 < T < 170 MeV, 0.157 < n_B < 0.315 fm^-3). Molecular dynamics simulations are performed for a system of hadrons in a box with periodic boundary conditions. Starting from an initial condition composed of nucleons with a uniform momentum distribution, the evolution takes place through interactions, productions and absorptions. The system approaches a stationary state of baryons, mesons and their resonances, characterized by an exponent in the energy distribution that is the same irrespective of the particle species, i.e., a temperature. After equilibration, thermodynamical quantities such as the energy density, particle density, entropy and pressure are calculated. The obtained equation of state shows a remarkable deviation from a mixed free gas of mesons and baryons above T = m_pi....
Possible Improvements to MCNP6 and its CEM/LAQGSM Event-Generators
Mashnik, Stepan Georgievich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-08-04
This report is intended for the developers and sponsors of MCNP6. It presents a set of suggested possible future improvements to MCNP6 and to its CEM03.03 and LAQGSM03.03 event generators. A few suggested modifications of MCNP6 are quite simple, aimed at avoiding possible problems with running MCNP6 on various computers; these changes are not expected to change or improve any results, but should make the use of MCNP6 easier, and are expected to require limited manpower resources. On the other hand, several other suggested improvements require serious further development of nuclear reaction models and are expected to improve significantly the predictive power of MCNP6 for a number of nuclear reactions; such developments require several years of work by real experts on nuclear reactions.
GenEvA (I): a new framework for event generation
We show how many contemporary issues in event generation can be recast in terms of partonic calculations with a matching scale. This framework is called GenEvA, and a key ingredient is a new notion of phase space which avoids the problem of phase space double-counting by construction and includes a built-in definition of a matching scale. This matching scale can be used to smoothly merge any partonic calculation with a parton shower. The best partonic calculation for a given region of phase space can be determined through physics considerations alone, independent of the algorithmic details of the merging. As an explicit example, we construct a positive-weight partonic calculation for e+e- → n jets at next-to-leading order (NLO) with leading-logarithmic (LL) resummation. We improve on the NLO/LL result by adding additional higher-multiplicity tree-level (LO) calculations to obtain a merged NLO/LO/LL result. These results are implemented using a new phase space generator introduced in a companion paper.
Comparative Analyses on OPR1000 Steam Generator Tube Rupture Event Emergency Operational Guideline
Lee, Sang Won; Bae, Yeon Kyoung; Kim, Hyeong Teak [Korea Hydro and Nuclear Power Co., Ltd., Taejon (Korea, Republic of)
2006-07-01
The Steam Generator Tube Rupture (SGTR) event is one of the important scenarios with respect to radiation release to the environment. When an SGTR occurs, containment integrity is not effective because of the direct bypass of containment via the ruptured steam generator to the MSSVs and MSADVs. To prevent this path, the Emergency Operational Guideline of OPR1000 indicates the use of the Turbine Bypass Valves (TBVs) as an effective means to depressurize the main steam line and prevent the lifting of the MSSVs. However, the TBVs are not operable when offsite power is not available (LOOP). In this situation, RCS cool-down is achieved by opening the MSADVs of both the intact and the ruptured SG, but this action causes a large radiation release to the environment. To minimize the radiation release, the KSNP EOG adopts an improved strategy for an SGTR occurring concurrently with LOOP. However, these procedures contain some duplicated steps and branch lines that might confuse the operator during optimal recovery actions. In this paper, a comparative analysis of SGTR and SGTR with LOOP is therefore performed and an optimized procedure is proposed.
BOT3P consists of a set of standard Fortran 77 language programs that gives the users of the deterministic transport codes DORT, TORT, TWODANT, THREEDANT, PARTISN and the sensitivity code SUSD3D some useful diagnostic tools to prepare and check the geometry of their input data files for both Cartesian and cylindrical geometries, including graphical display modules. Users can produce the geometrical and material distribution data for all the cited codes for both two-dimensional and three-dimensional applications and, in three-dimensional Cartesian geometry only, for the Monte Carlo transport code MCNP, starting from the same BOT3P input. Moreover, BOT3P stores the fine mesh arrays and the material zone map in a binary file, the content of which can easily be interfaced to any deterministic or Monte Carlo transport code. This makes it possible to compare directly, for the same geometry, the effects on transport analysis results stemming from the use of different data libraries and solution approaches. BOT3P Version 5.0 lets users optionally compute, with the desired precision, the area/volume error of material zones with respect to the theoretical values, if any, caused by the stair-cased representation of the geometry, and automatically update material densities over the whole zone domains to conserve masses. A local (per mesh) density correction approach is also available. BOT3P is designed to run on Linux/UNIX platforms and is publicly available from the Organization for Economic Cooperation and Development (OECD) Nuclear Energy Agency (NEA) Data Bank. Through the use of BOT3P, radiation transport problems with complex three-dimensional geometrical structures can be modelled easily, as a relatively small amount of engineer-time is required and refinement is achieved by changing a few parameters. This tool is useful for solving very large challenging problems, as successfully demonstrated not only in some complex neutron shielding and criticality benchmarks but also in a power
This study intends to develop a more sophisticated tool that will advance the current event tree method used in all PSA, and to focus on non-catastrophic events, specifically non-core melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload must therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To make scenario generation practical as a technical tool, a simulation model combining AI techniques with a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator was demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time dependent factors and their quantification in scenario modeling, were added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
The Relationship among Negative Life Events, Cognitions, and Depression within Three Generations.
Nacoste, Denise R. Barnes; Wise, Erica H.
1991-01-01
Investigated extent to which cognitions mediate relationship between negative life events and depression. College students and their same-sex parents and grandparents (n=171) completed measures of stressful life events, automatic thoughts, dysfunctional attitudes, and depression. Found interaction between negative life events and cognition for…
Jouck, Toon; Depaire, Benoit
2014-01-01
Past research revealed issues with artificial event data used for comparative analysis of process mining algorithms. The aim of this research is to design, implement and validate a framework for producing artificial event logs, which should increase the discriminatory power of artificial event logs when evaluating process discovery techniques.
Monte Carlo Methods in Physics
The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random number generators used in Monte Carlo techniques is carried out to show how random the sequences produced by the various methods are. To account for the weight function involved in Monte Carlo integration, the Metropolis method is used. The results of the experiment show no regular patterns in the generated numbers, indicating that the generators are reasonably good, and the sampled values follow the expected statistical distributions. Some applications of Monte Carlo methods in physics are then given. The physical problems are chosen so that the models have available solutions, either exact or approximate, against which the Monte Carlo calculations can be compared. The comparisons show that good agreement is obtained for the models considered.
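As a hedged illustration of the Metropolis method described in this abstract (not code from the paper), the sketch below samples an unnormalized Gaussian weight exp(-x²/2) with symmetric uniform proposals and checks the estimate of ⟨x²⟩ against the known value of 1. The step size, chain length and burn-in fraction are arbitrary choices:

```python
import math
import random

def metropolis_sample(log_weight, x0, n_steps, step_size, rng):
    """Metropolis sampling with symmetric uniform proposals:
    accept a move with probability min(1, w(x_new)/w(x))."""
    samples = []
    x = x0
    log_w = log_weight(x)
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step_size, step_size)
        log_w_new = log_weight(x_new)
        if log_w_new >= log_w or rng.random() < math.exp(log_w_new - log_w):
            x, log_w = x_new, log_w_new
        samples.append(x)
    return samples

rng = random.Random(42)
# Target weight: unnormalized standard Gaussian, log w(x) = -x^2 / 2
chain = metropolis_sample(lambda x: -0.5 * x * x, 0.0, 200_000, 1.0, rng)
burned = chain[20_000:]  # discard burn-in
mean_x2 = sum(x * x for x in burned) / len(burned)
# For the standard Gaussian the exact value of <x^2> is 1
```

Because only the ratio of weights enters the acceptance test, the normalization constant of the weight function is never needed, which is the practical appeal of the method.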
Åström, Helena Lisa Alexandra; Sunyer Pinya, Maria Antonia; Madsen, H.;
2015-01-01
The aim of this study is to enhance the understanding of the occurrence of flood generating events in urban areas by analyzing the relationship between large-scale atmospheric circulation and extreme precipitation events, extreme sea water level events and their simultaneous occurrence. To describe the atmospheric circulation we used the Lamb circulation type (LCT) classification and re-grouped it into Lamb circulation classes (LCC). The daily LCCs/LCTs were connected with rare precipitation and water level events in Aarhus, a Danish coastal city. Westerly and cyclonic LCCs (W... Consequently, simultaneous occurrence of extreme water level and precipitation events is expected to increase in the future as a result of change in LCC frequencies. The RCM projections for LCC frequencies are uncertain because the representation of current LCCs is poor; a large number of days cannot...
Introduction to Monte Carlo methods: sampling techniques and random numbers
The Monte Carlo method describes a very broad area of science in which many processes, physical systems and phenomena that are statistical in nature and difficult to solve analytically are simulated by statistical methods employing random numbers. The general idea of Monte Carlo analysis is to create a model as similar as possible to the real physical system of interest, and to create interactions within that system based on known probabilities of occurrence, with random sampling of the probability density functions. As the number of individual events (called histories) increases, the quality of the reported average behavior of the system improves, meaning that the statistical uncertainty decreases. Assuming that the behavior of the physical system can be described by probability density functions, the Monte Carlo simulation proceeds by sampling from these probability density functions, which necessitates a fast and effective way to generate random numbers uniformly distributed on the interval (0,1). Particles are generated within the source region and are transported by sampling from probability density functions through the scattering media until they are absorbed or escape the volume of interest. The outcomes of these random samplings, or trials, must be accumulated or tallied in an appropriate manner to produce the desired result; the essential characteristic of Monte Carlo is the use of random sampling techniques to arrive at a solution of the physical problem. The major components of Monte Carlo methods for random sampling of a given event are described in the paper
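The sampling of free-flight distances mentioned above is commonly done by inverse-transform sampling of the exponential attenuation law: from a uniform u in [0,1), the distance is s = -ln(1-u)/Σ_t. A minimal sketch of the idea; the cross-section value is an illustrative assumption, not from the paper:

```python
import math
import random

def sample_path_length(sigma_t, rng):
    """Inverse-transform sampling of the free-flight distance.

    PDF: p(s) = sigma_t * exp(-sigma_t * s); CDF: F(s) = 1 - exp(-sigma_t * s).
    Setting F(s) = u and inverting gives s = -ln(1 - u) / sigma_t."""
    u = rng.random()  # uniform on [0, 1)
    return -math.log(1.0 - u) / sigma_t

rng = random.Random(0)
sigma_t = 0.5  # hypothetical total macroscopic cross section, 1/cm
lengths = [sample_path_length(sigma_t, rng) for _ in range(100_000)]
mean_free_path = sum(lengths) / len(lengths)
# Analytically the mean free path is 1/sigma_t = 2.0 cm, and the tallied
# average should converge to it as the number of histories grows
```

Using 1 - u rather than u avoids log(0), since Python's `random()` can return exactly 0.0 but never 1.0.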
An Event-Based Methodology to Generate Class Diagrams and its Empirical Evaluation
Sandeep K. Singh
2010-01-01
Full Text Available Problem statement: Event-based systems have importance in many application domains ranging from real-time monitoring systems in production, logistics, medical devices and networking to complex event processing in finance and security. The increasing popularity of event-based systems has opened new challenging issues for them. One such issue is to carry out requirements analysis of event-based systems and build conceptual models. Currently, Object Oriented Analysis (OOA) using the Unified Modeling Language (UML) is the most popular requirement analysis approach, for which several OOA tools and techniques have been proposed. But none of the techniques and tools, to the best of our knowledge, have focused on event-based requirements analysis; rather, all are behavior-based approaches. Approach: This study described a requirements analysis approach specifically for event-based systems. The proposed approach starts from events occurring in the system and derives an importable class diagram specification in XML Metadata Interchange (XMI) format for the ArgoUML tool. Requirements of the problem domain are captured as events in restricted natural language using the proposed Event Templates in order to reduce ambiguity. Results: Rules were designed to extract a domain model specification (analysis-level class diagram) from Event Templates. A prototype tool, 'EV-ClassGEN', was also developed to provide automation support: it extracts events from requirements, documents the extracted events in Event Templates and implements the rules to derive a specification for an analysis-level class diagram. The proposed approach is also validated through a controlled experiment by applying it to many cases from different application domains like real-time systems, business applications and gaming. Conclusion: Results of the controlled experiment showed that after studying and applying the event-based approach, students' perception of the ease of use and usefulness of the OOA technique has
Event shape distributions at LEP
Taševský, Marek
New Jersey : World Scientific, 2007 - (Kuze, M.; Nagano, K.; Tokushuku, K.), s. 427-430 ISBN 978-981-256-871-7. [International Workshop on Deep Inelastic Scattering and QCD (DIS 2006) /14./. Tsukuba (JP), 20.04.2006-24.04.2006] R&D Projects: GA MŠk LC527 Institutional research plan: CEZ:AV0Z10100502 Keywords : LEP * event shapes * Monte Carlo generators Subject RIV: BF - Elementary Particles and High Energy Physics
Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high-cadence monitoring and photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, the Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giants, because KMTNet, with its constant exposure time, easily saturates around the peaks of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems regardless of the planet mass, will be detected in the near future.
In this paper we address the importance of including the consideration of revenue loss in the safety analysis as well as in system optimisation, and modify the traditional Life Cycle Cost (LCC) into Life Cycle Revenue Loss (LCRL) as the criterion of optimisation and a quantitative assessment of the consequence of unwanted events such as system unavailability. Through the Monte Carlo simulation technique and a simple scenario of decision making in a bidding process, we demonstrate the feasibility of our new LCRL model
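The paper's simulation details are not given in the abstract; the following is only a generic sketch of how a Monte Carlo estimate of life cycle revenue loss might look, alternating exponentially distributed up and down periods over a plant lifetime and pricing the downtime. All figures (MTBF, MTTR, horizon, revenue rate) are hypothetical:

```python
import random

def simulate_revenue_loss(mtbf, mttr, horizon, revenue_rate, rng):
    """One Monte Carlo history: alternate up/down periods over the horizon
    and accumulate revenue lost while the system is unavailable."""
    t, loss = 0.0, 0.0
    while t < horizon:
        t += rng.expovariate(1.0 / mtbf)        # time to next failure
        if t >= horizon:
            break
        repair = rng.expovariate(1.0 / mttr)    # repair duration
        loss += min(repair, horizon - t) * revenue_rate
        t += repair
    return loss

rng = random.Random(1)
# Hypothetical figures: MTBF 1000 h, MTTR 24 h, 20-year (175,200 h) horizon,
# revenue rate 5000 per hour of operation
losses = [simulate_revenue_loss(1000.0, 24.0, 175_200.0, 5_000.0, rng)
          for _ in range(500)]
expected_lcrl = sum(losses) / len(losses)
# Sanity check: steady-state unavailability is MTTR/(MTBF+MTTR) ~ 0.0234,
# so the expected loss is roughly 0.0234 * 175200 * 5000
```

The distribution of `losses` across histories, not just its mean, is what would feed a bidding decision of the kind the abstract describes.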
Purpose: The focus of this work was the demonstration and validation of VirtuaLinac with clinical photon beams and to investigate the implementation of low-Z targets in a TrueBeam linear accelerator (Linac) using Monte Carlo modeling. Methods: VirtuaLinac, a cloud based web application utilizing Geant4 Monte Carlo code, was used to model the Linac treatment head components. Particles were propagated through the lower portion of the treatment head using BEAMnrc. Dose distributions and spectral distributions were calculated using DOSXYZnrc and BEAMdp, respectively. For validation, 6 MV flattened and flattening filter free (FFF) photon beams were generated and compared to measurement for square fields, 10 and 40 cm wide and at dmax for diagonal profiles. Two low-Z targets were investigated: a 2.35 MeV carbon target and the proposed 2.50 MeV commercial imaging target for the TrueBeam platform. A 2.35 MeV carbon target was also simulated in a 2100EX Clinac using BEAMnrc. Contrast simulations were made by scoring the dose in the phosphor layer of an IDU20 aSi detector after propagating through a 4 or 20 cm thick phantom composed of water and ICRP bone. Results: Measured and modeled depth dose curves for 6 MV flattened and FFF beams agree within 1% for 98.3% of points at depths greater than 0.85 cm. Ninety-three percent or more of the points analyzed for the diagonal profiles had a gamma value less than one for the criteria of 1.5 mm and 1.5%. The two low-Z target photon spectra produced in TrueBeam are harder than that from the carbon target in the Clinac. Percent dose at depth 10 cm is greater by 3.6% and 8.9%; the fraction of photons in the diagnostic energy range (25–150 keV) is lower by 10% and 28%; and contrasts are lower by factors of 1.1 and 1.4 (4 cm thick phantom) and 1.03 and 1.4 (20 cm thick phantom), for the TrueBeam 2.35 MV/carbon and commercial imaging beams, respectively. Conclusions: VirtuaLinac is a promising new tool for Monte Carlo modeling of novel
Top quark event modelling and generators in the CMS experiment at the LHC
Bilin, B
2016-01-01
State-of-the-art theoretical predictions accurate to next-to-leading order QCD interfaced with PYTHIA and HERWIG are tested by comparing with the unfolded $t\bar{t}$ differential data collected with the CMS detector at 8 TeV and 13 TeV. These predictions are also compared with the measurements of underlying event activity distributions accompanying $t\bar{t}$ events. Furthermore, predictions of beyond-NLO accuracy in QCD are compared with the data.
Analysis of Mihama-2 steam generator tube rupture (SGTR) event (preliminary analysis)
Preliminary analyses were performed for the SGTR event which occurred at Mihama-2 (Westinghouse 2-loop PWR, 500 MWe) on February 9, 1991. The analyses consisted of a transient thermal-hydraulic analysis and an evaluation of pressure vessel integrity during the PTS (Pressurized Thermal Shock) event. The objective of the former analysis was to obtain a better understanding of the event by reproducing its thermal-hydraulic behavior as closely as possible. The analysis successfully reproduced the overall behavior of the major plant parameters, such as the primary and secondary system pressures and the pressurizer water level, by adjusting the discharge coefficient for the break flow so that the calculated scram time agreed with that observed during the event. The objective of the latter analysis was to evaluate how severe the event was from the PTS point of view. The analysis showed that the temperature decrease of the pressure vessel material at the downcomer was not large, and that there was therefore sufficient margin against propagation of the initial crack at the inner surface of the pressure vessel, even when a hypothetically large initial crack was assumed to exist. (author)
Chung, Sun-Ju; Koo, Jae-Rim
2014-01-01
Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over the peak, confident planet detection did not happen due to extremely weak central perturbations (fractional deviations of $\\lesssim 2\\%$). For confident detection of planets in extremely weak central perturbation (EWCP) events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, KMTNet (Korea Microlensing Telescope Network), satisfies the conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of $\\leq 2\\%$ in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of $> 50\\%$ in the case of $\\lesssim 100\ M_{\rm E}$ planets with separations of $0.2\ {\rm AU} \lesssim d \lesssim 20\ {\rm AU}$. We find that for m...
Margin assessment of a nuclear power plant against external hazards is one of the most important issues after the Fukushima Dai-ichi Nuclear Power Plant accident. In general, an event tree (ET) method is applied to investigate the margin and the effectiveness of countermeasures against the hazard, in which a scenario of the plant status and the timing of the countermeasures are determined and a success or failure probability of each event or countermeasure is assumed, mainly based on engineering judgment. To investigate the plant dynamics during the scenario examined in the ET method, a numerical simulation is carried out in accordance with the sequentially-presented events. However, many different scenarios can potentially unfold in a real situation. In the present paper, a new approach has been developed to assess the plant status during external hazards, and the countermeasures against them in operation, quantitatively and stochastically by taking the possible scenarios into account. For this purpose, a Continuous Markov chain Monte Carlo (CMMC) method is applied. Furthermore, a preliminary event sequence assessment has been carried out under the condition of deep snow in a loop-type sodium-cooled fast reactor. (author)
Amir, Sahar Z.
2013-05-01
We introduce an efficient thermodynamically consistent technique to extrapolate and interpolate normalized Canonical NVT ensemble averages like pressure and energy for Lennard-Jones (L-J) fluids. Preliminary results show promising applicability in oil and gas modeling, where accurate determination of thermodynamic properties in reservoirs is challenging. The thermodynamic interpolation and thermodynamic extrapolation schemes predict ensemble averages at different thermodynamic conditions from expensively simulated data points. The methods reweight and reconstruct previously generated database values of Markov chains at neighboring temperature and density conditions. To investigate the efficiency of these methods, two databases corresponding to different combinations of normalized density and temperature are generated. One contains 175 Markov chains with 10,000,000 MC cycles each and the other contains 3000 Markov chains with 61,000,000 MC cycles each. For such massive database creation, two algorithms to parallelize the computations have been investigated. The accuracy of the thermodynamic extrapolation scheme is investigated with respect to classical interpolation and extrapolation. Finally, thermodynamic interpolation benefiting from four neighboring Markov chains points is implemented and compared with previous schemes. The thermodynamic interpolation scheme using knowledge from the four neighboring points proves to be more accurate than the thermodynamic extrapolation from the closest point only, while both thermodynamic extrapolation and thermodynamic interpolation are more accurate than the classical interpolation and extrapolation. The investigated extrapolation scheme has great potential in oil and gas reservoir modeling. That is, such a scheme has the potential to speed up the MCMC thermodynamic computation to be comparable with conventional Equation of State approaches in efficiency. In particular, this makes it applicable to large-scale optimization of L
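Reweighting Markov-chain data to neighboring thermodynamic conditions can be illustrated with standard single-histogram reweighting of a canonical average: ⟨A⟩_β' = Σᵢ Aᵢ e^{-(β'-β)Eᵢ} / Σᵢ e^{-(β'-β)Eᵢ}. The sketch below is a generic stand-in, not the authors' scheme; it uses a 1D classical harmonic oscillator, whose exact mean energy 1/β allows a check:

```python
import math
import random

def reweight_mean_energy(energies, beta_from, beta_to):
    """Single-histogram reweighting of a canonical average:
    <E>_beta' = sum E_i * w_i / sum w_i with w_i = exp(-(beta' - beta) * E_i).
    The largest exponent is subtracted for numerical stability."""
    d = beta_to - beta_from
    m = max(-d * e for e in energies)
    weights = [math.exp(-d * e - m) for e in energies]
    return sum(e * w for e, w in zip(energies, weights)) / sum(weights)

rng = random.Random(7)
beta = 1.0
# Sample a 1D classical harmonic oscillator, E = x^2/2 + p^2/2, at beta = 1:
# x and p are Gaussian with variance 1/beta, and analytically <E> = 1/beta.
energies = [(rng.gauss(0, 1) ** 2 + rng.gauss(0, 1) ** 2) / 2
            for _ in range(200_000)]
e_reweighted = reweight_mean_energy(energies, beta, 1.25)
# Exact check: at beta' = 1.25 the mean energy is 1/1.25 = 0.8
```

The scheme is only reliable when β' is close enough to β that the sampled energy histogram still overlaps the target distribution, which is why the abstract's methods draw on neighboring database points.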
Nutton, Jennifer; Fast, Elizabeth
2015-01-01
Indigenous peoples the world over have experienced and continue to experience the devastating effects of colonialism, including loss of life, land, language, culture, and identity. Indigenous peoples suffer disproportionately across many health risk factors including an increased risk of substance use. We use the term "Big Event" to describe the historical trauma attributed to colonial policies as a potential pathway to explain the disparity in rates of substance use among many Indigenous populations. We present "Big Solutions" that have the potential to buffer the negative effects of the Big Event, including: (1) decolonizing strategies, (2) identity development, and (3) culturally adapted interventions. Study limitations are noted and future needed research is suggested. PMID:26158749
Monte Carlo simulation of virtual Compton scattering below pion threshold
This paper describes the Monte Carlo simulation developed specifically for the Virtual Compton Scattering (VCS) experiments below pion threshold that have been performed at MAMI and JLab. This simulation generates events according to the (Bethe-Heitler + Born) cross-section behaviour and takes into account all relevant resolution-deteriorating effects. It determines the 'effective' solid angle for the various experimental settings which are used for the precise determination of the photon electroproduction absolute cross-section
Based on the GO methodology and the Markov method, a dynamic analysis of the emergency diesel generator system for protecting the nuclear power plant from Station Blackout, which is caused by a Loss of Offsite Power event, is performed over a duration of 24 hours. In addition, the problem of accurate reliability calculation is solved for a repairable system with dependent maintenance relations, and the logic of the emergency response system is fully simulated by creating the 'Backup Operator' of the GO methodology. By combining the two reliability analysis methods, which are well suited to an emergency response system of diesel generators with dependent maintenance relations, the range of application of the two methods is expanded, and the effect of a station blackout event on the safe operation of nuclear power plants can be obtained more accurately. (author)
A knowledge engineering approach to operation support systems would be useful in maintaining safe and steady operation of nuclear plants. This paper describes a knowledge-based operation support system which assists operators during steam generator water leak events in FBR plants. We have developed a real-time expert system. The expert system adopts a hierarchical knowledge representation corresponding to the 'plant abnormality model'. A technique of signal validation which uses knowledge of symptom propagation is applied to diagnosis. In order to verify the knowledge base concerning steam generator water leak events in FBR plants, a simulator is linked to the expert system. It was shown that diagnosis based on the 'plant abnormality model' and signal validation using knowledge of symptom propagation work successfully. The results also suggest that the expert system could be useful in supporting FBR plant operations. (author)
Baryon Diffusion Constant in Hot and Dense Hadronic Matter Based on an Event Generator Urasima
Sasaki, N; Miyamura, O.; Muroya, S.; Nonaka, C.
2000-01-01
We generate statistical ensembles in equilibrium at fixed temperature and chemical potential by imposing periodic boundary conditions on the simulation of URASiMA (Ultra-Relativistic AA collision Simulator based on Multiple Scattering Algorithm). Using the generated ensembles, we investigate the temperature and chemical potential dependence of the nucleon diffusion constant of dense and hot hadronic matter.
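The abstract does not state the estimator used; a standard way to extract a diffusion constant from an ensemble of trajectories is the Einstein relation D = ⟨|r(t) − r(0)|²⟩ / (2 d t), with d the spatial dimension. A minimal sketch on synthetic 1D random walks (all parameters are illustrative, unrelated to URASiMA):

```python
import random

def msd_diffusion_constant(trajectories, dt, dim=1):
    """Einstein relation: D = <|r(t) - r(0)|^2> / (2 * dim * t), estimated
    from the final displacement of an ensemble of trajectories."""
    n_steps = len(trajectories[0]) - 1
    t = n_steps * dt
    msd = sum((traj[-1] - traj[0]) ** 2 for traj in trajectories) / len(trajectories)
    return msd / (2 * dim * t)

rng = random.Random(3)
D_true, dt, n_steps = 0.5, 0.01, 1000
sigma = (2 * D_true * dt) ** 0.5  # per-step spread for 1D diffusion
trajectories = []
for _ in range(2000):
    x, traj = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        traj.append(x)
    trajectories.append(traj)
D_est = msd_diffusion_constant(trajectories, dt)
# Should recover D_true = 0.5 up to statistical error
```

In a real analysis one would fit the slope of the MSD over a range of times rather than use a single time, but the single-time estimator keeps the sketch short.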
Hall, Sara F.
2016-01-01
The controversial 2012 ZDF mini-series Unsere Mütter, unsere Väter/Generation War epitomizes German “historical event television,” a broadcasting trend aligned with the recent tendency to normalize the nation’s relationship with its past. Reaching beyond the borders of the narrative drama, the series’ producers created and promoted collateral content encouraging viewers to interact with the fictional characters and other audience members in such new media platforms as Facebook, a pu...