ATLAS Data Preparation in Run 2
Laycock, Paul; The ATLAS collaboration
2016-01-01
In this presentation, the data preparation workflows for Run 2 are presented. Online data quality uses a new hybrid software release that incorporates the latest offline data quality monitoring software for the online environment. This is used to provide fast feedback in the control room during a data acquisition (DAQ) run, via a histogram-based monitoring framework as well as the online Event Display. Data are sent to several streams for offline processing at the dedicated Tier-0 computing facility, including dedicated calibration streams and an "express" physics stream containing approximately 2% of the main physics stream. This express stream is processed as data arrive, allowing a first look at the offline data quality within hours of the end of a run. A prompt calibration loop starts once an ATLAS DAQ run ends, nominally defining a 48-hour period in which calibrations and alignments can be derived using the dedicated calibration and express streams. The bulk processing of the main physics stream starts on expi...
ATLAS Data Preparation in Run 2
Laycock, Paul; The ATLAS collaboration
2017-01-01
In this contribution, the data preparation workflows for Run 2 are presented. The challenges posed by the excellent performance and high live time fraction of the LHC are discussed, and the solutions implemented by ATLAS are described. The prompt calibration loop procedures are described and examples are given. Several levels of data quality assessment are used to quickly spot problems in the control room and prevent data loss, and to provide the final selection used for physics analysis. Finally the data quality efficiency for physics analysis is shown.
Preparations for p-Au run in 2015
Energy Technology Data Exchange (ETDEWEB)
Liu, C. [Brookhaven National Lab. (BNL), Upton, NY (United States). Collider-Accelerator Dept.
2014-12-31
The p-Au particle collision is a unique category of collision runs. This results from the different charge-to-mass ratios of the proton and the fully stripped Au ion (1 vs. 79/197). The p-Au run requires a special acceleration ramp and the movement of a number of beam components as required by the beam trajectories. The DX magnets will be moved for the first time in the history of RHIC. In this note, the planning and preparations for the p-Au run are presented.
Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).
Yang, Owen; Choi, Bernard
2013-01-01
To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs in order to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory is currently a limiting factor for GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach can still lead to processing that is ~3400 times faster than other GPU-based approaches.
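The rescaling idea described in this abstract can be illustrated with a minimal "white Monte Carlo" sketch. This is not the authors' MATLAB/CUDA package: the gamma-distributed path lengths and all numbers below are invented stand-ins for a real baseline simulation. Each detected photon's recorded path length is reweighted by a Beer-Lambert factor for a new absorption coefficient, so one baseline run serves many optical-property sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical baseline run: per detected photon, the total path length (cm)
# recorded in a single scattering-only ("white") Monte Carlo simulation.
path_lengths = rng.gamma(shape=4.0, scale=0.5, size=100_000)

def diffuse_reflectance(mu_a, paths, n_launched=1_000_000):
    """Rescale one baseline run to a new absorption coefficient mu_a (1/cm)
    by Beer-Lambert reweighting of each detected photon's path length."""
    weights = np.exp(-mu_a * paths)
    return weights.sum() / n_launched

# Sweep many absorption values without re-running the simulation.
mu_as = np.linspace(0.01, 1.0, 50)
refl = np.array([diffuse_reflectance(m, path_lengths) for m in mu_as])
```

Because every path length is positive, the rescaled reflectance decreases monotonically as absorption grows, which is a cheap sanity check on the reweighting.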
LHCb Vertex Locator: Performance and radiation damage in LHC Run 1 and preparation for Run 2
Szumlak, T.; Obłąkowska-Mucha, A.
2016-07-01
LHCb is a dedicated experiment to study New Physics in the decays of heavy hadrons at the Large Hadron Collider (LHC) at CERN. Heavy hadrons are identified through their flight distance in the Vertex Locator (VELO). The VELO comprises 42 modules made of two n+-on-n 300 μm thick half-disc silicon sensors with R- and Φ-measuring micro-strips. In order to allow retracting the detector, the VELO is installed as two movable halves containing 21 modules each. The detectors are operated in a secondary vacuum and are cooled by a bi-phase CO2 cooling system. During data taking in LHC Run 1 the LHCb VELO operated with extremely high efficiency and excellent performance. The track finding efficiency is typically greater than 98%. An impact parameter resolution of less than 35 μm is achieved for particles with transverse momentum greater than 1 GeV/c. An overview of all important performance parameters will be given. The VELO sensors have received a large and non-uniform radiation dose of up to 1.2 × 10¹⁴ 1 MeV neutron-equivalent cm⁻² during the first LHC run. Silicon type-inversion has been observed in regions close to the interaction point. The preparations for LHC Run 2 are well under way and the VELO has already recorded tracks from injection line tests. The current status and plans for new operational procedures addressing the non-uniform radiation damage are briefly discussed.
Analysis of probabilistic short run marginal cost using Monte Carlo method
Energy Technology Data Exchange (ETDEWEB)
Gutierrez-Alcaraz, G.; Navarrete, N.; Tovar-Hernandez, J.H.; Fuerte-Esquivel, C.R. [Inst. Tecnologico de Morelia, Michoacan (Mexico). Dept. de Ing. Electrica y Electronica; Mota-Palomino, R. [Inst. Politecnico Nacional (Mexico). Escuela Superior de Ingenieria Mecanica y Electrica
1999-11-01
The structure of the Electricity Supply Industry is undergoing dramatic changes to provide new service options. The main aim of this restructuring is to allow generating units the freedom of selling electricity to anybody they wish at a price determined by market forces. Several methodologies have been proposed in order to quantify different costs associated with the new services offered by electrical utilities operating under a deregulated market. The new wave of pricing is heavily influenced by economic principles designed to price products to elastic market segments on the basis of marginal costs. Hence, spot pricing provides the economic structure for many of the new services. At the same time, pricing is influenced by the uncertainties associated with the electric system state variables which define its operating point. In this paper, nodal probabilistic short run marginal costs are calculated, considering the load, the production cost and the availability of generators as random variables. The effect of the electrical network is evaluated using linearized models. A thermal economic dispatch is used to simulate each operational condition generated by the Monte Carlo method on a small fictitious power system in order to assess the effect of the random variables on energy trading. First, this is carried out by introducing each random variable one by one, and finally by considering the random interaction of all of them.
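The sampling loop described here can be sketched with a toy three-unit system. The capacities, costs, availability rates and load distribution below are invented, and the paper's linearized network model is omitted: each Monte Carlo draw performs a merit-order dispatch and records the cost of the marginal unit.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical three-unit system: capacity (MW), marginal cost ($/MWh),
# and availability probability; units are already in merit order.
cap  = np.array([100.0, 80.0, 60.0])
cost = np.array([10.0, 25.0, 40.0])
avail_prob = np.array([0.95, 0.90, 0.85])

def srmc_once():
    """One Monte Carlo draw: random load and unit availability, then a
    merit-order dispatch; the short-run marginal cost is the cost of the
    last (marginal) unit needed to serve the load."""
    load = rng.normal(150.0, 20.0)
    up = rng.random(3) < avail_prob
    served = 0.0
    for c, k, on in zip(cap, cost, up):
        if not on:
            continue
        served += c
        if served >= load:
            return k
    return np.inf  # load shedding: available units cannot serve the load

draws = np.array([srmc_once() for _ in range(10_000)])
finite = draws[np.isfinite(draws)]
print(f"mean SRMC = {finite.mean():.1f} $/MWh")
```

Histogramming `draws` gives the probabilistic marginal-cost distribution the abstract refers to, with the infinite entries marking loss-of-load states.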
Lower Three Runs Remediation Safety Preparation Strategy - 13318
Energy Technology Data Exchange (ETDEWEB)
Mackay, Alexander; Fryar, Scotty; Doane, Alan [United States Department of Energy, Building 730-B, Aiken, SC 29808 (United States)
2013-07-01
The Savannah River Site (SRS) is a 310-square-mile United States Department of Energy (US DOE) nuclear facility located along the Savannah River near Aiken, South Carolina, that contains six primary stream/river systems. The Lower Three Runs Stream (LTR) is one of the primary streams within the site and is located in the southeast portion of the Savannah River Site. It is a large blackwater stream system that originates in the northeast portion of SRS and follows a southerly direction before it enters the Savannah River. During reactor operations, secondary reactor cooling water, storm sewer discharges, and miscellaneous wastewater were discharged and contaminated a 20-mile stretch of Lower Three Runs Stream that narrows and provides a limited buffer of US DOE property along the stream and flood-plain. Based on data collected during 2009 and 2010 under American Recovery and Reinvestment Act funding, the stream was determined to be contaminated with cesium-137 at levels that exceeded acceptable risk-based limits. In agreement with the Environmental Protection Agency and the South Carolina Department of Health and Environmental Control, three areas were identified for remediation [1] (SRNS April 2012). A comprehensive safety preparation strategy was developed for safe execution of the LTR remediation project. Contract incentives for safety encouraged the contractor to perform a complete evaluation of the work and develop an implementation plan to perform the work. Safety coverage was controlled to ensure all work was observed and assessed by one person per work area within the project. This was necessary due to the distances between the fence work and the three transects being worked, approximately 20 miles. Contractor management field observations were performed along with DOE assessments to ensure contractor focus on safe performance of the work. Dedicated ambulance coverage for remote work activities was provided. This effort was augmented with
Cepeda, Jose; Luna, Byron Quan; Nadim, Farrokh
2013-04-01
An essential component of a quantitative landslide hazard assessment is establishing the extent of the endangered area. This task requires accurate prediction of the run-out behaviour of a landslide, which includes the estimation of the run-out distance, run-out width, velocities, pressures, and depth of the moving mass and the final configuration of the deposits. One approach to run-out modelling is to reproduce accurately the dynamics of the propagation processes. A number of dynamic numerical models are able to compute the movement of the flow over irregular topographic terrains (3-D) controlled by a complex interaction between mechanical properties that may vary in space and time. Given the number of unknown parameters and the fact that most of the rheological parameters cannot be measured in the laboratory or field, the parametrization of run-out models is very difficult in practice. For this reason, the application of run-out models is mostly used for back-analysis of past events and very few studies have attempted to achieve forward predictions. Consequently all models are based on simplified descriptions that attempt to reproduce the general features of the failed mass motion through the use of parameters (mostly controlling shear stresses at the base of the moving mass) which account for aspects not explicitly described or oversimplified. The uncertainties involved in the run-out process have to be approached in a stochastic manner. It is of significant importance to develop methods for quantifying and properly handling the uncertainties in dynamic run-out models, in order to allow a more comprehensive approach to quantitative risk assessment. A method was developed to compute the variation in run-out intensities by using a dynamic run-out model (MassMov2D) and a probabilistic framework based on a Monte Carlo simulation in order to analyze the effect of the uncertainty of input parameters. The probability density functions of the rheological parameters
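The stochastic treatment of run-out uncertainty can be illustrated with a deliberately simple stand-in for a dynamic model such as MassMov2D: an energy-line sliding block whose run-out distance depends on an uncertain friction angle. The drop height and the friction-angle distribution below are assumptions for illustration, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "energy line" estimate of run-out distance L = H / tan(phi), with the
# basal friction angle phi treated as a random variable (Monte Carlo draws
# from an assumed normal distribution, in degrees).
H = 200.0  # assumed drop height (m)
phi = np.radians(rng.normal(20.0, 3.0, 50_000))

runout = H / np.tan(phi)

# Empirical exceedance probability for a run-out threshold of interest.
p_exceed_700m = (runout > 700.0).mean()
print(f"P(run-out > 700 m) ≈ {p_exceed_700m:.3f}")
```

The same propagation pattern applies when the deterministic kernel is a full dynamic run-out model instead of this one-line formula: sample the uncertain inputs, run the model per sample, and build exceedance curves of the intensity measures.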
LHCb : First years of running for the LHCb calorimeter system and preparation for run 2
Chefdeville, Maximilien
2015-01-01
The LHCb experiment is dedicated to precision measurements of CP violation and rare decays of B hadrons at the Large Hadron Collider (LHC) at CERN (Geneva). It comprises a calorimeter system composed of four subdetectors: a Scintillating Pad Detector (SPD) and a Pre-Shower detector (PS) in front of an electromagnetic calorimeter (ECAL), which is followed by a hadron calorimeter (HCAL). They are used to select hadron, electron and photon candidates with high transverse energy for the first trigger level, and they provide the identification of electrons, photons and hadrons as well as the measurement of their energies and positions. The calorimeter was pre-calibrated before its installation in the pit. The calibration techniques were tested with data taken in 2010 and used regularly during Run 1. For Run 2, new calibration methods have been devised to follow and correct online the calorimeter detector response. The design and construction characteristics of the LHCb calorimeter will be recalled. Strategies for...
ALICE installs new hardware in preparation for the 2012 run
CERN Bulletin and ALICE Matters
2012-01-01
2011 was a fantastic year for the heavy-ion run at ALICE despite unprecedented challenges and difficult conditions. The data set collected is at least one order of magnitude larger than that of 2010. Thanks to a planned upgrade of two subdetectors during the 2011/2012 winter shutdown and a reorganisation of ALICE’s Physics Working Groups that should allow them to better deal with the greater challenges imposed by the LHC, the collaboration is confident that the 2012 run will allow ALICE to extend its physics reach and improve its performance. Photograph of ALICE taken by Antonio Saba during this year's winter shutdown. The annual winter shutdown has been a very intense period for the ALICE collaboration. In conjunction with the general maintenance, modifications and tests of the experiment, two major projects – the installation of 3 supermodules of the Transition Radiation Detector (TRD) and 2 supermodules of the Electromagnetic Calorimeter (EMCal) – hav...
The CMS muon system in Run2: preparation, status and first results
AUTHOR|(CDS)2072274
2015-01-01
The CMS muon system has played a key role in many physics results obtained from the LHC Run-1 data. During the Long Shutdown (2013-2014) significant upgrades were carried out on the muon detectors and on the L1 muon trigger. In parallel, the algorithms for muon reconstruction and identification have been improved for both the High-Level Trigger and the offline reconstruction. Results of studies performed on data and Monte Carlo simulations will be presented, with a focus on the improvements aiming to ensure excellent performance in the conditions of pileup multiplicity and bunch spacing expected during the high-luminosity phase of Run-2. Early muon performance results from LHC Run-2 will be shown.
The ATLAS collaboration
2016-01-01
This note documents the Monte Carlo generators used by the ATLAS collaboration at the start of Run 2 for processes where a $W$ or $Z/\\gamma^*$ boson is produced in association with jets. The available event generators are briefly described and comparisons are made with ATLAS measurements of $W$ or $Z/\\gamma^*$+jets performed with Run 1 data, collected at the centre-of-mass energy of 7 TeV. The model predictions are then compared at the Run 2 centre-of-mass energy of 13~TeV. A comparison is also made with an early Run 2 ATLAS $Z/\\gamma^*$+jets data measurement. Investigations into tuning the parameters of the models and evaluating systematic uncertainties on the Monte Carlo predictions are also presented.
CMS operations for Run II preparation and commissioning of the offline infrastructure
Cerminara, Gianluca
2016-01-01
The restart of the LHC coincided with an intense activity for the CMS experiment. Both at the beginning of Run II in 2015 and at the restart of operations in 2016, the collaboration was engaged in an extensive re-commissioning of the CMS data-taking operations. After the long stop, the detector was fully aligned and calibrated. Data streams were redesigned to fit the priorities dictated by the physics program for 2015 and 2016. New reconstruction software (both online and offline) was commissioned with early collisions and further developed during the year. A massive campaign of Monte Carlo production was launched to assist physics analyses. This presentation reviews the main events of this commissioning journey and describes the status of CMS physics performance for 2016.
Buist, I.; Bredeweg, S. W.; Bessem, B.; van Mechelen, W.; Lemmink, K. A. P. M.; Diercks, R. L.
2010-01-01
Objective In this study, the incidence and the sex-specific predictors of running-related injury (RRI) among a group of recreational runners training for a 4-mile running event were determined and identified, respectively. Design Prospective cohort study. Methods Several potential risk factors were
Nelson, Benjamin E; Payne, Matthew J
2013-01-01
In the 20+ years of Doppler observations of stars, scientists have uncovered a diverse population of extrasolar multi-planet systems. A common technique for characterizing the orbital elements of these planets is Markov chain Monte Carlo (MCMC), using a Keplerian model with random walk proposals and paired with the Metropolis-Hastings algorithm. For approximately a couple of dozen planetary systems with Doppler observations, there are strong planet-planet interactions due to the system being in or near a mean-motion resonance (MMR). An N-body model is often required to accurately describe these systems. Further computational difficulties arise from exploring a high-dimensional parameter space ($\\sim$7 x number of planets) that can have complex parameter correlations. To surmount these challenges, we introduce a differential evolution MCMC (DEMCMC) applied to radial velocity data while incorporating self-consistent N-body integrations. Our Radial velocity Using N-body DEMCMC (RUN DMC) algorithm improves upon t...
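The differential evolution proposal at the heart of a DEMCMC sampler can be sketched on a toy target: a strongly correlated 2-D Gaussian standing in for the correlated orbital parameters mentioned above. This follows the generic ter Braak-style scheme, not the RUN DMC code, and replaces the N-body likelihood with a closed-form one; each chain proposes a jump along the difference of two other randomly chosen chains.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: a correlated 2-D Gaussian (an assumed stand-in for a posterior
# with strong parameter correlations).
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
icov = np.linalg.inv(cov)

def log_post(x):
    return -0.5 * x @ icov @ x

n_chains, n_steps, d = 10, 2000, 2
gamma = 2.38 / np.sqrt(2 * d)   # standard DE-MC jump scale
chains = rng.normal(size=(n_chains, d))
logp = np.array([log_post(c) for c in chains])
samples = []

for step in range(n_steps):
    for i in range(n_chains):
        # Difference of two other chains defines the proposal direction,
        # so proposals automatically align with posterior correlations.
        a, b = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + rng.normal(0, 1e-4, d)
        lp = log_post(prop)
        if np.log(rng.random()) < lp - logp[i]:   # Metropolis acceptance
            chains[i], logp[i] = prop, lp
    if step >= 500:                               # discard burn-in
        samples.append(chains.copy())

samples = np.concatenate(samples)
print("sample correlation:", np.corrcoef(samples.T)[0, 1])
```

The recovered sample correlation should approach the target's 0.95, which is the property that makes difference-vector proposals attractive for the correlated, high-dimensional parameter spaces described in the abstract.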
Bayraktar, Başak; Özer Sözdinler, Ceren; Necmioǧlu, Öcal; Meral Özel, Nurcan
2017-04-01
The Marmara Sea and its surroundings are among the most populated areas in Turkey. Many densely populated cities, such as the megacity Istanbul with a population of more than 14 million, a great number of industrial facilities of the largest capacity and potential, refineries, ports and harbors are located along the coasts of the Marmara Sea. The region is highly seismically active. There has been a wide range of studies in this region regarding the fault mechanisms, seismic activities, earthquakes and triggered tsunamis in the Sea of Marmara. Historical documents reveal that the region has experienced many earthquakes and tsunamis in the past. According to Altinok et al. (2011), 35 tsunami events happened in the Marmara Sea between BC 330 and 1999. As earthquakes are expected in the Marmara Sea with the break of segments of the North Anatolian Fault (NAF) in the future, the region should be investigated in terms of the possibility of tsunamis generated by earthquakes with specific return periods. This study aims to perform a probabilistic tsunami hazard analysis for the Marmara Sea. For this purpose, the possible sources of tsunami scenarios are specified by compiling the earthquake catalogues, historical records and scientific studies conducted in the region. After compiling all these data, a synthetic earthquake and tsunami catalogue is prepared using Monte Carlo simulations. For specific return periods, the possible epicenters, rupture lengths, widths and displacements are determined with Monte Carlo simulations, assuming the angles of the fault segments as deterministic. For each earthquake of the synthetic catalogue, the tsunami wave heights will be calculated at specific locations along the Marmara Sea. As a further objective, this study will determine the tsunami hazard curves for specific locations in the Marmara Sea, including the tsunami wave heights and their probability of exceedance. This work is supported by the SATREPS-MarDim Project (Earthquake and Tsunami Disaster Mitigation in the
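The Monte Carlo construction of a synthetic catalogue might look like the following sketch: magnitudes drawn from a doubly truncated Gutenberg-Richter law by inverse-CDF sampling, with epicenters placed along an assumed fault trace. The b-value, magnitude bounds and fault length are illustrative assumptions, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(11)

# Assumed Gutenberg-Richter parameters for a fault segment.
b, m_min, m_max = 1.0, 6.0, 7.4
beta = b * np.log(10.0)

def sample_magnitudes(n):
    """Inverse-CDF sampling of the doubly truncated exponential G-R law."""
    u = rng.random(n)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

mags = sample_magnitudes(100_000)
# Epicenter position drawn uniformly along an assumed 120 km fault trace.
along_fault_km = rng.uniform(0.0, 120.0, mags.size)

print(f"fraction with M >= 7.0: {(mags >= 7.0).mean():.3f}")
```

In a full analysis, each synthetic event would then be assigned a rupture geometry via magnitude-scaling relations and fed to a tsunami propagation model to accumulate the wave-height exceedance statistics the abstract describes.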
Stangier, Carolin; Abel, Thomas; Hesse, Clemens; Claen, Stephanie; Mierau, Julia; Hollmann, Wildor; Strüder, Heiko K
2016-06-01
Winter weather conditions restrict regular sport-specific endurance training in inline speed skating. As a result, this study was designed to compare the effects of cycling and running training programs on inline speed skaters' endurance performance. Sixteen (8 men, 8 women) high-level athletes (mean ± SD 24 ± 8 years) were randomly assigned to 1 of 2 groups (running and cycling). Both groups trained twice a week for 8 weeks, one group on a treadmill and the other on a cycle ergometer. Training intensity and duration were individually calculated (maximal fat oxidation: ∼52% of V̇O2peak; 500 kcal per session). Before and after the training intervention, all athletes performed an incremental specific (inline speed skating) and 1 nonspecific (cycling or running) step test according to group affiliation. In addition to blood lactate concentration, oxygen uptake (V̇O2), ventilatory equivalent (VE/V̇O2), respiratory exchange ratio (RER), and heart rate were measured. The specific posttest revealed significantly increased absolute V̇O2peak values (2.9 ± 0.4, 3.4 ± 0.7, p = 0.01) and submaximal V̇O2 values (p ≤ 0.01). VE/V̇O2 and RER significantly decreased at maximal (46.6 ± 6.6, 38.5 ± 3.4, p = 0.005; 1.1 ± 0.03, 1.0 ± 0.04, p = 0.001) and submaximal intensities (p ≤ 0.04). None of the analyses revealed a significant group effect (p ≥ 0.15). The results indicate that both cycling and running exercise at ∼52% of V̇O2peak had a positive effect on the athletes' endurance performance. The increased submaximal V̇O2 values indicate a reduction in athletes' inline speed skating technique. Therefore, athletes would benefit from a focus on technique training in the subsequent period.
Energy Technology Data Exchange (ETDEWEB)
Cacais, F.L.; Delgado, J.U., E-mail: facacais@gmail.com [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Loayza, V.M. [Instituto Nacional de Metrologia (INMETRO), Rio de Janeiro, RJ (Brazil). Qualidade e Tecnologia
2016-07-01
In preparing solutions for the production of radionuclide metrology standards, it is necessary to measure the quantity activity by mass. The gravimetric method by elimination is applied to perform weighings with smaller uncertainties. In this work, the uncertainty calculation approach implemented by Lourenco and Bobin according to the ISO GUM for the elimination method is validated by the Monte Carlo method. The results obtained by both uncertainty calculation methods were consistent, indicating that the conditions for the application of the ISO GUM in the preparation of radioactive standards were fulfilled. (author)
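The validation strategy, comparing an ISO GUM uncertainty budget against a Monte Carlo propagation in the spirit of JCGM 101, can be sketched for a simple mass-by-elimination model; the mass values and weighing uncertainties below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Mass of a dispensed aliquot by elimination: m = m_before - m_after.
# Invented readings (g) and standard uncertainties (g) for the two weighings.
m_before, u_before = 10.012345, 2e-5
m_after,  u_after  =  9.001234, 2e-5

# ISO GUM: the model is linear, so the combined standard uncertainty is
# the root sum of squares of the input uncertainties.
u_gum = np.hypot(u_before, u_after)

# Monte Carlo (JCGM 101 style): sample the inputs, take the standard
# deviation of the resulting output distribution.
n = 1_000_000
m = rng.normal(m_before, u_before, n) - rng.normal(m_after, u_after, n)
u_mc = m.std(ddof=1)

print(f"u_GUM = {u_gum:.2e} g, u_MC = {u_mc:.2e} g")
```

For a linear model with Gaussian inputs the two approaches must agree, which is exactly the kind of consistency the abstract reports; the Monte Carlo route additionally remains valid when the model is nonlinear or the inputs are non-Gaussian.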
Cabass, Giovanni; Melchiorri, Alessandro; Pajer, Enrico; Silk, Joseph
2016-01-01
We use the recent observations of Cosmic Microwave Background temperature and polarization anisotropies provided by the Planck satellite experiment to place constraints on the running $\\alpha_\\mathrm{s} = \\mathrm{d}n_{\\mathrm{s}} / \\mathrm{d}\\log k$ and the running of the running $\\beta_{\\mathrm{s}} = \\mathrm{d}\\alpha_{\\mathrm{s}} / \\mathrm{d}\\log k$ of the spectral index $n_{\\mathrm{s}}$ of primordial scalar fluctuations. We find $\\alpha_\\mathrm{s}=0.011\\pm0.010$ and $\\beta_\\mathrm{s}=0.027\\pm0.013$ at $68\\%\\,\\mathrm{CL}$, suggesting the presence of a running of the running at the level of two standard deviations. We find no significant correlation between $\\beta_{\\mathrm{s}}$ and foregrounds parameters, with the exception of the point sources amplitude at $143\\,\\mathrm{GHz}$, $A^{PS}_{143}$, which shifts by half sigma when the running of the running is considered. We further study the cosmological implications of this anomaly by including in the analysis the lensing amplitude $A_L$, the curvature parameter ...
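For reference, the quantities being constrained enter through the standard expansion of the primordial scalar power spectrum about the pivot scale $k_*$ (the usual parametrization, stated here for the reader rather than taken from this contribution):

$$\ln P_\zeta(k) = \ln A_\mathrm{s} + \left[ n_\mathrm{s} - 1 + \frac{\alpha_\mathrm{s}}{2} \ln\frac{k}{k_*} + \frac{\beta_\mathrm{s}}{6} \ln^2\frac{k}{k_*} \right] \ln\frac{k}{k_*},$$

so that $\alpha_\mathrm{s} = \mathrm{d}n_\mathrm{s}/\mathrm{d}\log k$ and $\beta_\mathrm{s} = \mathrm{d}\alpha_\mathrm{s}/\mathrm{d}\log k$, evaluated at $k = k_*$.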
Energy Technology Data Exchange (ETDEWEB)
Brown, F.B.; Sutton, T.M.
1996-02-01
This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
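Two of the fundamentals listed, random sampling by inversion and a simple tally, can be condensed into a short example unrelated to the RACER code itself: photons crossing a purely absorbing slab, where the transmitted fraction has the closed form exp(-Σt) against which the tally can be checked. The cross section and thickness are assumed values.

```python
import numpy as np

rng = np.random.default_rng(5)

sigma = 1.0      # macroscopic cross section (1/cm), assumed value
thickness = 2.0  # slab thickness (cm), assumed value

n = 1_000_000
# Inversion of the exponential free-flight CDF: s = -ln(xi)/sigma,
# with xi uniform on (0, 1].
flight = -np.log1p(-rng.random(n)) / sigma

# Tally: fraction of particles whose first flight exceeds the slab.
transmitted = (flight > thickness).mean()

print(f"MC: {transmitted:.4f}  analytic: {np.exp(-sigma * thickness):.4f}")
```

Adding scattering, geometry and weight-based variance reduction on top of this skeleton leads to the full transport algorithms the course covers.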
Latest LHCf results and preparation to the LHC run for 13 TeV proton–proton interactions
Directory of Open Access Journals (Sweden)
Bonechi L.
2015-01-01
The LHCf experiment is a CERN experiment dedicated to forward physics which is optimized to measure the neutral particle flow at extreme pseudo-rapidity values, ranging from 8.4 up to infinity. LHCf results are extremely important for the calibration of the hadronic interaction models used for the study of the development of atmospheric showers in the Earth's atmosphere. Starting from the recent run of proton-lead nucleus interactions at the LHC, the LHCf and ATLAS collaborations have performed a common data taking which allows a combined study of the central and forward regions of the interaction. The latest results of LHCf, the upgrade of the detectors for the next 6.5 TeV + 6.5 TeV proton–proton run and the status of the LHCf-ATLAS common activities are summarized in this paper.
Use of altitude training in the annual Olympic preparation cycle for middle-distance running
Directory of Open Access Journals (Sweden)
Samolenko T.V.
2012-03-01
A variant of the use of altitude training for athletes is presented. The basic approaches to planning training in the annual preparation cycle are outlined, and the balance of training means when combining training at middle and high altitude is considered. Four altitude training camps are recommended in the year of the Olympic Games, with the following orientation: introductory (middle altitude); intensive and strength (high altitude); stabilising, against a background of sufficient intensity (high altitude); and volume and intensity (middle altitude). The recommended ratio of loads of different orientation takes into account the individual features of each athlete's training.
Directory of Open Access Journals (Sweden)
Samolenko T.V.
2011-07-01
The article deals with important aspects of the sports training of distance runners in the Olympic cycle, based on an analysis of the construction of the phase of immediate preparation for major competitions. The magnitudes of the training means and the variants of their ratios over the years of the Olympic cycle are shown, and indicators of the individual variability of the investigated load parameters are identified.
AUTHOR|(CDS)2068005
2016-01-01
Commissioning with low-intensity beams helps prepare CMS for this year’s physics run. This event is one of the first low-intensity collisions recorded in the CMS detector, during the early hours of 23 April 2016
Costa, Ricardo J S; Crockford, Michael J; Moore, Jonathan P; Walsh, Neil P
2014-01-01
Heat acclimation induces adaptations that improve exercise tolerance in hot conditions. Here we report novel findings on the effects of ultra-marathon specific exercise load in increasingly hot ambient conditions on indices of heat acclimation. Six male ultra-endurance runners completed a standard pre-acclimation protocol at 20°C ambient temperature (T amb), followed by a heat acclimation protocol consisting of six 2 h running exercise-heat exposures (EH) at 60% VO2max on a motorised treadmill in an environmental chamber. Three EH were performed at 30°C T amb, followed by another three EH at 35°C T amb. EH were separated by 48 h within T amb and 72 h between T amb. Nude body mass (NBM), blood and urine samples were collected pre-exercise, while NBM and urine were collected post-exercise. Rectal temperature (T re), heart rate (HR), thermal comfort rating (TCR) and rating of perceived exertion were measured pre-exercise and monitored every 5 min during exercise. Water was provided ad libitum during exercise. Data were analysed using repeated measures and one-way analysis of variance (ANOVA), with post hoc Tukey's HSD. Significance was accepted as P < 0.05. The protocol induced heat acclimation in all ultra-endurance runners. Further, heat acclimation responses occurred with increasing EH to 35°C T amb. Preventing exertional heat illnesses and optimising performance outcomes in ultra-endurance runners may occur with exposure to at least 2 h of exercise-heat stress on at least two occasions in the days leading up to multi-stage ultra-marathon competition in the heat.
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Monte Carlo integration on GPU
Kanzaki, J.
2010-01-01
We use a graphics processing unit (GPU) for fast computations of Monte Carlo integrations. Two widely used Monte Carlo integration programs, VEGAS and BASES, are parallelized on the GPU. Using $W^{+}$ plus multi-gluon production processes at the LHC, we test integrated cross sections and execution times for programs in FORTRAN and C on the CPU and those on the GPU. Integrated results agree with each other within statistical errors. The programs on the GPU run about 50 times faster than those in C...
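For contrast with the adaptive importance sampling that VEGAS performs, plain Monte Carlo integration with its statistical error estimate fits in a few lines. NumPy vectorisation stands in for the GPU parallelism here, and the integrand is a standard test case rather than one of the paper's physics processes.

```python
import numpy as np

rng = np.random.default_rng(9)

def mc_integrate(f, n):
    """Plain (non-adaptive) Monte Carlo estimate of ∫₀¹ f(x) dx with the
    usual one-standard-deviation statistical error estimate."""
    x = rng.random(n)                  # uniform samples on [0, 1)
    fx = f(x)
    mean = fx.mean()
    err = fx.std(ddof=1) / np.sqrt(n)  # standard error of the mean
    return mean, err

# Test integrand with a known integral: ∫₀¹ 4/(1+x²) dx = π.
est, err = mc_integrate(lambda x: 4.0 / (1.0 + x * x), 2_000_000)
print(f"estimate = {est:.5f} ± {err:.5f} (exact {np.pi:.5f})")
```

VEGAS improves on this baseline by adaptively concentrating samples where the integrand is large, shrinking `err` for the same sample count; the GPU work in the paper parallelizes the many independent integrand evaluations, which is why vectorised evaluation is the natural analogy.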
Dalheimer, Matthias Kalle
2006-01-01
The fifth edition of Running Linux is greatly expanded, reflecting the maturity of the operating system and the teeming wealth of software available for it. Hot consumer topics such as audio and video playback applications, groupware functionality, and spam filtering are covered, along with the basics in configuration and management that always made the book popular.
C. Delaere
2013-01-01
Since the LHC ceased operations in February, a lot has been going on at Point 5, and Run Coordination continues to monitor closely the advance of maintenance and upgrade activities. In the last months, the Pixel detector was extracted and is now stored in the pixel lab in SX5; the beam pipe has been removed and ME1/1 removal has started. We regained access to the vactank and some work on the RBX of HB has started. Since mid-June, electricity and cooling are back in S1 and S2, allowing us to turn equipment back on, at least during the day. 24/7 shifts are not foreseen in the next weeks, and safety tours are mandatory to keep equipment on overnight, but re-commissioning activities are slowly being resumed. Given the (slight) delays accumulated in LS1, it was decided to merge the two global runs initially foreseen into a single exercise during the week of 4 November 2013. The aim of the global run is to check that we can run (parts of) CMS after several months switched off, with the new VME PCs installed, th...
Martin, A.; Skeie, D.; von Thadden, E.L.
2010-01-01
This paper develops a model of financial institutions that borrow short-term and invest in long-term marketable assets. Because these financial intermediaries perform maturity transformation, they are subject to runs. We endogenize the profits of the intermediary and derive distinct liquidity and
Running Club
2011-01-01
The cross country running season has started well this autumn with two events: the traditional CERN Road Race organized by the Running Club, which took place on Tuesday 5th October, followed by the ‘Cross Interentreprises’, a team event at the Evaux Sports Center, which took place on Saturday 8th October. The participation at the CERN Road Race was slightly down on last year, with 65 runners; however, the participants maintained the tradition of a competitive yet friendly atmosphere. An ample supply of refreshments before the prize giving was appreciated by all after the race. Many thanks to all the runners and volunteers who ensured another successful race. The results can be found here: https://espace.cern.ch/Running-Club/default.aspx CERN participated successfully at the Cross Interentreprises with very good results. The teams succeeded in obtaining 2nd and 6th place in the Men's category, and 2nd place in the Mixed category. Congratulations to all. See results here: http://www.c...
M. Chamizo
2012-01-01
On 17th January, as soon as the services were restored after the technical stop, sub-systems started powering on. Since then, we have been running 24/7 with reduced shift crew — Shift Leader and DCS shifter — to allow sub-detectors to perform calibration, noise studies, test software upgrades, etc. On 15th and 16th February, we had the first Mid-Week Global Run (MWGR) with the participation of most sub-systems. The aim was to bring CMS back to operation and to ensure that we could run after the winter shutdown. All sub-systems participated in the readout and the trigger was provided by a fraction of the muon systems (CSC and the central RPC wheel). The calorimeter triggers were not available due to work on the optical link system. Initial checks of different distributions from Pixels, Strips, and CSC confirmed things look all right (signal/noise, number of tracks, phi distribution…). High-rate tests were done to test the new CSC firmware to cure the low efficiency ...
Christophe Delaere
2013-01-01
The focus of Run Coordination during LS1 is to monitor closely the advance of maintenance and upgrade activities, to smooth interactions between subsystems and to ensure that all are ready in time to resume operations in 2015 with a fully calibrated and understood detector. After electricity and cooling were restored to all equipment, at about the time of the last CMS week, recommissioning activities were resumed for all subsystems. On 7 October, DCS shifts began 24/7 to allow subsystems to remain on to facilitate operations. That culminated with the Global Run in November (GriN), which took place as scheduled during the week of 4 November. The GriN has been the first centrally managed operation since the beginning of LS1, and involved all subdetectors but the Pixel Tracker presently in a lab upstairs. All nights were therefore dedicated to long stable runs with as many subdetectors as possible. Among the many achievements in that week, three items may be highlighted. First, the Strip...
G. Rakness.
2013-01-01
After three years of running, in February 2013 the era of sub-10-TeV LHC collisions drew to an end. Recall, the 2012 run had been extended by about three months to achieve the full complement of high-energy and heavy-ion physics goals prior to the start of Long Shutdown 1 (LS1), which is now underway. The LHC performance during these exciting years was excellent, delivering a total of 23.3 fb⁻¹ of proton-proton collisions at a centre-of-mass energy of 8 TeV, 6.2 fb⁻¹ at 7 TeV, and 5.5 pb⁻¹ at 2.76 TeV. They also delivered 170 μb⁻¹ of lead-lead collisions at 2.76 TeV/nucleon and 32 nb⁻¹ of proton-lead collisions at 5 TeV/nucleon. During these years the CMS operations teams and shift crews made tremendous strides to commission the detector, repeatedly stepping up to meet the challenges at every increase of instantaneous luminosity and energy. Although it does not fully cover the achievements of the teams, a way to quantify their success is the fact that...
Energy Technology Data Exchange (ETDEWEB)
Cullen, D E
1998-11-22
TART98 is a coupled neutron-photon, 3 Dimensional, combinatorial geometry, time dependent Monte Carlo radiation transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART98 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART98 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART98 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART98 and its data files.
Energy Technology Data Exchange (ETDEWEB)
Dang, Phuong; Koeneke, Karsten; Jakobs, Karl [Institute of Physics, University of Freiburg (Germany); Collaboration: ATLAS-Collaboration
2015-07-01
After a successful first run with 7-8 TeV center-of-mass energy, the Large Hadron Collider (LHC) is currently being upgraded for the next run with 13-14 TeV. At the same time, the ATLAS experiment is also developing its detectors and analysis framework for the second run. A new optimisation of the selection criteria of the gluon-gluon-fusion induced H→WW*→lνlν process is presented, considering the significant changes of the production cross-sections of signal and background processes, as well as changed pileup and detector conditions. These studies are based on the new simulation and the new software that was developed over the last two years.
TART97 a coupled neutron-photon 3-D, combinatorial geometry Monte Carlo transport code
Energy Technology Data Exchange (ETDEWEB)
Cullen, D.E.
1997-11-22
TART97 is a coupled neutron-photon, 3 Dimensional, combinatorial geometry, time dependent Monte Carlo transport code. This code can run on any modern computer. It is a complete system to assist you with input preparation, running Monte Carlo calculations, and analysis of output results. TART97 is also incredibly FAST; if you have used similar codes, you will be amazed at how fast this code is compared to other similar codes. Use of the entire system can save you a great deal of time and energy. TART97 is distributed on CD. This CD contains on-line documentation for all codes included in the system, the codes configured to run on a variety of computers, and many example problems that you can use to familiarize yourself with the system. TART97 completely supersedes all older versions of TART, and it is strongly recommended that users only use the most recent version of TART97 and its data files.
Running Club
2010-01-01
The 2010 edition of the annual CERN Road Race will be held on Wednesday 29th September at 18h. The 5.5km race takes place over 3 laps of a 1.8 km circuit in the West Area of the Meyrin site, and is open to everyone working at CERN and their families. There are runners of all speeds, with times ranging from under 17 to over 34 minutes, and the race is run on a handicap basis, by staggering the starting times so that (in theory) all runners finish together. Children (< 15 years) have their own race over 1 lap of 1.8km. As usual, there will be a “best family” challenge (judged on best parent + best child). Trophies are awarded in the usual men’s, women’s and veterans’ categories, and there is a challenge for the best age/performance. Every adult will receive a souvenir prize, financed by a registration fee of 10 CHF. Children enter free (each child will receive a medal). More information, and the online entry form, can be found at http://cern.ch/club...
Christophe Delaere
2012-01-01
On Wednesday 14 March, the machine group successfully injected beams into LHC for the first time this year. Within 48 hours they managed to ramp the beams to 4 TeV and proceeded to squeeze to β*=0.6m, settings that are used routinely since then. This brought to an end the CMS Cosmic Run at ~Four Tesla (CRAFT), during which we collected 800k cosmic ray events with a track crossing the central Tracker. That sample has been since then topped up to two million, allowing further refinements of the Tracker Alignment. The LHC started delivering the first collisions on 5 April with two bunches colliding in CMS, giving a pile-up of ~27 interactions per crossing at the beginning of the fill. Since then the machine has increased the number of colliding bunches to reach 1380 bunches and peak instantaneous luminosities around 6.5E33 at the beginning of fills. The average bunch charges reached ~1.5E11 protons per bunch which results in an initial pile-up of ~30 interactions per crossing. During the ...
C. Delaere
2012-01-01
With the analysis of the first 5 fb⁻¹ culminating in the announcement of the observation of a new particle with mass of around 126 GeV/c², the CERN directorate decided to extend the LHC run until February 2013. This adds three months to the original schedule. Since then the LHC has continued to perform extremely well, and the total luminosity delivered so far this year is 22 fb⁻¹. CMS also continues to perform excellently, recording data with efficiency higher than 95% for fills with the magnetic field at nominal value. The highest instantaneous luminosity achieved by LHC to date is 7.6×10³³ cm⁻²s⁻¹, which translates into 35 interactions per crossing. On the CMS side there has been a lot of work to handle these extreme conditions, such as a new DAQ computer farm and trigger menus to handle the pile-up, automation of recovery procedures to minimise the lost luminosity, better training for the shift crews, etc. We did suffer from a couple of infrastructure ...
Directory of Open Access Journals (Sweden)
M.M. Dardir
2014-03-01
Some hexanamide mono- and di-linoleniate esters were prepared by the reaction of linolenic acid and hexanamide (derived from the reaction of hexanoic acid and diethanolamine). The chemical structures of the newly prepared hexanamide mono- and di-linoleniate esters were elucidated using elemental analysis, FTIR, ¹H NMR, and chemical ionization mass spectrometry (CI/MS). The results of the spectroscopic analysis indicated that they were prepared by the intended route and have high purity. The newly prepared esters have high biodegradability and low toxicity (environmentally friendly), so they were evaluated as a synthetic-based mud (ester-based mud) for oil-well drilling fluids. The evaluation included study of the rheological, filtration, and thermal properties of the ester-based muds formulated with the newly prepared esters, compared to a reference commercial synthetic-based mud.
Fast quantum Monte Carlo on a GPU
Lutsyshyn, Y
2013-01-01
We present a scheme for the parallelization of quantum Monte Carlo on graphics processing units, focusing on bosonic systems and variational Monte Carlo. We use asynchronous execution schemes with shared memory persistence, and obtain an excellent acceleration. Compared with single-core execution, the GPU-accelerated code runs over 100 times faster. The CUDA code is provided along with the package that is necessary to execute variational Monte Carlo for a system representing liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including Fermi GTX560 and M2090, and the latest Kepler-architecture K20 GPU. Kepler-specific optimization is discussed.
Scalable Domain Decomposed Monte Carlo Particle Transport
Energy Technology Data Exchange (ETDEWEB)
O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)
2013-12-05
In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.
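The independence of particle histories that makes this parallelization possible can be illustrated with a toy model. This is an assumed 1D pure-absorber slab-transmission problem, not the transport problems of the dissertation: each "rank" draws from its own seeded stream, and the partial tallies simply add, with no communication needed between histories.

```python
import numpy as np

def transmit_count(n_hist, sigma_t=1.0, thickness=2.0, seed=0):
    # each history: sample the distance to first collision (exponential with
    # mean free path 1/sigma_t); in a pure absorber the particle "transmits"
    # if that distance exceeds the slab thickness
    rng = np.random.default_rng(seed)
    d = rng.exponential(1.0 / sigma_t, n_hist)
    return np.count_nonzero(d > thickness)

ranks, per_rank = 4, 25_000
# embarrassingly parallel: each rank owns an independent seeded stream, so
# the loop below could run on separate processors and the counts be reduced
counts = [transmit_count(per_rank, seed=r) for r in range(ranks)]
frac = sum(counts) / (ranks * per_rank)
# analytic answer for a pure absorber: exp(-sigma_t * thickness) ~ 0.1353
```

Domain decomposition complicates this picture precisely because a history may leave its rank's spatial domain and must then be communicated, which is where the dissertation's algorithms come in.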
Changes in running economy following downhill running.
Chen, Trevor C; Nosaka, Kazunori; Tu, Jui-Hung
2007-01-01
In this study, we examined the time course of changes in running economy following a 30-min downhill (-15%) run at 70% peak aerobic power (VO2peak). Ten young men performed level running at 65, 75, and 85% VO2peak (5 min for each intensity) before, immediately after, and 1 - 5 days after the downhill run, at which times oxygen consumption (VO2), minute ventilation, the respiratory exchange ratio (RER), heart rate, ratings of perceived exertion (RPE), and blood lactate concentration were measured. Stride length, stride frequency, and range of motion of the ankle, knee, and hip joints during the level runs were analysed using high-speed (120-Hz) video images. Downhill running induced reductions (7 - 21%, P < 0.05) in muscle function lasting several days after the run. Oxygen consumption increased (4 - 7%, P < 0.05), together with increases in stride frequency, as well as reductions in stride length and range of motion of the ankle and knee. The results suggest that changes in running form and compromised muscle function due to muscle damage contribute to the reduction in running economy for 3 days after downhill running.
Directory of Open Access Journals (Sweden)
2008-06-01
One of the new features of Python 2.5 is the introduction of ctypes as a standard library module. At the simplest level, ctypes adds the standard C types to Python: signed and unsigned bytes, shorts, ints and longs; as well as structs, unions, pointers and functions. At run time it can load a shared library (DLL) and import its symbols, allowing a Python application to make function calls into the library without any special preparation. ctypes can be used to wrap native libraries in place of interface generators such as SWIG, to manipulate memory and Python objects at the lowest level, and to prototype application development in other languages.
This paper begins with a quick introduction to ctypes, shows some advanced techniques, and describes some examples of how it has been used by the author in his recent work.
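The workflow described above can be shown in a minimal sketch, assuming a Unix-like system where `ctypes.util.find_library("c")` resolves the C library: load a shared library at run time, declare a function's C signature, and call it directly from Python.

```python
import ctypes
import ctypes.util

# load the C runtime library at run time (no compilation step needed)
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# declare the C signature of strlen: size_t strlen(const char *s)
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

# call straight into the library with a Python bytes object
n = libc.strlen(b"hello ctypes")  # -> 12
```

Declaring `argtypes`/`restype` is optional for simple cases but catches type errors early and is the ctypes analogue of the interface files an interface generator would produce.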
The CDF Run 2 Offline Computer Farms
Institute of Scientific and Technical Information of China (English)
Jaroslav Antos; Tanya Levshina; et al.
2001-01-01
Run 2 at Fermilab began in March 2001. CDF will collect data at a maximum rate of 20 MByte/sec during the run. The offline reconstruction of this data must keep up with the data-taking rate. This reconstruction occurs on a large PC farm, which must have the capacity for quasi-real-time data reconstruction, for reprocessing of some data, and for generation and processing of Monte Carlo samples. In this paper we give the design requirements for the farm, describe the hardware and software design used to meet those requirements, describe the early experiences with Run 2 data processing, and discuss future prospects for the farm, including some ideas about Run 2b processing.
Can Unshod Running Reduce Running Injuries?
2012-06-08
When quadrupeds run, their internal organs expand and contract like an accordion as they stride. As a cheetah strides forward, its lungs expand and take in air. When the cheetah compresses its stride, the lungs are collapsed and the cheetah breathes out. This take-a-step and take-a-breath
Iba, Yukito
2000-01-01
``Extended Ensemble Monte Carlo'' is a generic term that indicates a set of algorithms which are now popular in a variety of fields in physics and statistical information processing. Exchange Monte Carlo (Metropolis-Coupled Chain, Parallel Tempering), Simulated Tempering (Expanded Ensemble Monte Carlo), and Multicanonical Monte Carlo (Adaptive Umbrella Sampling) are typical members of this family. Here we give a cross-disciplinary survey of these algorithms with special emphasis on the great f...
Biomechanics of Distance Running.
Cavanagh, Peter R., Ed.
Contributions from researchers in the field of running mechanics are included in the 13 chapters of this book. The following topics are covered: (1) "The Mechanics of Distance Running: A Historical Perspective" (Peter Cavanagh); (2) "Stride Length in Distance Running: Velocity, Body Dimensions, and Added Mass Effects" (Peter Cavanagh, Rodger…
Institute of Scientific and Technical Information of China (English)
唐晓东; 刘亮; 税蕾蕾
2005-01-01
Diesel hydrodesulfurization technology involves large investment, high operating cost, and severe operating conditions, while H2O2 oxidative desulfurization of diesel has such problems as high oxidizer cost, a non-regenerable oxidizer, and treatment of sour water. A new catalytic oxidative desulfurization method for straight-run diesel is presented in this paper. In order to produce low-sulfur diesel, the sulfides in diesel oil were oxidized and converted into polar sulfones with homogeneous catalysts and air as the oxidizer, and then removed by an extractant. The homogeneous catalysts were prepared by compound decomposition. The catalysts selected could dissolve in diesel at a given temperature and separate out at a lower temperature. The catalytic oxidation effects of zinc benzoate, manganese benzoate and manganese phthalate were tested. The desulfurization effect of zinc benzoate and manganese benzoate was much better, and the sulfur content of the desulfurized diesel met the Euro II diesel standard (<300 μg·g⁻¹).
A brief introduction to Monte Carlo simulation.
Bonate, P L
2001-01-01
Simulation affects our life every day through our interactions with the automobile, airline and entertainment industries, just to name a few. The use of simulation in drug development is relatively new, but its use is increasing in relation to the speed at which modern computers run. One well known example of simulation in drug development is molecular modelling. Another use of simulation that is being seen recently in drug development is Monte Carlo simulation of clinical trials. Monte Carlo simulation differs from traditional simulation in that the model parameters are treated as stochastic or random variables, rather than as fixed values. The purpose of this paper is to provide a brief introduction to Monte Carlo simulation methods.
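The distinguishing feature mentioned above, model parameters treated as random variables rather than fixed values, can be sketched as follows. This is a hypothetical one-compartment drug-elimination model with purely illustrative numbers, not an example taken from the paper:

```python
import numpy as np

# one-compartment model: C(t) = (dose/V) * exp(-(CL/V) * t)
# In a traditional simulation CL would be a fixed value; in a Monte Carlo
# simulation it is drawn from a distribution, one value per simulated subject.
rng = np.random.default_rng(42)
dose, V, t = 100.0, 10.0, 4.0                 # mg, L, h (illustrative)
CL = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=10_000)  # L/h, stochastic
conc = (dose / V) * np.exp(-(CL / V) * t)     # concentration at time t per subject

mean_conc = conc.mean()                       # population-average outcome
p95 = np.percentile(conc, 95)                 # spread across the population
```

The output is a distribution of outcomes rather than a single number, which is exactly what makes the approach useful for clinical-trial simulation.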
Bardenet, R.
2012-01-01
ISBN:978-2-7598-1032-1; International audience; Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which rejection sampling, importance sampling and Monte Carlo Markov chain (MCMC) methods. We give intuition on the theoretic...
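One member of the reviewed family, a random-walk Metropolis sampler (among the simplest MCMC methods), can be sketched as follows; the target density and tuning values here are illustrative choices, not examples from the review:

```python
import numpy as np

def metropolis(log_target, x0, n_steps, step, rng):
    # random-walk Metropolis: propose x' = x + step*N(0,1), accept with
    # probability min(1, pi(x')/pi(x)); rejected proposals repeat the state
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

rng = np.random.default_rng(1)
# target: standard normal, so log pi(z) = -z^2/2 up to a constant
chain = metropolis(lambda z: -0.5 * z * z, x0=0.0,
                   n_steps=50_000, step=1.0, rng=rng)
```

Integrals against the posterior are then estimated by averaging over the chain (after discarding an initial burn-in segment), which is the numerical integration task the abstract describes.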
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle problem"...
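The "Buffon's needle problem" mentioned at the end has a compact Monte Carlo form: a needle of length L dropped on a floor ruled with parallel lines a distance d apart (L ≤ d) crosses a line with probability 2L/(πd), so π can be estimated from the observed crossing frequency. A sketch:

```python
import numpy as np

def buffon_pi(n, L=1.0, d=1.0, rng=None):
    # simulate n needle drops: y is the distance from the needle's centre to
    # the nearest line, theta the acute angle between needle and lines
    if rng is None:
        rng = np.random.default_rng(0)
    y = rng.uniform(0, d / 2, n)
    theta = rng.uniform(0, np.pi / 2, n)
    # the needle crosses a line iff y <= (L/2) sin(theta)
    hits = np.count_nonzero(y <= (L / 2) * np.sin(theta))
    # P(cross) = 2L/(pi*d)  =>  pi ~ 2*L*n / (d*hits)
    return 2 * L * n / (d * hits)

pi_est = buffon_pi(1_000_000)
```

The slow 1/√n convergence of `pi_est` toward π is a good first illustration of the statistical error behaviour common to all Monte Carlo estimators.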
1995-01-01
We discuss the renormalization group improved effective action and running surface couplings in curved spacetime with boundary. Using scalar self-interacting theory as an example, we study the influence of boundary effects on the effective equations of motion in a spherical cap and the relevance of running surface couplings to quantum cosmology and the symmetry breaking phenomenon. Running surface couplings in the asymptotically free SU(2) gauge theory are found.
40 CFR 86.1237-85 - Dynamometer runs.
2010-07-01
Title 40, Protection of Environment, Methanol-Fueled Heavy-Duty Vehicles, § 86.1237-85 Dynamometer runs. (a) The vehicle shall be either driven... the diurnal loss test and beginning of the hot soak preparation run shall not exceed 3 minutes, ...
DEFF Research Database (Denmark)
Larsen, Lars Henrik; Rasmussen, Sten; Jørgensen, Jens Erik
2016-01-01
What is an overuse injury in running? This question is a cornerstone of clinical documentation and research-based evidence.
Institute of Scientific and Technical Information of China (English)
PHILIP JONES
2010-01-01
For some, simply running 21 km, or a full marathon at 42 km, isn't enough of an achievement. I mean, you can run a marathon in almost every major city in the world, and many of them are centerpiece events watched by a global audience.
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Institute of Scientific and Technical Information of China (English)
MICHAEL; GOLD
2009-01-01
I started running at age 14, inspired in equal parts by an incipient teenage desire for athletic greatness, the movie Personal Best, and the fact that all my classmates on sports teams got a free period during gym class.
Institute of Scientific and Technical Information of China (English)
Wei Jiafu
2006-01-01
In Africa, there live antelopes and lions. In the morning, the antelope wakes up from sleep. His first sense is that he has to run faster than the fastest lion; otherwise, he will be eaten. Meanwhile, when the lion opens his eyes, his first thought is that he must run faster than the slowest antelope; otherwise, he will starve to death.
Running Club - Nocturne des Evaux
Running club
2017-01-01
CERN's runners once again climbed onto the top steps of the podium at the interentreprises race. This night-time team race, run in teams of 3 to 4 runners, is unique in the region for its originality: staggered group starts every 30 seconds, and the first 3 runners of each team must cross the finish line together. A double victory for the Running Club at the Nocturne! 1st place for the women's team and 22nd overall; 1st place for the mixed team and 4th overall, beating the mixed event record by about a minute in the process; and 10th place for the men's team. Full results at http://www.chp-geneve.ch/web-cms/index.php/nocturne-des-evaux
Quantum Monte Carlo simulation
Wang, Yazhen
2011-01-01
Contemporary scientific studies often rely on the understanding of complex quantum systems via computer simulation. This paper initiates the statistical study of quantum simulation and proposes a Monte Carlo method for estimating analytically intractable quantities. We derive the bias and variance for the proposed Monte Carlo quantum simulation estimator and establish the asymptotic theory for the estimator. The theory is used to design a computational scheme for minimizing the mean square er...
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
Petroff, P
1999-01-01
The D0 detector at The Fermilab Tevatron is undergoing a major upgrade to prepare for data taking with luminosities reaching 2 x 10^{32} cm^{-2} s^{-1}. The upgrade includes a new central tracking array, new muon detector components and electronic upgrades to many subsystems. The D0 upgraded detector will be operational for RUN II in spring 2000.
Distributed and Adaptive Darting Monte Carlo through Regenerations
Ahn, S.; Chen, Y.; Welling, M.
2013-01-01
Darting Monte Carlo (DMC) is an MCMC procedure designed to effectively mix between multiple modes of a probability distribution. We propose an adaptive and distributed version of this method by using regenerations. This allows us to run multiple chains in parallel and adapt the shape of the jump regi
Toporek, Chuck
2008-01-01
When Steve Jobs jumped on stage at Macworld San Francisco 2006 and announced the new Intel-based Macs, the question wasn't if, but when someone would figure out a hack to get Windows XP running on these new "Mactels." Enter Boot Camp, a new system utility that helps you partition and install Windows XP on your Intel Mac. Boot Camp does all the heavy lifting for you. You won't need to open the Terminal and hack on system files or wave a chicken bone over your iMac to get XP running. This free program makes it easy for anyone to turn their Mac into a dual-boot Windows/OS X machine. Running Bo
Prevention of running injuries.
Fields, Karl B; Sykes, Jeannie C; Walker, Katherine M; Jackson, Jonathan C
2010-01-01
Evidence for preventive strategies to lessen running injuries is needed as these occur in 40%-50% of runners on an annual basis. Many factors influence running injuries, but strong evidence for prevention only exists for training modification primarily by reducing weekly mileage. Two anatomical factors - cavus feet and leg length inequality - demonstrate a link to injury. Weak evidence suggests that orthotics may lessen risk of stress fracture, but no clear evidence proves they will reduce the risk of those athletes with leg length inequality or cavus feet. This article reviews other potential injury variables, including strength, biomechanics, stretching, warm-up, nutrition, psychological factors, and shoes. Additional research is needed to determine whether interventions to address any of these will help prevent running injury.
Rummel, Juergen; Blum, Yvonne; Seyfarth, Andre
The implementation of bipedal gaits in legged robots is still a challenge in state-of-the-art engineering. Human gaits could be realized by imitating human leg dynamics where a spring-like leg behavior is found as represented in the bipedal spring-mass model. In this study we explore the gap between walking and running by investigating periodic gait patterns. We found an almost continuous morphing of gait patterns between walking and running. The technical feasibility of this transition is, however, restricted by the duration of swing phase. In practice, this requires an abrupt gait transition between both gaits, while a change of speed is not necessary.
Challenges of Monte Carlo Transport
Energy Technology Data Exchange (ETDEWEB)
Long, Alex Roberts [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-10
These are slides from a presentation for Parallel Summer School at Los Alamos National Laboratory. Solving discretized partial differential equations (PDEs) of interest can require a large number of computations. We can identify concurrency to allow parallel solution of discrete PDEs. Simulated particle histories can be used to solve the Boltzmann transport equation. Particle histories are independent in neutral particle transport, making them amenable to parallel computation. Physical parameters and method type determine the data dependencies of particle histories. Data requirements shape parallel algorithms for Monte Carlo. Then, Parallel Computational Physics and Parallel Monte Carlo are discussed and, finally, the results are given. The mesh passing method greatly simplifies the IMC implementation and allows simple load-balancing. Using MPI windows and passive, one-sided RMA further simplifies the implementation by removing target synchronization. The author is very interested in implementations of PGAS that may allow further optimization for one-sided, read-only memory access (e.g. OpenSHMEM). The MPICH_RMA_OVER_DMAPP option and library is required to make one-sided messaging scale on Trinitite; Moonlight scales poorly. Interconnect-specific libraries or functions are likely necessary to ensure performance. BRANSON has been used to directly compare the current standard method to a proposed method on idealized problems. The mesh passing algorithm performs well on problems that are designed to show the scalability of the particle passing method. BRANSON can now run load-imbalanced, dynamic problems. Potential avenues of improvement in the mesh passing algorithm will be implemented and explored. A suite of test problems that stress DD methods will elucidate a possible path forward for production codes.
An Introduction to Multilevel Monte Carlo for Option Valuation
Higham, Desmond J
2015-01-01
Monte Carlo is a simple and flexible tool that is widely used in computational finance. In this context, it is common for the quantity of interest to be the expected value of a random variable defined via a stochastic differential equation. In 2008, Giles proposed a remarkable improvement to the approach of discretizing with a numerical method and applying standard Monte Carlo. His multilevel Monte Carlo method offers a speed-up given by the inverse of epsilon, where epsilon is the required accuracy. So computations can run 100 times more quickly when two digits of accuracy are required. The multilevel philosophy has since been adopted by a range of researchers, and a wealth of practically significant results has arisen, most of which have yet to make their way into the expository literature. In this work, we give a brief, accessible introduction to multilevel Monte Carlo and summarize recent results applicable to the task of option valuation.
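For contrast with the multilevel method, the baseline it improves on, standard Monte Carlo valuation of a European call under geometric Brownian motion, looks like the sketch below. The parameters are illustrative assumptions; the exact GBM terminal distribution is sampled here, so no time discretization is needed (the discretization bias that multilevel Monte Carlo attacks arises for SDEs without such a closed form):

```python
import numpy as np

def mc_call_price(S0, K, r, sigma, T, n_paths, rng):
    # sample the exact GBM terminal value:
    # S_T = S0 * exp((r - sigma^2/2) T + sigma sqrt(T) Z),  Z ~ N(0,1)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.maximum(ST - K, 0.0)         # European call payoff
    return np.exp(-r * T) * payoff.mean()    # discounted Monte Carlo average

rng = np.random.default_rng(7)
price = mc_call_price(S0=100, K=100, r=0.05, sigma=0.2,
                      T=1.0, n_paths=1_000_000, rng=rng)
```

With these parameters the Black-Scholes reference price is about 10.45; the Monte Carlo error shrinks only as 1/sqrt(n_paths), which is the cost profile the multilevel method improves.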
Brown, Jill Harris
2007-01-01
Every year, the Parent-Teacher Association of Ferndale Elementary School in Atlanta, Georgia sponsors a fun road race for the students, teachers, families, and community. This annual event has inspired the author to develop the Running and Art project to show off her students' art and squeeze in a little art history, too. In this article, the…
Optimizing Running Performance.
Widule, Carol J.
1989-01-01
The optimization of step length and step rate (frequency) is essential for sprinters. This article analyzes data that compare step rate and step length to height, as a function of running speed, for ten elite runners. How results of such analyses can be used in training runners is also discussed. (IAH)
Monte Carlo simulation of neutron scattering instruments
Energy Technology Data Exchange (ETDEWEB)
Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.
1998-12-01
A code package consisting of the Monte Carlo Library MCLIB, the executing code MC_RUN, the web application MC_Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC_RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown.
Does Addiction Run in Families?
"Heart disease runs in some families. Addiction runs in ours."
The Running Gravitational Couplings
Dou, Djamel; Percacci, Roberto
1997-01-01
We compute the running of the cosmological constant and Newton's constant taking into account the effect of quantum fields with any spin between 0 and 2. We find that Newton's constant does not vary appreciably but the cosmological constant can change by many orders of magnitude when one goes from cosmological scales to typical elementary particle scales. In the extreme infrared, zero modes drive the cosmological constant to zero.
Results on CASTOR Performance during LHC Run 2
CMS Collaboration
2016-01-01
The intercalibration of the gains of the fine-mesh PMTs using beam-halo muons is discussed, together with results of a study of the noise and baseline. Two methods of obtaining gain correction factors for reweighting the gains between different high-voltage settings are compared. Results on the efficiency of a CASTOR jet trigger are compared between LHC Run 2 collision data and Monte Carlo event generator predictions.
Chemical application of diffusion quantum Monte Carlo
Reynolds, P. J.; Lester, W. A., Jr.
1983-10-01
The diffusion quantum Monte Carlo (QMC) method gives a stochastic solution to the Schroedinger equation. As an example the singlet-triplet splitting of the energy of the methylene molecule CH2 is given. The QMC algorithm was implemented on the CYBER 205, first as a direct transcription of the algorithm running on our VAX 11/780, and second by explicitly writing vector code for all loops longer than a crossover length C. The speed of the codes relative to one another as a function of C, and relative to the VAX is discussed. Since CH2 has only eight electrons, most of the loops in this application are fairly short. The longest inner loops run over the set of atomic basis functions. The CPU time dependence obtained versus the number of basis functions is discussed and compared with that obtained from traditional quantum chemistry codes and that obtained from traditional computer architectures. Finally, preliminary work on restructuring the algorithm to compute the separate Monte Carlo realizations in parallel is discussed.
Nixon, Robin
2010-01-01
Ubuntu for everyone! This popular Linux-based operating system is perfect for people with little technical background. It's simple to install, and easy to use -- with a strong focus on security. Ubuntu: Up and Running shows you the ins and outs of this system with a complete hands-on tour. You'll learn how Ubuntu works, how to quickly configure and maintain Ubuntu 10.04, and how to use this unique operating system for networking, business, and home entertainment. This book includes a DVD with the complete Ubuntu system and several specialized editions -- including the Mythbuntu multimedia re
Directory of Open Access Journals (Sweden)
Cecilia Maya
2004-12-01
Full Text Available. The Monte Carlo method is applied to several cases of financial option valuation. The method yields a good approximation when its accuracy is compared with that of other numerical methods. The estimate produced by crude Monte Carlo can be made even more accurate by resorting to variance reduction techniques, among which antithetic variates and control variates are suggested. However, these techniques require greater computational effort, so they must be evaluated not only in terms of their accuracy but also of their efficiency.
Monte Carlo and nonlinearities
Dauchet, Jérémi; Blanco, Stéphane; Caliot, Cyril; Charon, Julien; Coustet, Christophe; Hafi, Mouna El; Eymet, Vincent; Farges, Olivier; Forest, Vincent; Fournier, Richard; Galtier, Mathieu; Gautrais, Jacques; Khuong, Anaïs; Pelissier, Lionel; Piaud, Benjamin; Roger, Maxime; Terrée, Guillaume; Weitz, Sebastian
2016-01-01
The Monte Carlo method is widely used to numerically predict systems behaviour. However, its powerful incremental design assumes a strong premise which has severely limited application so far: the estimation process must combine linearly over dimensions. Here we show that this premise can be alleviated by projecting nonlinearities on a polynomial basis and increasing the configuration-space dimension. Considering phytoplankton growth in light-limited environments, radiative transfer in planetary atmospheres, electromagnetic scattering by particles and concentrated-solar-power-plant productions, we prove the real world usability of this advance on four test-cases that were so far regarded as impracticable by Monte Carlo approaches. We also illustrate an outstanding feature of our method when applied to sharp problems with interacting particles: handling rare events is now straightforward. Overall, our extension preserves the features that made the method popular: addressing nonlinearities does not compromise o...
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It covers the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: introduction (background; a simple example: estimating π), why the method works (the Law of Large Numbers, the Central Limit Theorem), how to sample (inverse transform sampling, rejection sampling), and an example from particle transport.
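The two sampling ideas from the outline condense into a few lines; `estimate_pi` and `sample_exponential` are illustrative names, not from the lecture itself.

```python
import random, math

def estimate_pi(n):
    """Hit-or-miss estimate of pi: the fraction of uniform points falling
    inside the unit quarter-circle, times 4 (the introductory example)."""
    hits = sum(1 for _ in range(n)
               if random.random()**2 + random.random()**2 <= 1.0)
    return 4.0 * hits / n

def sample_exponential(rate):
    """Inverse transform sampling: if U ~ Uniform(0,1), then
    -ln(1 - U)/rate is Exponential(rate), since the inverse CDF is
    F^-1(u) = -ln(1 - u)/rate."""
    return -math.log(1.0 - random.random()) / rate

random.seed(0)
print(estimate_pi(100_000))                       # close to 3.1416
samples = [sample_exponential(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))                # close to 1/rate = 0.5
```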
CosmoMC Installation and Running Guidelines
Li, Ming-Hua
2014-01-01
CosmoMC is a Fortran 95 Markov-Chain Monte-Carlo (MCMC) engine to explore the cosmological parameter space, plus a Python suite for plotting and presenting results (see http://cosmologist.info/cosmomc/). This document describes the installation of the CosmoMC on a Linux system (Ubuntu 14.04.1 LTS 64-bit version). It is written for those who want to use it in their scientific research but without much training on Linux and the program. Besides a step-by-step installation guide, we also give a brief introduction of how to run the program on both a desktop and a cluster. We share our way to generate the plots that are commonly used in the references of cosmology. For more information, one can refer to the CosmoCoffee forum (http://cosmocoffee.info/viewforum.php?f=11) or contact the authors of this document. Questions and comments would be much appreciated.
Running cosmological constant with observational tests
Geng, Chao-Qiang; Zhang, Kaituo
2016-01-01
We investigate the running cosmological constant model with dark energy linearly proportional to the Hubble parameter, $\Lambda = \sigma H + \Lambda_0$, in which the $\Lambda$CDM limit is recovered by taking $\sigma=0$. We derive the linear perturbation equations of gravity under the Friedmann-Lemaître-Robertson-Walker cosmology, and show the power spectra of the CMB temperature and matter density distribution. By using the Markov chain Monte Carlo method, we fit the model to the current observational data and find that $\sigma H_0/\Lambda_0 \lesssim 2.63 \times 10^{-2}$ and $6.74 \times 10^{-2}$ for $\Lambda(t)$ coupled to matter and radiation-matter, respectively, along with constraints on other cosmological parameters.
Running cosmological constant with observational tests
Directory of Open Access Journals (Sweden)
Chao-Qiang Geng
2016-09-01
Full Text Available We investigate the running cosmological constant model with dark energy linearly proportional to the Hubble parameter, Λ=σH+Λ0, in which the ΛCDM limit is recovered by taking σ=0. We derive the linear perturbation equations of gravity under the Friedmann–Lemaître–Robertson–Walker cosmology, and show the power spectra of the CMB temperature and matter density distribution. By using the Markov chain Monte Carlo method, we fit the model to the current observational data and find that σH0/Λ0≲2.63×10−2 and 6.74×10−2 for Λ(t) coupled to matter and radiation-matter, respectively, along with constraints on other cosmological parameters.
ATLAS Distributed Computing in LHC Run2
Campana, Simone; The ATLAS collaboration
2015-01-01
The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run2. An increased data rate and computing demands of the Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (ProdSys2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward the flexible computing model. The flexible computing utilization exploring the opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model, the data access mechanisms have been enhanced with the remote access, and the network topology and performance is deeply integrated into the core of the system. Moreover a new data management strategy, based on defined lifetime for each dataset, has been defin...
Running cosmological constant with observational tests
Geng, Chao-Qiang; Lee, Chung-Chi; Zhang, Kaituo
2016-09-01
We investigate the running cosmological constant model with dark energy linearly proportional to the Hubble parameter, Λ = σH + Λ0, in which the ΛCDM limit is recovered by taking σ = 0. We derive the linear perturbation equations of gravity under the Friedmann-Lemaître-Robertson-Walker cosmology, and show the power spectra of the CMB temperature and matter density distribution. By using the Markov chain Monte Carlo method, we fit the model to the current observational data and find that σH0/Λ0 ≲ 2.63 × 10⁻² and 6.74 × 10⁻² for Λ(t) coupled to matter and radiation-matter, respectively, along with constraints on other cosmological parameters.
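The Markov chain Monte Carlo fit mentioned in these abstracts can be sketched with a plain random-walk Metropolis sampler. The toy posterior below, for the ratio σH0/Λ0, is purely illustrative (an invented truncated Gaussian); it is not the cosmological likelihood used in the paper, which a tool such as CosmoMC would evaluate from CMB and matter power spectra.

```python
import numpy as np

def metropolis(log_post, x0, n_steps, step=0.5, seed=1):
    """Random-walk Metropolis-Hastings, the basic building block of MCMC
    parameter fits (toy version, single chain, no convergence diagnostics)."""
    rng = np.random.default_rng(seed)
    chain = [np.asarray(x0, dtype=float)]
    lp = log_post(chain[0])
    for _ in range(n_steps):
        prop = chain[-1] + step * rng.normal(size=len(chain[-1]))
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.random()) < lp_prop - lp:
            chain.append(prop); lp = lp_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)

def log_post(x):
    """Toy posterior for r = sigma*H0/Lambda0: Gaussian centred at 0.01 with
    width 0.01, truncated to r >= 0 (illustrative numbers only)."""
    r = x[0]
    return -np.inf if r < 0 else -0.5 * ((r - 0.01) / 0.01) ** 2

chain = metropolis(log_post, [0.02], 20_000, step=0.02)
print(np.percentile(chain[5000:, 0], 95))  # upper bound on the ratio
```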
Claudia Marcelloni de Oliveira; Pauline Gagnon
It must be all the training we are getting every day, running around trying to get everything ready for the start of the LHC next year. This year, the ATLAS runners were in fine form and came in force. Nine ATLAS teams signed up for the 37th Annual CERN Relay Race with six runners per team. Under a blasting sun on Wednesday 23rd May 2007, each team covered the distances of 1000m, 800m, 800m, 500m, 500m and 300m taking the runners around the whole Meyrin site, hills included. A small reception took place in the ATLAS secretariat a week later to award the ATLAS Cup to the best ATLAS team. For the details on this complex calculation which takes into account the age of each runner, their gender and the color of their shoes, see the July 2006 issue of ATLAS e-news. The ATLAS Running Athena Team, the only all-women team enrolled this year, won the much coveted ATLAS Cup for the second year in a row. In fact, they are so good that Peter Schmid and Patrick Fassnacht are wondering about reducing the women's bonus in...
Energy Technology Data Exchange (ETDEWEB)
1981-09-01
PDU Run 10, a 46-day H-Coal syncrude mode operation using Wyodak coal, successfully met all targeted objectives, and was the longest PDU operation to date in this program. Targeted coal conversion of 90 wt % was exceeded with a C4-975°F distillate yield of 43 to 48 wt %. Amocat 1A catalyst was qualified for Pilot Plant operation based on improved operation and superior performance. PDU 10 achieved improved yields and lower hydrogen consumption compared to PDU 6, a similar operation. High hydroclone efficiency and high solids content in the vacuum still were maintained throughout the run. Steady operations at lower oil/solids ratios were demonstrated. Microautoclave testing was introduced as an operational aid. Four additional studies were successfully completed during PDU 10. These included a catalyst tracer study in conjunction with Sandia Laboratories; tests on letdown valve trims for Battelle; a fluid dynamics study with Amoco; and special high-pressure liquid sampling.
Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments
Energy Technology Data Exchange (ETDEWEB)
Pevey, Ronald E.
2005-09-15
Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
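The 1/√N behaviour of the Monte Carlo standard deviation discussed above can be demonstrated with a mock tally; `k_eff_mock` is an invented stand-in distribution, not a real criticality calculation.

```python
import random, math

def k_eff_mock(rng):
    """Stand-in for one Monte Carlo 'history' score (illustrative only; a
    real code tallies k-effective from simulated neutron histories)."""
    return 0.95 + 0.05 * rng.gauss(0.0, 1.0)

def run_tally(n_histories, seed=42):
    """Return the sample mean and the standard deviation of the mean."""
    rng = random.Random(seed)
    scores = [k_eff_mock(rng) for _ in range(n_histories)]
    mean = sum(scores) / n_histories
    var = sum((s - mean) ** 2 for s in scores) / (n_histories - 1)
    return mean, math.sqrt(var / n_histories)

for n in (1_000, 4_000, 16_000):
    mean, sd = run_tally(n)
    print(f"N={n:6d}  k_eff={mean:.4f}  sigma={sd:.5f}")
# Quadrupling the number of histories roughly halves sigma: sigma ~ 1/sqrt(N).
```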
Barefoot running: biomechanics and implications for running injuries.
Altman, Allison R; Davis, Irene S
2012-01-01
Despite the technological developments in modern running footwear, up to 79% of runners today get injured in a given year. As we evolved barefoot, examining this mode of running is insightful. Barefoot running encourages a forefoot strike pattern that is associated with a reduction in impact loading and stride length. Studies have shown a reduction in injuries to shod forefoot strikers as compared with rearfoot strikers. In addition to a forefoot strike pattern, barefoot running also affords the runner increased sensory feedback from the foot-ground contact, as well as increased energy storage in the arch. Minimal footwear is being used to mimic barefoot running, but it is not clear whether it truly does. The purpose of this article is to review current and past research on shod and barefoot/minimal footwear running and their implications for running injuries. Clearly more research is needed, and areas for future study are suggested.
2014-01-01
In the Netherlands, one-third of all fatalities and one-sixth of all seriously injured are the consequence of run-off-road crashes. The outcome of run-off-road crashes is relatively severe, one fatality in five seriously injured, which is twice the average in the Netherlands. Serious run-off-road cr
Seto, Craig K; Statuta, Siobhan M; Solari, Ian L
2010-07-01
As more children have become involved in athletic activities and running, there has been a significant increase in overuse injuries. The young athlete with open growth plates is vulnerable to unique overuse injuries involving the apophyses, articular cartilage, and growth plate. The physician caring for these young athletes needs to be aware of these conditions to diagnose and treat them appropriately. Physicians should also be aware of the risk of overtraining and overuse injury in athletes participating in year-round sports and competition. Current guidelines for overuse injury prevention in young athletes are primarily based on consensus and expert opinion. Further research is needed to provide evidence-based guidelines for overuse injury prevention in young athletes and runners. Copyright 2010 Elsevier Inc. All rights reserved.
Neutrino oscillation parameter sampling with MonteCUBES
Blennow, Mattias; Fernandez-Martinez, Enrique
2010-01-01
used in GLoBES [1,2]. Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES. Additional comments: A Matlab GUI for interpretation of results is included in the distribution. Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours. References:P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333. P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187. S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].
Stability Criterion for Humanoid Running
Institute of Scientific and Technical Information of China (English)
LI Zhao-Hui; HUANG Qiang; LI Ke-Jie
2005-01-01
A humanoid robot has high mobility but also a risk of tipping over. To date, one main topic in humanoid robotics has been walking stability; running stability has rarely been investigated. Running differs from walking, and its dynamic stability is more difficult to maintain. The objective of this paper is to study a stability criterion for humanoid running based on the whole-body dynamics. First, the running cycle and its dynamics are analyzed. Then, the stability criterion for humanoid running is presented. Finally, the effectiveness of the proposed stability criterion is illustrated by a simulation example using a dynamic analysis and design system (DADS).
Polarization Issues in Run 2008
Energy Technology Data Exchange (ETDEWEB)
Zhang, S.Y.; Ahrens, L.; Huang, H.; Zeno, K.
2008-07-01
The RHIC proton beam polarization has a strong dependence on intensity in Run 2008, whereas the dependence is almost absent in Run 2006. Meanwhile, the RHIC beam transverse emittance also has a dependence on intensity in Run 2008, but little in Run 2006. Using the emittance measurement at the AGS IPM and the BtA multiwires, the source of this difference between 2006 and 2008 runs is traced to the Booster. It is found that at least the degree of the vertical scraping in the Booster is different in 2006 and 2008. The effect of this scraping for the RHIC beam emittance and polarization is studied.
2012-01-01
The 5th edition of the "Monts Jura Jazz Festival" will take place at the Esplanade du Lac in Divonne-les-Bains, France on September 21 and 22. This festival organized by the CERN Jazz Club and supported by the CERN Staff Association is becoming a major musical event in the Geneva region. International Jazz artists like Didier Lockwood and David Reinhardt are part of this year outstanding program. Full program and e-tickets are available on the festival website. Don't miss this great festival!
Jazz Club
2012-01-01
The 5th edition of the "Monts Jura Jazz Festival" will take place on September 21st and 22nd 2012 at the Esplanade du Lac in Divonne-les-Bains. This festival is organized by the "CERN Jazz Club" with the support of the "CERN Staff Association". It is a major musical event in the French/Swiss area and proposes a world-class program with jazz artists such as D. Lockwood and D. Reinhardt. More information on http://www.jurajazz.com.
LMC: Logarithmantic Monte Carlo
Mantz, Adam B.
2017-06-01
LMC is a Markov Chain Monte Carlo engine in Python that implements adaptive Metropolis-Hastings and slice sampling, as well as the affine-invariant method of Goodman & Weare, in a flexible framework. It can be used for simple problems, but the main use case is problems where expensive likelihood evaluations are provided by less flexible third-party software, which benefit from parallelization across many nodes at the sampling level. The parallel/adaptive methods use communication through MPI, or alternatively by writing/reading files, and mostly follow the approaches pioneered by CosmoMC (ascl:1106.025).
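The affine-invariant method of Goodman & Weare that LMC implements rests on a single "stretch move". A minimal sketch, simplified relative to any production implementation (no parallelization, fixed stretch parameter):

```python
import numpy as np

def stretch_move(walkers, log_post, a=2.0, rng=None):
    """One sweep of the Goodman & Weare affine-invariant stretch move over
    an ensemble of walkers (simplified serial version)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n, d = walkers.shape
    lp = np.array([log_post(w) for w in walkers])
    for k in range(n):
        j = rng.integers(n - 1)
        j += j >= k                                # pick a different walker
        z = (1 + (a - 1) * rng.random()) ** 2 / a  # stretch factor ~ 1/sqrt(z)
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        lp_prop = log_post(prop)
        # acceptance includes the z^(d-1) Jacobian of the affine move
        if np.log(rng.random()) < (d - 1) * np.log(z) + lp_prop - lp[k]:
            walkers[k], lp[k] = prop, lp_prop
    return walkers

# Sample a 2-D standard Gaussian with 20 walkers.
log_post = lambda x: -0.5 * np.dot(x, x)
rng = np.random.default_rng(1)
walkers = rng.normal(size=(20, 2))
for _ in range(500):
    stretch_move(walkers, log_post, rng=rng)
print(walkers.mean(axis=0), walkers.std(axis=0))  # near [0, 0] and [1, 1]
```

The appeal of the move is that it has essentially one tuning parameter (`a`) and is invariant under affine transformations of the parameter space, so strongly correlated posteriors need no hand-tuned proposal covariance.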
Implicit Monte Carlo Radiation Transport Simulations of Four Test Problems
Energy Technology Data Exchange (ETDEWEB)
Gentile, N
2007-08-01
Radiation transport codes, like almost all codes, are difficult to develop and debug. It is helpful to have small, easy to run test problems with known answers to use in development and debugging. It is also prudent to re-run test problems periodically during development to ensure that previous code capabilities have not been lost. We describe four radiation transport test problems with analytic or approximate analytic answers. These test problems are suitable for use in debugging and testing radiation transport codes. We also give results of simulations of these test problems performed with an Implicit Monte Carlo photonics code.
ATLAS simulation of boson plus jets processes in Run 2
The ATLAS collaboration
2017-01-01
This note describes the ATLAS simulation setup used to model the production of single electroweak bosons ($W$, $Z/\gamma^\ast$ and prompt $\gamma$) in association with jets in proton--proton collisions at centre-of-mass energies of 8 and 13 TeV. Several Monte Carlo generator predictions are compared in regions of phase space relevant for data analyses during the LHC Run-2, or compared to unfolded data distributions measured in previous Run-1 or early Run-2 ATLAS analyses. Comparisons are made for regions of phase space with or without additional requirements on the heavy-flavour content of the accompanying jets, as well as electroweak $Vjj$ production processes. Both higher-order corrections and systematic uncertainties are also discussed.
CMS Monte Carlo production in the WLCG computing grid
Hernández, J M; Mohapatra, A; Filippis, N D; Weirdt, S D; Hof, C; Wakefield, S; Guan, W; Khomitch, A; Fanfani, A; Evans, D; Flossdorf, A; Maes, J; van Mulders, P; Villella, I; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Caballero, J; Sanches, J A; Kavka, C; Van Lingen, F; Bacchi, W; Codispoti, G; Elmer, P; Eulisse, G; Lazaridis, C; Kalini, S; Sarkar, S; Hammad, G
2008-01-01
Monte Carlo production in CMS has received a major boost in performance and scale since the past CHEP06 conference. The production system has been re-engineered in order to incorporate the experience gained in running the previous system and to integrate production with the new CMS event data model, data management system and data processing framework. The system is interfaced to the two major computing Grids used by CMS, the LHC Computing Grid (LCG) and the Open Science Grid (OSG).
Deur, A; de Teramond, G F
2016-01-01
We review the present knowledge for $\alpha_s$, the fundamental coupling underlying the interactions of quarks and gluons in QCD. The dependence of $\alpha_s(Q^2)$ on momentum transfer $Q$ encodes the underlying dynamics of hadron physics, from color confinement in the infrared domain to asymptotic freedom at short distances. We review constraints on $\alpha_s(Q^2)$ at high $Q^2$, as predicted by perturbative QCD, and its analytic behavior at small $Q^2$, based on models of nonperturbative dynamics. In the introductory part of this review, we explain the phenomenological meaning of $\alpha_s$, the reason for its running, and the challenges facing a complete understanding of its analytic behavior in the infrared domain. In the second, more technical, part of the review, we discuss the behavior of $\alpha_s(Q^2)$ in the high $Q^2$ domain of QCD. We review how $\alpha_s$ is defined, including its renormalization scheme dependence, the definition of its renormalization scale, the utility of effective charges, as ...
Data libraries as a collaborative tool across Monte Carlo codes
Augelli, Mauro; Han, Mincheol; Hauf, Steffen; Kim, Chan-Hyeung; Kuster, Markus; Pia, Maria Grazia; Quintieri, Lina; Saracco, Paolo; Seo, Hee; Sudhakar, Manju; Eidenspointner, Georg; Zoglauer, Andreas
2010-01-01
The role of data libraries in Monte Carlo simulation is discussed. A number of data libraries currently in preparation are reviewed; their data are critically examined with respect to the state-of-the-art in the respective fields. Extensive tests with respect to experimental data have been performed for the validation of their content.
Energy Technology Data Exchange (ETDEWEB)
Marcus, Ryan C. [Los Alamos National Laboratory
2012-07-25
MCMini is a proof of concept that demonstrates the possibility for Monte Carlo neutron transport using OpenCL with a focus on performance. This implementation, written in C, shows that tracing particles and calculating reactions on a 3D mesh can be done in a highly scalable fashion. These results demonstrate a potential path forward for MCNP or other Monte Carlo codes.
Monte Carlo methods for electromagnetics
Sadiku, Matthew NO
2009-01-01
Until now, novices had to painstakingly dig through the literature to discover how to use Monte Carlo techniques for solving electromagnetic problems. Written by one of the foremost researchers in the field, Monte Carlo Methods for Electromagnetics provides a solid understanding of these methods and their applications in electromagnetic computation. Including much of his own work, the author brings together essential information from several different publications.Using a simple, clear writing style, the author begins with a historical background and review of electromagnetic theory. After addressing probability and statistics, he introduces the finite difference method as well as the fixed and floating random walk Monte Carlo methods. The text then applies the Exodus method to Laplace's and Poisson's equations and presents Monte Carlo techniques for handling Neumann problems. It also deals with whole field computation using the Markov chain, applies Monte Carlo methods to time-varying diffusion problems, and ...
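The fixed random walk method mentioned above can be sketched for Laplace's equation on the unit square. Each walk steps on a grid until it hits the boundary, and the estimated potential is the average boundary value seen; the boundary configuration here (100 on the top edge, 0 elsewhere) and the grid spacing are illustrative choices, not taken from the book.

```python
import random

def laplace_walk(x, y, h=0.05, n_walks=2000, seed=7):
    """Fixed random walk estimate of the solution of Laplace's equation on
    the unit square, with potential 100 on the top edge and 0 on the other
    three edges (illustrative boundary values)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        wx, wy = x, y
        while 0 < wx < 1 and 0 < wy < 1:
            dx, dy = rng.choice([(h, 0), (-h, 0), (0, h), (0, -h)])
            wx, wy = wx + dx, wy + dy
        total += 100.0 if wy >= 1 else 0.0   # hit the top edge?
    return total / n_walks

# Potential at the centre of the square; the exact answer is 25 by symmetry
# (each edge is equally likely to be hit first, and one edge is at 100).
print(laplace_walk(0.5, 0.5))
```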
Deur, Alexandre; Brodsky, Stanley J.; de Téramond, Guy F.
2016-09-01
We review the present theoretical and empirical knowledge for αs, the fundamental coupling underlying the interactions of quarks and gluons in Quantum Chromodynamics (QCD). The dependence of αs(Q2) on momentum transfer Q encodes the underlying dynamics of hadron physics, from color confinement in the infrared domain to asymptotic freedom at short distances. We review constraints on αs(Q2) at high Q2, as predicted by perturbative QCD, and its analytic behavior at small Q2, based on models of nonperturbative dynamics. In the introductory part of this review, we explain the phenomenological meaning of the coupling, the reason for its running, and the challenges facing a complete understanding of its analytic behavior in the infrared domain. In the second, more technical, part of the review, we discuss the behavior of αs(Q2) in the high momentum transfer domain of QCD. We review how αs is defined, including its renormalization scheme dependence, the definition of its renormalization scale, the utility of effective charges, as well as "Commensurate Scale Relations" which connect the various definitions of the QCD coupling without renormalization-scale ambiguity. We also report recent significant measurements and advanced theoretical analyses which have led to precise QCD predictions at high energy. As an example of an important optimization procedure, we discuss the "Principle of Maximum Conformality", which enhances QCD's predictive power by removing the dependence of the predictions for physical observables on the choice of theoretical conventions such as the renormalization scheme. In the last part of the review, we discuss the challenge of understanding the analytic behavior of αs(Q2) in the low momentum transfer domain. We survey various theoretical models for the nonperturbative strongly coupled regime, such as the light-front holographic approach to QCD. This new framework predicts the form of the quark-confinement potential underlying hadron spectroscopy and
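The running referred to throughout can be made concrete with the standard one-loop result (textbook form, with $n_f$ the number of active quark flavors and $\Lambda$ the QCD scale parameter):

```latex
\frac{d\alpha_s}{d\ln Q^2} = -\beta_0\,\alpha_s^2 + \mathcal{O}(\alpha_s^3),
\qquad
\beta_0 = \frac{33 - 2 n_f}{12\pi},
```

whose solution exhibits both asymptotic freedom at large $Q^2$ and the growth of the coupling toward the infrared:

```latex
\alpha_s(Q^2) = \frac{12\pi}{(33 - 2 n_f)\,\ln\!\left(Q^2/\Lambda^2\right)}.
```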
Oxygen cost of running barefoot vs. running shod.
Hanson, N J; Berg, K; Deka, P; Meendering, J R; Ryan, C
2011-06-01
The purpose of this study was to investigate the oxygen cost of running barefoot vs. running shod on the treadmill as well as overground. 10 healthy recreational runners, 5 male and 5 female, whose mean age was 23.8 ± 3.39 years, volunteered to participate in the study. Subjects participated in 4 experimental conditions: 1) barefoot on treadmill, 2) shod on treadmill, 3) barefoot overground, and 4) shod overground. For each condition, subjects ran for 6 min at 70% vVO2max pace while VO2, heart rate (HR), and rating of perceived exertion (RPE) were assessed. A 2 × 2 (shoe condition × surface) repeated measures ANOVA revealed that running with shoes showed significantly higher VO2 values on both the treadmill and the overground track (p < 0.05) than running barefoot. It was concluded that at 70% of vVO2max pace, barefoot running is more economical than running shod, both overground and on a treadmill.
CDF Run II Run Control and Online Monitor
Institute of Scientific and Technical Information of China (English)
T. Arisawa; W. Badgett; et al.
2001-01-01
In this paper, we discuss the CDF Run II Run Control and online event monitoring system. Run Control is the top-level application that controls the data acquisition activities across 150 front-end VME crates and related service processes. Run Control is a real-time multi-threaded application implemented in Java with flexible state machines, using JDBC database connections to configure clients, and including a user-friendly and powerful graphical user interface. The CDF online event monitoring system consists of several parts: the event monitoring programs; the display to browse their results; the server program, which communicates with the display via socket connections; the error receiver, which displays error messages and communicates with Run Control; and the state manager, which monitors the state of the monitor programs.
Information Geometry and Sequential Monte Carlo
Sim, Aaron; Stumpf, Michael P H
2012-01-01
This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...
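The SMC sampler described above can be sketched with a tempering sequence and a plain random-walk Metropolis transition kernel standing in for mMALA; everything here is a simplified illustration of the standard SMC recipe, not the paper's implementation.

```python
import numpy as np

def smc_sampler(log_target, n_particles=500, n_temps=20, seed=3):
    """Minimal SMC sampler tempering from a broad Gaussian pi_0 to the
    target, pi_b ~ pi_0^(1-b) * target^b, with reweighting, multinomial
    resampling, and one random-walk Metropolis move per temperature."""
    rng = np.random.default_rng(seed)
    log_pi0 = lambda x: -0.5 * (x / 5.0) ** 2
    x = rng.normal(0.0, 5.0, size=n_particles)       # draws from pi_0
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # reweight from pi_{b_prev} to pi_b, then resample
        logw = (b - b_prev) * (log_target(x) - log_pi0(x))
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]
        # one Metropolis move targeting pi_b (the paper uses mMALA here)
        prop = x + rng.normal(0.0, 0.5, n_particles)
        log_ratio = ((1 - b) * (log_pi0(prop) - log_pi0(x))
                     + b * (log_target(prop) - log_target(x)))
        accept = np.log(rng.random(n_particles)) < log_ratio
        x = np.where(accept, prop, x)
    return x

# Target: standard 1-D Gaussian.
particles = smc_sampler(lambda x: -0.5 * x ** 2)
print(particles.mean(), particles.std())  # near 0 and 1
```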
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.
2016-11-29
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e. diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...
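Of the generalizations listed, variational Monte Carlo is the simplest to sketch: Metropolis sampling of |psi|^2 for a trial wavefunction, averaging the local energy. The 1D harmonic-oscillator trial psi_alpha(x) = exp(-alpha x^2) below is a textbook choice, not an example from the paper; at alpha = 1/2 the local energy is exactly the ground-state energy 1/2.

```python
import numpy as np

def vmc_energy(alpha, n_steps=20000, step=2.0, seed=1):
    """Variational Monte Carlo for the 1D harmonic oscillator
    (hbar = m = omega = 1).  Metropolis samples |psi|^2 for
    psi(x) = exp(-alpha x^2) and averages the local energy
    E_L(x) = alpha + x^2 * (1/2 - 2 alpha^2)."""
    rng = np.random.default_rng(seed)
    x = 0.0
    e_sum = 0.0
    for _ in range(n_steps):
        x_new = x + step * (rng.uniform() - 0.5)
        # acceptance ratio |psi(x_new)|^2 / |psi(x)|^2
        if rng.uniform() < np.exp(-2.0 * alpha * (x_new ** 2 - x ** 2)):
            x = x_new
        e_sum += alpha + x * x * (0.5 - 2.0 * alpha ** 2)
    return e_sum / n_steps
```

By the variational principle the average sits at or above 1/2 for any alpha, with the minimum exactly at alpha = 1/2.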
CMS Software and Computing: Ready for Run 2
Bloom, Kenneth
2015-01-01
In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.
Run-up Variability due to Source Effects
Del Giudice, Tania; Zolezzi, Francesca; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.
2010-05-01
This paper investigates the variability of tsunami run-up at a specific location due to uncertainty in earthquake source parameters. It is important to quantify this 'inter-event' variability for probabilistic assessments of tsunami hazard. In principle, this aspect of variability could be studied by comparing field observations at a single location from a number of tsunamigenic events caused by the same source. As such an extensive dataset does not exist, we decided to study the inter-event variability through numerical modelling. We attempt to answer the question 'What is the potential variability of tsunami wave run-up at a specific site, for a given magnitude earthquake occurring at a known location'. The uncertainty is expected to arise from the lack of knowledge regarding the specific details of the fault rupture 'source' parameters. The following steps were followed: the statistical distributions of the main earthquake source parameters affecting the tsunami height were established by studying fault plane solutions of known earthquakes; a case study based on a possible tsunami impact on the Egyptian coast was set up and simulated, varying the geometrical parameters of the source; simulation results were analyzed, deriving relationships between run-up height and source parameters; using the derived relationships a Monte Carlo simulation was performed in order to create the necessary dataset to investigate the inter-event variability of the run-up height along the coast; the inter-event variability of the run-up height along the coast was investigated. Given the distribution of source parameters and their variability, we studied how this variability propagates to the run-up height, using the Cornell 'Multi-grid coupled Tsunami Model' (COMCOT). The case study was based on the large thrust faulting offshore the south-western Greek coast, thought to have been responsible for the infamous 1303 tsunami. Numerical modelling of the event was used to
Effects of marathon running on running economy and kinematics.
Kyröläinen, H; Pullinen, T; Candau, R; Avela, J; Huttunen, P; Komi, P V
2000-07-01
The present study was designed to investigate interactions between running economy and mechanics before, during, and after an individually run marathon. Seven experienced triathletes performed a 5-min submaximal running test on a treadmill at an individual constant marathon speed. Heart rate was monitored and the expired respiratory gas was analyzed. Blood samples were drawn to analyze serum creatine kinase activity (S-CK), skeletal troponin I (sTnI), and blood lactate (B-La). A video analysis was performed (200 frames x s(-1)) to investigate running mechanics. A kinematic arm was used to determine the external work of each subject. The results of the present study demonstrate that after the marathon, a standardized 5-min submaximal running test resulted in an increase in oxygen consumption, ventilation, and heart rate, as well as an increase in stride frequency and a similar decrease in stride length (P < 0.01). These results demonstrate clearly that weakened running economy cannot be explained by changes in running mechanics. Therefore, it is suggested that the increased physiological loading is due to several mechanisms: increased utilization of fat as an energy substrate, increased demands of body temperature regulation, and possible muscle damage.
The ATLAS Trigger in Run-2: Design, Menu, and Performance
Vazquez Schroeder, Tamara; The ATLAS collaboration
2017-01-01
The ATLAS trigger system is composed of a hardware Level-1 trigger and a software-based high- level trigger. It has successfully operated during the first part of Run-2 (2015/2016) at the LHC at a centre-of-mass energy of 13 TeV. A comprehensive review of the ATLAS trigger design, menu, and performance in Run-2 is presented in these proceedings, as well as an overview of the intensive preparation towards the second part of Run-2 (2017/2018).
Lectures on Monte Carlo methods
Madras, Neal
2001-01-01
Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
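The flavour of the methods the book introduces can be shown with the classic toy of estimating pi from random points in the unit square (the quarter-circle hit fraction approaches pi/4):

```python
import random

def estimate_pi(n_samples, seed=42):
    """Estimate pi by the fraction of uniform random points in the
    unit square that fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples
```

The standard error shrinks like 1/sqrt(n), independent of dimension — the "curse of dimensionality" insensitivity the abstract mentions.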
RUN TO RUN CONTROL OF TIME-PRESSURE DISPENSING SYSTEM
Institute of Scientific and Technical Information of China (English)
Zhao Yixiang; Li Hanxiong; Ding Han; Xiong Youlun
2004-01-01
In electronics packaging the time-pressure dispensing system is widely used to squeeze the adhesive fluid in a syringe onto boards or substrates with the pressurized air. However, complexity of the process, which includes the air-fluid coupling and the nonlinear uncertainties, makes it difficult to have a consistent process performance. An integrated dispensing process model is first introduced and then its input-output regression relationship is used to design a run-to-run control methodology for this process. The controller takes an EWMA scheme and its stability region is given. Experimental results verify the effectiveness of the proposed run-to-run control method for the dispensing process.
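The EWMA run-to-run scheme has a compact generic form: after each run, filter the model mismatch into an intercept estimate and invert the process model for the next recipe. The linear model y = alpha + beta*u, the gain estimate, and the targets below are illustrative assumptions, not the paper's identified dispensing model.

```python
def run_to_run_ewma(process, target, beta_hat, lam=0.3, u0=1.0, n_runs=30):
    """EWMA run-to-run control for a process modelled as
    y = alpha + beta*u, with gain estimate beta_hat.

    After each run the intercept estimate a is filtered with weight
    lam, and the next recipe is u = (target - a) / beta_hat."""
    a = 0.0
    u = u0
    history = []
    for _ in range(n_runs):
        y = process(u)                               # measured output of this run
        a = lam * (y - beta_hat * u) + (1 - lam) * a # EWMA intercept update
        u = (target - a) / beta_hat                  # recipe for the next run
        history.append(y)
    return history
```

For a noise-free plant y = 0.5 + 2u with beta_hat = 2, the output converges geometrically (factor 1 - lam per run) to the target.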
Quantum Monte Carlo Endstation for Petascale Computing
Energy Technology Data Exchange (ETDEWEB)
Lubos Mitas
2011-01-26
published papers, 15 invited talks and lectures nationally and internationally. My former graduate student and postdoc Dr. Michal Bajdich, who was supported by this grant, is currently a postdoc with ORNL in the group of Dr. F. Reboredo and Dr. P. Kent and is using the developed tools in a number of DOE projects. The QWalk package has become a truly important research tool used by the electronic structure community and has attracted several new developers in other research groups. Our tools use several types of correlated wavefunction approaches (variational, diffusion and reptation methods) and large-scale optimization methods for wavefunctions, and enable calculation of energy differences such as cohesion energies and electronic gaps, as well as densities and other properties; using multiple runs one can obtain equations of state for given structures and beyond. Our codes use efficient numerical and Monte Carlo strategies (high-accuracy numerical orbitals, multi-reference wave functions, highly accurate correlation factors, pairing orbitals, force-biased and correlated sampling Monte Carlo), are robustly parallelized, and enable runs on tens of thousands of cores very efficiently. Our demonstration applications were focused on challenging research problems in several fields of materials science such as transition metal solids. We note that our study of FeO solid was the first QMC calculation of transition metal oxides at high pressures.
Monte Carlo Production Management at CMS
Boudoul, G.; Pol, A; Srimanobhas, P; Vlimant, J R; Franzoni, Giovanni
2015-01-01
The analysis of the LHC data at the Compact Muon Solenoid (CMS) experiment requires the production of a large number of simulated events. During Run I of the LHC (2010-2012), CMS produced over 12 billion simulated events, organized in approximately sixty different campaigns, each emulating specific detector conditions and LHC running conditions (pile-up). In order to aggregate the information needed for the configuration and prioritization of the event production, assure the book-keeping of all the processing requests placed by the physics analysis groups, and to interface with the CMS production infrastructure, the web-based service Monte Carlo Management (McM) has been developed and put in production in 2012. McM is based on recent server infrastructure technology (CherryPy + Java) and relies on a CouchDB database back-end. This contribution will cover the one and a half years of operational experience managing samples of simulated events for CMS, the evolution of its functionalities and the extension of its capabi...
Run-to-Run Control Strategy for Diabetes Management
2007-11-02
quite serious (diabetic coma), and the long-term implications of varying glucose levels (nephropathy, retinopathy, and other tissue damage) have... Trial Research Group, "The effect of intensive treatment of diabetes on the development and progression of long-term complications in insulin-dependent... RUN-TO-RUN CONTROL STRATEGY FOR DIABETES MANAGEMENT, F.J. Doyle III (1), B. Srinivasan (2), and D. Bonvin (2); (1) Department of Chemical Engineering, University
Energy Technology Data Exchange (ETDEWEB)
Kieseler, Jan
2015-12-15
In this thesis, measurements of the production cross sections for top-quark pairs and the determination of the top-quark mass are presented. Dileptonic decays of top-quark pairs (t anti t) with two opposite-charged lepton (electron and muon) candidates in the final state are considered. The studied data samples are collected in proton-proton collisions at the CERN Large Hadron Collider with the CMS detector and correspond to integrated luminosities of 5.0 fb{sup -1} and 19.7 fb{sup -1} at center-of-mass energies of √(s) = 7 TeV and √(s) = 8 TeV, respectively. The cross sections, σ{sub t} {sub anti} {sub t}, are measured in the fiducial detector volume (visible phase space), defined by the kinematics of the top-quark decay products, and are extrapolated to the full phase space. The visible cross sections are extracted in a simultaneous binned-likelihood fit to multi-differential distributions of final-state observables, categorized according to the multiplicity of jets associated to b quarks (b jets) and other jets in each event. The fit is performed with emphasis on a consistent treatment of correlations between systematic uncertainties and taking into account features of the t anti t event topology. By comparison with predictions from the Standard Model at next-to-next-to leading order (NNLO) accuracy, the top-quark pole mass, m{sub t}{sup pole}, is extracted from the measured cross sections for different state-of-the-art PDF sets. Furthermore, the top-quark mass parameter used in Monte-Carlo simulations, m{sub t}{sup MC}, is determined using the distribution of the invariant mass of a lepton candidate and the leading b jet in the event, m{sub lb}. Being defined by the kinematics of the top-quark decay, this observable is unaffected by the description of the top-quark production mechanism. Events are selected from the data collected at √(s) = 8 TeV that contain at least two jets and one b jet in addition to the lepton candidate pair. A novel technique is
Cowell, Frank A
2014-12-01
I examine the idea of 'the long run' in Piketty (2014) and related works. In contrast to simplistic interpretations of long-run models of income- and wealth-distribution Piketty (2014) draws on a rich economic analysis that models the intra- and inter-generational processes that underlie the development of the wealth distribution. These processes inevitably involve both market and non-market mechanisms. To understand this approach, and to isolate the impact of different social and economic factors on inequality in the long run, we use the concept of an equilibrium distribution. However the long-run analysis of policy should not presume that there is an inherent tendency for the wealth distribution to approach equilibrium.
Turkey Run Landfill Emissions Dataset
U.S. Environmental Protection Agency — landfill emissions measurements for the Turkey Run landfill in Georgia. This dataset is associated with the following publication: De la Cruz, F., R. Green, G....
U.S. Environmental Protection Agency — Inputs and outputs for SHEDS-HT runs of DiNP, DEHP, DBP. This dataset is associated with the following publication: Moreau, M., J. Leonard, K. Phillips, J. Campbell,...
ATLAS Distributed Computing in LHC Run2
Campana, Simone
2015-12-01
The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run-2. An increase in both the data rate and the computing demands of the Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (Prodsys-2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward a flexible computing model. A flexible computing utilization exploring the use of opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model; the data access mechanisms have been enhanced with the remote access, and the network topology and performance is deeply integrated into the core of the system. Moreover, a new data management strategy, based on a defined lifetime for each dataset, has been defined to better manage the lifecycle of the data. In this note, an overview of an operational experience of the new system and its evolution is presented.
PDF4LHC recommendations for LHC Run II
Butterworth, Jon; Cooper-Sarkar, Amanda; De Roeck, Albert; Feltesse, Joel; Forte, Stefano; Gao, Jun; Glazov, Sasha; Huston, Joey; Kassabov, Zahari; McNulty, Ronan; Morsch, Andreas; Nadolsky, Pavel; Radescu, Voica; Rojo, Juan; Thorne, Robert
2016-01-01
We provide an updated recommendation for the usage of sets of parton distribution functions (PDFs) and the assessment of PDF and PDF+$\\alpha_s$ uncertainties suitable for applications at the LHC Run II. We review developments since the previous PDF4LHC recommendation, and discuss and compare the new generation of PDFs, which include substantial information from experimental data from the Run I of the LHC. We then propose a new prescription for the combination of a suitable subset of the available PDF sets, which is presented in terms of a single combined PDF set. We finally discuss tools which allow for the delivery of this combined set in terms of optimized sets of Hessian eigenvectors or Monte Carlo replicas, and their usage, and provide some examples of their application to LHC phenomenology.
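The two delivery formats mentioned at the end admit standard uncertainty recipes: for Monte Carlo replicas, the central value and uncertainty of an observable are the mean and standard deviation over replicas; for a symmetric Hessian set, the eigenvector shifts add in quadrature. A sketch with made-up observable values (the numbers are not from any actual PDF set):

```python
import numpy as np

def mc_replica_uncertainty(obs_replicas):
    """PDF uncertainty from Monte Carlo replicas: central value is the
    replica mean, uncertainty the standard deviation over replicas."""
    obs = np.asarray(obs_replicas, float)
    return float(obs.mean()), float(obs.std(ddof=1))

def hessian_uncertainty(obs_central, obs_eigen):
    """Symmetric Hessian uncertainty from eigenvector-shifted members:
    delta O = sqrt( sum_i (O_i - O_0)^2 )."""
    d = np.asarray(obs_eigen, float) - obs_central
    return float(np.sqrt(np.sum(d * d)))
```

Either recipe is applied per observable after computing it with every member of the combined set.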
Aftalion, Amandine
2016-01-01
The aim of this paper is to bring a mathematical justification to the optimal way of organizing one's effort when running. It is well known from physiologists that all running exercises of duration less than 3 min are run with a strong initial acceleration and a decelerating end; on the contrary, long races are run with a final sprint. This can be explained using a mathematical model describing the evolution of the velocity, the anaerobic energy, and the propulsive force: a system of ordinary differential equations, based on Newton's second law and energy conservation, is coupled to the condition of optimizing the time to run a fixed distance. We show that the monotonicity of the velocity curve vs time is the opposite of that of the oxygen uptake (VO2) vs time. Since the oxygen uptake is monotone increasing for a short run, we prove that the velocity is exponentially increasing to its maximum and then decreasing. For longer races, the oxygen uptake has an increasing start and a decreasing end and this accounts for...
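A minimal sketch of the velocity part of such a model (Newton's second law per unit mass with linear resistance, dv/dt = f(t) - v/tau) reproduces the accelerate-then-slow-down shape when the propulsive force drops mid-race. The force profile and time constant below are assumptions for illustration, not the paper's optimal-control solution.

```python
import numpy as np

def simulate_run(force_profile, tau=1.0, dt=0.01, t_end=60.0):
    """Integrate a minimal Keller-type velocity equation
    dv/dt = f(t) - v/tau (per unit mass, linear resistance) with
    forward Euler; returns time grid, velocity, cumulative distance."""
    t = np.arange(0.0, t_end, dt)
    v = np.zeros_like(t)
    for i in range(1, len(t)):
        f = force_profile(t[i - 1])
        v[i] = v[i - 1] + dt * (f - v[i - 1] / tau)
    dist = np.cumsum(v) * dt
    return t, v, dist
```

With a strong initial force that later drops (e.g. f = 8 for the first 5 s, then f = 4), the velocity rises toward f*tau = 8, peaks, and decays toward 4 — the short-race profile the abstract describes.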
Effect Of Running Shoes on Foot Impact During Running
Nassif, Henry
2016-01-01
Running is part of almost every sport, and requires a great amount of stamina, endurance, mental toughness and overall strength. At every step, the foot experiences ground reaction forces necessary to support the motion of the body. With the advancements in shoe technology, running shoes have grown in popularity among runners, as well as non-runners, because they reduce the risk of injuries from the impact felt by the foot. The purpose of this report is to analyze the effect of running shoes on impact forces on the foot. This is achieved through the use of three force pads fixed at different locations on the foot. The force measured by each sensor is then used to estimate the vertical ground reaction force, using the sensors' calibration equations. Based on the ground reaction force, the effective mass corresponding to the momentum change occurring during the transient phase of the impact is estimated. The results show that running at 9 miles per hour without running shoes generates an effective mass of (14....
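The effective-mass estimate described in the abstract follows from the impulse-momentum theorem: m_eff = (integral of F dt over the impact transient) / delta_v, where delta_v is the change in vertical speed of the decelerating segment. A sketch with a synthetic half-sine force peak (the numbers are illustrative, not the report's measurements):

```python
import numpy as np

def effective_mass(t, f_vertical, delta_v):
    """Effective mass from the impulse of the transient impact peak:
    m_eff = (integral of F dt) / delta_v."""
    t = np.asarray(t, float)
    f = np.asarray(f_vertical, float)
    # trapezoidal rule for the impulse
    impulse = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t))
    return impulse / delta_v
```

A 1000 N half-sine peak lasting 20 ms has impulse 2 * 1000 * 0.02 / pi ≈ 12.7 N s, so a 1 m/s velocity change implies an effective mass of about 12.7 kg.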
Running of the Running and Entropy Perturbations During Inflation
van de Bruck, Carsten
2016-01-01
In single field slow-roll inflation, one expects that the spectral index $n_s -1$ is first order in slow-roll parameters. Similarly, its running $\\alpha_s = dn_s/d \\log k$ and the running of the running $\\beta_s = d\\alpha_s/d \\log k$ are second and third order and therefore expected to be progressively smaller, and usually negative. Hence, such models of inflation are in considerable tension with a recent analysis hinting that $\\beta_s$ may actually be positive, and larger than $\\alpha_s$. Motivated by this, in this work we ask the question of what kinds of inflationary models may be useful in achieving such a hierarchy of runnings, particularly focusing on two-field models of inflation in which the late-time transfer of power from isocurvature to curvature modes allows for a much more diverse range of phenomenology. We calculate the runnings due to this effect and briefly apply our results to assessing the feasibility of finding $|\\beta_s| \\gtrsim |\\alpha_s|$ in some specific models.
LHCb computing in Run II and its evolution towards Run III
Falabella, Antonio
2016-01-01
This contribution reports on the experience of the LHCb computing team during LHC Run 2 and its preparation for Run 3. Furthermore a brief introduction on LHCbDIRAC, i.e. the tool to interface to the experiment distributed computing resources for its data processing and data management operations, is given. Run 2, which started in 2015, has already seen several changes in the data processing workflows of the experiment. Most notably the ability to align and calibrate the detector between two different stages of the data processing in the high level trigger farm, eliminating the need for a second pass processing of the data offline. In addition a fraction of the data is immediately reconstructed to its final physics format in the high level trigger and only this format is exported from the experiment site for physics analysis. This concept has successfully been tested and will continue to be used for the rest of Run 2. Furthermore the distributed data processing has been improved with new concepts and techn...
A study of the XY model by the Monte Carlo method
Suranyi, Peter; Harten, Paul
1987-01-01
The massively parallel processor is used to perform Monte Carlo simulations for the two dimensional XY model on lattices of sizes up to 128 x 128. A parallel random number generator was constructed, finite size effects were studied, and run times were compared with those on a CRAY X-MP supercomputer.
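A single-threaded Metropolis sweep for the 2D XY model, H = -sum over nearest neighbours of cos(theta_i - theta_j) on a periodic L x L lattice, might look like the following (plain Python rather than the massively parallel processor of the paper; lattice size, temperature, and proposal width are assumptions):

```python
import numpy as np

def xy_metropolis(L=16, beta=1.5, sweeps=100, seed=3):
    """Metropolis simulation of the 2D XY model on an L x L lattice
    with periodic boundaries; returns the spins and the magnetization
    per spin, m = |mean of (cos theta, sin theta)|."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=(L, L))

    def site_energy(angle, i, j):
        # bond energy of angle at site (i, j) with its four neighbours
        return -(np.cos(angle - theta[(i + 1) % L, j])
                 + np.cos(angle - theta[(i - 1) % L, j])
                 + np.cos(angle - theta[i, (j + 1) % L])
                 + np.cos(angle - theta[i, (j - 1) % L]))

    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                new = theta[i, j] + rng.uniform(-1.0, 1.0)
                dE = site_energy(new, i, j) - site_energy(theta[i, j], i, j)
                if dE <= 0.0 or rng.uniform() < np.exp(-beta * dE):
                    theta[i, j] = new
    m = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
    return theta, float(m)
```

On a finite lattice the magnetization is large at low temperature (high beta) and falls toward zero at high temperature, which makes a quick sanity check.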
Influence of the Lower Jaw Position on the Running Pattern.
Directory of Open Access Journals (Sweden)
Christian Maurer
The effects of manipulated dental occlusion on body posture have been investigated quite often and discussed controversially in the literature. Far less attention has been paid to the influence of dental occlusion position on human movement. If human movement was analysed, it was mostly while walking and not while running. This study was therefore designed to identify the effect of lower jaw positions on running behaviour according to different dental occlusion positions. Twenty healthy young recreational runners (mean age = 33.9±5.8 years) participated in this study. Kinematic data were collected using an eight-camera Vicon motion capture system (VICON Motion Systems, Oxford, UK). Subjects were consecutively prepared with four different dental occlusion conditions in random order and performed five running trials per test condition on a level walkway with their preferred running shoes. Vector-based pattern recognition methods, in particular cluster analysis and support vector machines (SVM), were used for movement pattern identification. Subjects exhibited unique movement patterns, leading to 18 clusters for the 20 subjects. No overall classification of the splint condition could be observed. Within individual subjects different running patterns could be identified for the four splint conditions. The splint conditions led to a more symmetrical running pattern than the control condition. The influence of an occlusal splint on running pattern can be confirmed in this study. Wearing a splint increases the symmetry of the running pattern. A more symmetrical running pattern might help to reduce the risk of injuries or help in performance. The change of the movement pattern between the neutral condition and any of the three splint conditions was significant within subjects but not across subjects. Therefore the dental splint has a measurable influence on the running pattern of subjects, however subjects' individuality has to be considered when choosing the
CMS muon system towards LHC Run 2 and beyond
Guiducci, Luigi
2014-01-01
The CMS muon system has played a key role in many physics results obtained from the LHC Run 1 data. The LHC will increase the beam energy as well as progressively increase the peak instantaneous luminosity in Run 2 and in the following years. Significant consolidation and upgrade activities are ongoing, in order to improve the CMS muon detectors and trigger performance and robustness. With LHC and then HL-LHC running beyond 2030, the large accumulated radiation dose, the high pile-up environment, and the ageing of several detector and electronics components become challenges that can only be met with further development and upgrade work. We will introduce the CMS muon system and present the consolidation work in preparation for LHC Run 2. We will then describe the main constraints and the solutions proposed for the upgrade of the muon detector system towards the HL-LHC.
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
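The MLMC telescoping identity E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}] can be illustrated on a toy problem: estimating E[S_T] for geometric Brownian motion with Euler discretizations h_l = T/2^l, coupling each level pair through a shared Brownian path. The dynamics and sample counts are assumptions for illustration, not the paper's Bayesian inverse problem.

```python
import numpy as np

def mlmc_estimate(levels=4, n_per_level=(20000, 5000, 1250, 320),
                  S0=1.0, r=0.05, sigma=0.2, T=1.0, seed=7):
    """Multilevel Monte Carlo for E[S_T] under GBM dS = r S dt + sigma S dW,
    Euler scheme with h_l = T / 2^l.  Fine and coarse paths at each level
    share the same Brownian increments, so their difference has small
    variance and needs few samples."""
    rng = np.random.default_rng(seed)

    def euler(dW):                        # dW has shape (n_paths, n_steps)
        h = T / dW.shape[1]
        S = np.full(dW.shape[0], S0)
        for k in range(dW.shape[1]):
            S = S * (1 + r * h + sigma * dW[:, k])
        return S

    est = 0.0
    for l in range(levels):
        n = n_per_level[l]
        steps = 2 ** l
        dW = rng.normal(0.0, np.sqrt(T / steps), size=(n, steps))
        fine = euler(dW)
        if l == 0:
            est += fine.mean()            # E[P_0]
        else:
            # same Brownian path on the coarser grid: sum increment pairs
            coarse = euler(dW[:, 0::2] + dW[:, 1::2])
            est += (fine - coarse).mean() # E[P_l - P_{l-1}]
    return float(est)
```

The exact answer is S0 * exp(r T), so the telescoped estimate should land close to exp(0.05) ≈ 1.051.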
Monte Carlo scatter correction for SPECT
Liu, Zemei
The goal of this dissertation is to present a quantitatively accurate and computationally fast scatter correction method that is robust and easily accessible for routine applications in SPECT imaging. A Monte Carlo based scatter estimation method is investigated and developed further. The Monte Carlo simulation program SIMIND (Simulating Medical Imaging Nuclear Detectors) was specifically developed to simulate clinical SPECT systems. The SIMIND scatter estimation (SSE) method was developed further using a multithreading technique to distribute the scatter estimation task across multiple threads running concurrently on multi-core CPUs to accelerate the scatter estimation process. An analytical collimator that ensures less noise was used during SSE. The research includes the addition to SIMIND of charge transport modeling in cadmium zinc telluride (CZT) detectors. Phenomena associated with radiation-induced charge transport including charge trapping, charge diffusion, charge sharing between neighboring detector pixels, as well as uncertainties in the detection process are addressed. Experimental measurements and simulation studies were designed for scintillation crystal based SPECT and CZT based SPECT systems to verify and evaluate the expanded SSE method. Jaszczak Deluxe and Anthropomorphic Torso Phantoms (Data Spectrum Corporation, Hillsborough, NC, USA) were used for experimental measurements and digital versions of the same phantoms employed during simulations to mimic experimental acquisitions. This study design enabled easy comparison of experimental and simulated data. The results have consistently shown that the SSE method performed similarly or better than the triple energy window (TEW) and effective scatter source estimation (ESSE) methods for experiments on all the clinical SPECT systems. The SSE method is proven to be a viable method for scatter estimation for routine clinical use.
Preparing ZEUS-2 for Observing Run at the APEX Telescope
Dahlin, Patrick; Vishwas, Amit; Nikola, Thomas; Stacey, Gordon J.
2017-01-01
ZEUS-2 is a direct detection grating spectrometer that was designed to maximize sensitivity for the detection of the far-infrared fine-structure lines from distant star forming galaxies as they are redshifted into the short submillimeter windows. ZEUS-2 employs two NIST TES bolometer arrays as its detector: one tuned to 400 μm and the other that consists of two sub-arrays, one tuned to 215 μm and the other tuned to 645 μm. Therefore, by placing bandpass filters directly above the detector ZEUS-2 can address four telluric windows (200 μm, 350 μm, 450 μm, and 650 μm) simultaneously on extended objects, and two windows (200 and 650 μm, or 350 and 450 μm) simultaneously on point sources. ZEUS-2 has now been deployed four times on the APEX telescope in Chile and demonstrated background limited performance both at 350 and 450 μm. As part of my NSF REU experience at Cornell in the summer of 2016, I helped with testing of ZEUS-2 in the lab and improving components for its use on the telescope. This poster will cover the principles of the ZEUS-2 instrument and some of the recent scientific results.
The ATLAS collaboration
2015-01-01
The reconstruction algorithm, energy calibration, and identification methods for hadronically decaying tau leptons in ATLAS used at the start of Run-2 of the Large Hadron Collider are described in this note. All algorithms have been optimised for Run-2 conditions. The energy calibration relies on Monte Carlo samples with hadronic tau lepton decays, and applies multiplicative factors based on the pT of the reconstructed tau lepton to the energy measurements in the calorimeters. The identification employs boosted decision trees. Systematic uncertainties on the energy scale, reconstruction efficiency and identification efficiency of hadronically decaying tau leptons are determined using Monte Carlo samples that simulate varying conditions.
Energetics of bipedal running. II. Limb design and running mechanics.
Roberts, T J; Chen, M S; Taylor, C R
1998-10-01
Compared with quadrupeds, bipedal runners of the same weight have longer legs, take longer steps and can presumably use slower, more economical muscle fibers. One might predict that bipedal running is less expensive, but it is not. We hypothesized that bipeds recruit a larger volume of muscle to support their weight, eliminating the potential economy of longer legs and slower steps. To test our hypothesis, we calculated the relative volume of muscle needed to support body weight over a stride in small dogs (Canis familiaris) and wild turkeys (Meleagris gallopavo) of the same weight. First, we confirmed that turkeys and dogs use approximately the same amount of energy to run at the same speed, and found that turkeys take 1.8-fold longer steps. Higher muscle forces and/or longer muscle fibers would require a greater volume of active muscle, since muscle volume is proportional to the product of force and fascicle length. We measured both mean fascicle length and mean mechanical advantage for limb extensor muscles. Turkeys generated approximately the same total muscle force to support their weight during running and used muscle fascicles that are on average 2.1 times as long as in dogs, thus requiring a 2.5-fold greater active muscle volume. The greater volume appears to offset the economy of slower rates of force generation, supporting our hypothesis and providing a simple explanation for why it costs the same to run on two and four legs.
How Fast Can a Human Run? - Bipedal vs. Quadrupedal Running.
Kinugasa, Ryuta; Usami, Yoshiyuki
2016-01-01
Usain Bolt holds the current world record in the 100-m run, with a running time of 9.58 s, and has been described as the best human sprinter in history. However, this raises questions concerning the maximum human running speed, such as "Can the world's fastest men become faster still?" The correct answer is likely "Yes." We plotted the historical world records for bipedal and quadrupedal 100-m sprint times according to competition year. These historical records were plotted using several curve-fitting procedures. We found that the projected speeds intersected in 2048, when for the first time, the winning quadrupedal 100-m sprint time could be lower, at 9.276 s, than the winning bipedal time of 9.383 s. Video analysis revealed that in quadrupedal running, humans employed a transverse gallop with a small angular excursion. These results suggest that in the future, the fastest human on the planet might be a quadrupedal runner at the 2048 Olympics. This may be achieved by shifting up to the rotary gallop and taking longer strides with wide sagittal trunk motion.
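The extrapolation behind the 2048 prediction above is a curve-fitting exercise: fit a trend to each record series and solve for the year where the fitted curves cross. A minimal sketch with ordinary least-squares lines follows; the data points are hypothetical placeholders, not the paper's record series, and the paper itself used several fitting procedures, not only a straight line.

```python
def fit_line(years, times):
    """Ordinary least-squares fit of t = a + b * year."""
    n = len(years)
    my = sum(years) / n
    mt = sum(times) / n
    b = sum((y - my) * (t - mt) for y, t in zip(years, times)) / \
        sum((y - my) ** 2 for y in years)
    return mt - b * my, b

# Hypothetical record progressions (illustrative only): bipedal 100-m times
# improve slowly, quadrupedal times improve quickly.
bi = [(2000, 9.79), (2008, 9.69), (2009, 9.58)]
quad = [(2008, 18.58), (2012, 17.47), (2015, 15.71)]

a1, b1 = fit_line(*zip(*bi))
a2, b2 = fit_line(*zip(*quad))
# Intersection: the year at which the two fitted lines predict equal times.
cross_year = (a1 - a2) / (b2 - b1)
```

With these made-up points the lines cross in the early 2030s; with the real record series the paper reports a crossing near 2048.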
Equilibrium Statistics: Monte Carlo Methods
Kröger, Martin
Monte Carlo methods use random numbers, or ‘random’ sequences, to sample from a distribution of known shape, or to extract a distribution by other means, and, in the context of this book, to (i) generate representative equilibrated samples prior to being subjected to external fields, or (ii) evaluate high-dimensional integrals. Recipes for both topics, and some more general methods, are summarized in this chapter. It is important to realize that Monte Carlo should be as artificial as possible to be efficient and elegant. Advanced Monte Carlo ‘moves’, required to optimize the speed of algorithms for a particular problem at hand, are outside the scope of this brief introduction. One particular modern example is the wavelet-accelerated MC sampling of polymer chains [406].
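Use (ii) above, evaluating a high-dimensional integral, reduces to averaging the integrand at uniformly drawn points. A minimal sketch (standard sample-mean Monte Carlo, not any method specific to the chapter cited) is:

```python
import random

def mc_integrate(f, dim, n, seed=0):
    """Estimate the integral of f over the unit hypercube [0,1]^dim by
    averaging f at n uniformly drawn points (sample-mean Monte Carlo).
    The error shrinks like 1/sqrt(n), independent of dim."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(dim)]
        acc += f(x)
    return acc / n

# 10-dimensional example whose exact value is 0.5:
# the integral of mean(x_i) over the unit hypercube.
est = mc_integrate(lambda x: sum(x) / len(x), dim=10, n=20_000)
```

The dimension-independent convergence rate is exactly why Monte Carlo beats tensor-product quadrature once the dimension is more than a handful.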
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
[Stress fracture after changing to barefoot running].
Christensen, Mikkel
2014-12-15
Barefoot running is increasing in popularity, but little is known about its implications with respect to injuries. It has been proposed that barefoot running is associated with a decrease in running injuries as it represents a more natural way of running. A 50-year-old runner with a weekly running distance of 50 km presented with a stress fracture of the second metatarsal after six weeks of intensive barefoot running.
Monte Carlo Hamiltonian: Linear Potentials
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; LIU Jin-Jiang; HUANG Chun-Qing; JIANG Jun-Qin; Helmut KROGER
2002-01-01
We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x) = |x|/2; and an asymmetric one, V(x) = ∞ for x < 0 and V(x) = x for x ≥ 0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.
Proton Upset Monte Carlo Simulation
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
AVATAR -- Automatic variance reduction in Monte Carlo calculations
Energy Technology Data Exchange (ETDEWEB)
Van Riper, K.A.; Urbatsch, T.J.; Soran, P.D. [and others]
1997-05-01
AVATAR™ (Automatic Variance And Time of Analysis Reduction), accessed through the graphical user interface application Justine™, is a superset of MCNP™ that automatically invokes THREEDANT™ for a three-dimensional deterministic adjoint calculation on a mesh independent of the Monte Carlo geometry, calculates weight windows, and runs MCNP. Computational efficiency increases by a factor of 2 to 5 for a three-detector oil well logging tool model. Human efficiency increases dramatically, since AVATAR eliminates the need for deep intuition and hours of tedious handwork.
Monte Carlo Simulation for the MAGIC-II System
Carmona, E; Moralejo, A; Vitale, V; Sobczynska, D; Haffke, M; Bigongiari, C; Otte, N; Cabras, G; De Maria, M; De Sabata, F
2007-01-01
Within the year 2007, MAGIC will be upgraded to a two telescope system at La Palma. Its main goal is to improve the sensitivity in the stereoscopic/coincident operational mode. At the same time it will lower the analysis threshold of the currently running single MAGIC telescope. Results from the Monte Carlo simulations of this system will be discussed. A comparison of the two telescope system with the performance of one single telescope will be shown in terms of sensitivity, angular resolution and energy resolution.
DEFF Research Database (Denmark)
Frigaard, Peter; Kofoed, Jens Peter; Schlütter, F.
The objective of the workshop was a comparison between the prototype and the laboratory measurements. The emphasis is put on comparison between recorded run-up levels. Three enclosed reports present measurements and results from University of Ghent (UG)/ FCCD, Flanders Hydraulics (FH) and Aalborg...
Running and Breathing in Mammals
Bramble, Dennis M.; Carrier, David R.
1983-01-01
Mechanical constraints appear to require that locomotion and breathing be synchronized in running mammals. Phase locking of limb and respiratory frequency has now been recorded during treadmill running in jackrabbits and during locomotion on solid ground in dogs, horses, and humans. Quadrupedal species normally synchronize the locomotor and respiratory cycles at a constant ratio of 1:1 (strides per breath) in both the trot and gallop. Human runners differ from quadrupeds in that while running they employ several phase-locked patterns (4:1, 3:1, 2:1, 1:1, 5:2, and 3:2), although a 2:1 coupling ratio appears to be favored. Even though the evolution of bipedal gait has reduced the mechanical constraints on respiration in man, thereby permitting greater flexibility in breathing pattern, it has seemingly not eliminated the need for the synchronization of respiration and body motion during sustained running. Flying birds have independently achieved phase-locked locomotor and respiratory cycles. This hints that strict locomotor-respiratory coupling may be a vital factor in the sustained aerobic exercise of endothermic vertebrates, especially those in which the stresses of locomotion tend to deform the thoracic complex.
Kagan, Michael; The ATLAS collaboration
2015-01-01
Title: Searches for di-Higgs production in 4b final states and new phenomena with boosted Higgs using the ATLAS detector at LHC Run I Abstract: Measurement of Higgs boson pair production has a fundamental importance in understanding the nature of the Higgs boson and electroweak symmetry breaking. TeV-scale resonances decaying to a pair of Higgs bosons are also predicted in various extensions of the Standard Model, e.g., Kaluza-Klein excitations of the graviton in the bulk Randall-Sundrum extra dimensions, and heavy scalar particles in two-Higgs-doublet models. This talk highlights ATLAS Run I searches for di-Higgs production in 4b final states with resolved topology using small-radius jets and boosted topology using large-radius jets with associated b-tagged track-jets. Other Run I searches employing techniques to identify boosted Higgs bosons are also presented in this talk. Title: Searches for vector-like quarks and resonances decaying into top-quarks with the ATLAS detector at LHC Run I Abstract: In theories ...
Running gratings in photoconductive materials
DEFF Research Database (Denmark)
Kukhtarev, N. V.; Kukhtareva, T.; Lyuksyutov, S. F.
2005-01-01
Starting from the three-dimensional version of a standard photorefractive model (STPM), we obtain a reduced compact Set of equations for an electric field based on the assumption of a quasi-steady-state fast recombination. The equations are suitable for evaluation of a current induced by running...
Usage of burnt fuel isotopic compositions from engineering codes in Monte-Carlo code calculations
Energy Technology Data Exchange (ETDEWEB)
Aleshin, Sergey S.; Gorodkov, Sergey S.; Shcherenko, Anna I. [Nuclear Research Centre ' ' Kurchatov Institute' ' , Moscow (Russian Federation)
2015-09-15
A burn-up calculation of VVER cores by a Monte-Carlo code is a complex process that requires large computational costs, which complicates the use of Monte-Carlo codes for project and operating calculations. Previously prepared isotopic compositions are proposed for use in Monte-Carlo code (MCU) calculations of different states of a VVER core with burnt fuel. The isotopic compositions are calculated by an approximation method based on a spectral functionality and reference isotopic compositions calculated by engineering codes (TVS-M, PERMAK-A). The multiplication factors and power distributions of FAs and a VVER of infinite height are calculated in this work by the Monte-Carlo code MCU using the previously prepared isotopic compositions. The MCU calculation data were compared with the data obtained by the engineering codes.
Run-to-run product quality control of batch processes
Institute of Scientific and Technical Information of China (English)
JIA Li; SHI Ji-ping; CHENG Da-shuai; CHIU Min-sen
2009-01-01
Batch processes have been increasingly used in the production of low volume and high value added products. Consequently, optimization control in batch processes is crucial in order to derive the maximum benefit. In this paper, a run-to-run product quality control scheme based on iterative learning optimization control is developed. Moreover, a rigorous theorem is proposed and proven, which states that the tracking error under the optimal iterative learning control (ILC) law converges to zero. A typical nonlinear batch continuous stirred tank reactor (CSTR) is considered, and the results show that the trajectory-tracking performance is gradually improved by the ILC.
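The run-to-run ILC idea described above can be sketched on a deliberately simple scalar "batch": the outcome is y = g·u, and after each run the input is corrected by the tracking error, u_{k+1} = u_k + L·e_k. This toy (all gains invented, far simpler than the paper's CSTR) converges whenever |1 − g·L| < 1, and the tracking error then decays geometrically to zero, mirroring the convergence theorem the abstract mentions.

```python
def ilc_run_to_run(g=0.8, learning_gain=1.0, r=1.0, runs=20):
    """Toy run-to-run iterative learning control for a scalar batch
    quality y = g*u.  After each batch the input is corrected by the
    tracking error: u_{k+1} = u_k + L*e_k.  Since e_{k+1} = (1 - g*L)*e_k,
    the error decays geometrically when |1 - g*L| < 1."""
    u, errors = 0.0, []
    for _ in range(runs):
        y = g * u                    # batch outcome
        e = r - y                    # product-quality tracking error
        errors.append(abs(e))
        u = u + learning_gain * e    # run-to-run ILC update
    return errors

errs = ilc_run_to_run()
```

Here 1 − g·L = 0.2, so each run shrinks the error fivefold regardless of the initial guess for u.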
Monte Carlo Particle Lists: MCPL
Kittelmann, Thomas; Knudsen, Erik B; Willendrup, Peter; Cai, Xiao Xiao; Kanaki, Kalliopi
2016-01-01
A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
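The core of such an interchange format is a fixed-size binary record per particle that any simulation package can read back. The sketch below shows the general pattern with Python's `struct` module; the field layout here is invented for illustration and is NOT the actual MCPL record format (MCPL ships its own portable C API for that).

```python
import struct
import io

# Hypothetical fixed-size record (NOT the real MCPL layout): an integer
# particle-type code followed by position (x,y,z), direction (ux,uy,uz),
# kinetic energy, and statistical weight, all little-endian.
RECORD = struct.Struct("<i8d")

def write_particles(stream, particles):
    """Append packed particle records to a binary stream."""
    for p in particles:
        stream.write(RECORD.pack(p[0], *p[1:]))

def read_particles(stream):
    """Read records back until the stream is exhausted."""
    out = []
    while chunk := stream.read(RECORD.size):
        code, *fields = RECORD.unpack(chunk)
        out.append((code, *fields))
    return out

buf = io.BytesIO()
# One neutron-like particle: at the origin, travelling along +z,
# 2.5 (arbitrary energy units), weight 1.
write_particles(buf, [(2112, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 2.5, 1.0)])
buf.seek(0)
particles = read_particles(buf)
```

A fixed record size is what makes such files cheap to seek, concatenate, and stream between codes without a parser.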
Running free: embracing a healthy lifestyle through distance running.
Shipway, Richard; Holloway, Immy
2010-11-01
Sport and leisure activity contribute to both health and quality of life. There is a dearth of qualitative studies on the lived experiences of active people, so the aim of this paper is to develop a deeper understanding of the experiences of one particular group of active leisure participants, distance runners, and to highlight the associated health and well-being benefits that result from participating in this increasingly popular form of active leisure. In doing so, this paper will briefly explore the potential opportunities and implications for sport and leisure policy and provision, and highlight examples of how distance running could positively contribute towards government objectives linked to tackling obesity levels, healthy living and physical well-being. It is suggested that similar benefits also exist across other forms of physical activity, exercise and sport. Qualitative methods of enquiry were adopted to understand the nature of the social world of long distance runners through interviews and observations, which were thematically analyzed. One of the key themes emerging from the data was the desire to embrace a healthy lifestyle, which then led to the emergence of four main sub-themes. The first was linked to the importance of seeking self-esteem and confirmation through running; second, an investigation of a selection of negative aspects associated with exercise addiction; third, the need to exercise among sport and leisure participants; and finally, an understanding of the concept of the 'running body'. Cautionary notes also identified negative aspects associated with exercise and physical activity. The findings highlight the potential role that distance running can play as an easily accessible and enjoyable leisure activity, one that can help facilitate increased participation in exercise and physical activity as an integral part of an active and healthy lifestyle.
Effects of running velocity on running kinetics and kinematics.
Brughelli, Matt; Cronin, John; Chaouachi, Anis
2011-04-01
Sixteen semiprofessional Australian football players performed running bouts at incremental velocities of 40, 60, 80, and 100% of their maximum velocity on a Woodway nonmotorized force treadmill. As running velocity increased from 40 to 60%, peak vertical and peak horizontal forces increased by 14.3% (effect size [ES] = 1.0) and 34.4% (ES = 4.2), respectively. The changes in peak vertical and peak horizontal forces from 60 to 80% were 1.0% (ES = 0.05) and 21.0% (ES = 2.9), respectively. Finally, the changes in peak vertical and peak horizontal forces from 80% to maximum were 2.0% (ES = 0.1) and 24.3% (ES = 3.4). In addition, both stride frequency and stride length significantly increased with each incremental velocity. Of the kinetic variables, only the horizontal measure showed a significant positive correlation with maximum running velocity (r = 0.47). For the kinematic variables, only stride length was found to have a significant positive correlation with maximum running velocity (r = 0.66). It would seem that increasing maximal sprint velocity may be more dependent on horizontal force production as opposed to vertical force production.
Applications of Monte Carlo Methods in Calculus.
Gordon, Sheldon P.; Gordon, Florence S.
1990-01-01
Discusses the application of probabilistic ideas, especially Monte Carlo simulation, to calculus. Describes some applications of the Monte Carlo method: Riemann sums; maximizing and minimizing a function; mean value theorems; and testing conjectures. (YP)
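One of the classroom applications listed above, maximizing a function, reduces to evaluating the function at many random points and keeping the best. A minimal sketch (function and bounds chosen for illustration) is:

```python
import random

def mc_maximize(f, a, b, n=50_000, seed=1):
    """Monte Carlo search for the maximum of f on [a, b]: evaluate f at
    n uniformly random points and keep the best, in the spirit of the
    calculus-classroom examples above."""
    rng = random.Random(seed)
    best_x = best_f = None
    for _ in range(n):
        x = a + (b - a) * rng.random()
        fx = f(x)
        if best_f is None or fx > best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Example: f(x) = x(1 - x) has its maximum 0.25 at x = 0.5.
x_star, f_star = mc_maximize(lambda x: x * (1 - x), 0.0, 1.0)
```

The same loop with `<` instead of `>` minimizes, and summing f(x)·(b−a)/n over the draws gives the Riemann-sum application from the same list.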
Parallel Monte Carlo Simulation of Aerosol Dynamics
Directory of Open Access Journals (Sweden)
Kun Zhou
2014-02-01
A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Near 60% parallel efficiency is achieved for the maximum testing case with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified through simulating various testing cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, low order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high order moments of the PSD needs a dramatically increased number of MC particles.
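The stochastic part of the algorithm above, Marcus-Lushnikov coagulation, can be sketched serially with a Gillespie-style event loop: with a constant kernel, the total coagulation rate depends only on the particle count, a uniformly random pair merges at each event, and the waiting time between events is exponential. This toy (constant kernel, no nucleation/growth, no MPI) is only the coagulation step, not the paper's full operator-split parallel scheme.

```python
import random

def coagulate(masses, kernel_const, t_end, volume=1.0, seed=0):
    """Minimal Marcus-Lushnikov coagulation with a constant kernel K:
    total rate = K * n(n-1) / (2V); at each event a uniformly random
    pair merges, conserving mass; inter-event times are exponential."""
    rng = random.Random(seed)
    masses = list(masses)
    t = 0.0
    while len(masses) > 1:
        n = len(masses)
        rate = kernel_const * n * (n - 1) / (2.0 * volume)
        t += rng.expovariate(rate)
        if t > t_end:
            break
        i, j = rng.sample(range(n), 2)
        masses[i] += masses[j]   # merge j into i (mass conserved)
        masses.pop(j)
    return masses

final = coagulate([1.0] * 500, kernel_const=0.01, t_end=1.0)
```

For a constant kernel the mean-field count follows n(t) = n0 / (1 + n0·K·t / 2V), so 500 monomers here shrink to roughly 140 particles by t = 1 while total mass stays exactly 500.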
The ATLAS collaboration
2016-01-01
The ATLAS RunTimeTester is a job-based software test system. The RunTimeTester runs jobs, and optional tests on the job outputs. Job and test results are reported via a web site. The system currently runs ≈ 8000 jobs daily, and the web site receives ≈ 25K hits a week. This note provides an overview of the system.
Preventing Running Injuries through Barefoot Activity
Hart, Priscilla M.; Smith, Darla R.
2008-01-01
Running has become a very popular lifetime physical activity even though there are numerous reports of running injuries. Although common theories have pointed to impact forces and overpronation as the main contributors to chronic running injuries, the increased use of cushioning and orthotics has done little to decrease running injuries. A new…
Effect of Minimalist Footwear on Running Efficiency
Gillinov, Stephen M.; Laux, Sara; Kuivila, Thomas; Hass, Daniel; Joy, Susan M
2015-01-01
Background: Although minimalist footwear is increasingly popular among runners, claims that minimalist footwear enhances running biomechanics and efficiency are controversial. Hypothesis: Minimalist and barefoot conditions improve running efficiency when compared with traditional running shoes. Study Design: Randomized crossover trial. Level of Evidence: Level 3. Methods: Fifteen experienced runners each completed three 90-second running trials on a treadmill, each trial performed in a differ...
(U) Introduction to Monte Carlo Methods
Energy Technology Data Exchange (ETDEWEB)
Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
Piketty, Thomas; Saez, Emmanuel
2014-05-23
This Review presents basic facts regarding the long-run evolution of income and wealth inequality in Europe and the United States. Income and wealth inequality was very high a century ago, particularly in Europe, but dropped dramatically in the first half of the 20th century. Income inequality has surged back in the United States since the 1970s so that the United States is much more unequal than Europe today. We discuss possible interpretations and lessons for the future.
Running Servers around Zero Degrees
Pervilä, Mikko; Kangasharju, Jussi
2010-01-01
Data centers are a major consumer of electricity and a significant fraction of their energy use is devoted to cooling the data center. Recent prototype deployments have investigated the possibility of using outside air for cooling and have shown large potential savings in energy consumption. In this paper, we push this idea to the extreme, by running servers outside in Finnish winter. Our results show that commercial, off-the-shelf computer equipment can tolerate extreme conditions such as ou...
Energy Technology Data Exchange (ETDEWEB)
Southern Company Services, Inc.
2003-08-01
This report discusses test campaign TC06 of the Kellogg Brown & Root, Inc. (KBR) Transport Reactor train with a Siemens Westinghouse Power Corporation (Siemens Westinghouse) particle filter system at the Power Systems Development Facility (PSDF) located in Wilsonville, Alabama. The Transport Reactor is an advanced circulating fluidized-bed reactor designed to operate as either a combustor or a gasifier using a particulate control device (PCD). The Transport Reactor was operated as a pressurized gasifier during TC06. Test run TC06 was started on July 4, 2001, and completed on September 24, 2001, with an interruption in service between July 25, 2001, and August 19, 2001, due to a filter element failure in the PCD caused by abnormal operating conditions while tuning the main air compressor. The reactor temperature was varied between 1,725 and 1,825 F at pressures from 190 to 230 psig. In TC06, 1,214 hours of solid circulation and 1,025 hours of coal feed were attained with 797 hours of coal feed after the filter element failure. Both reactor and PCD operations were stable during the test run with a stable baseline pressure drop. Due to its length and stability, the TC06 test run provided valuable data necessary to analyze long-term reactor operations and to identify necessary modifications to improve equipment and process performance as well as progressing the goal of many thousands of hours of filter element exposure.
CERN Bulletin
2010-01-01
Last week, the Chamonix workshop once again proved its worth as a place where all the stakeholders in the LHC can come together, take difficult decisions and reach a consensus on important issues for the future of particle physics. The most important decision we reached last week is to run the LHC for 18 to 24 months at a collision energy of 7 TeV (3.5 TeV per beam). After that, we’ll go into a long shutdown in which we’ll do all the necessary work to allow us to reach the LHC’s design collision energy of 14 TeV for the next run. This means that when beams go back into the LHC later this month, we’ll be entering the longest phase of accelerator operation in CERN’s history, scheduled to take us into summer or autumn 2011. What led us to this conclusion? Firstly, the LHC is unlike any previous CERN machine. Because it is a cryogenic facility, each run is accompanied by lengthy cool-down and warm-up phases. For that reason, CERN’s traditional &...
Mike Lamont for the LHC Team
2011-01-01
The current LHC ion run has been progressing very well. The first fill with 358 bunches per beam - the maximum number for the year - was on Tuesday, 15 November and was followed by an extended period of steady running. The quality of the beam delivered by the heavy-ion injector chain has been excellent, and this is reflected in both the peak and the integrated luminosity. The peak luminosity in ATLAS reached 5x1026 cm-2s-1, which is a factor of ~16 more than last year's peak of 3x1025 cm-2s-1. The integrated luminosity in each of ALICE, ATLAS and CMS is now around 100 inverse microbarn, already comfortably over the nominal target for the run. The polarity of the ALICE spectrometer and solenoid magnets was reversed on Monday, 28 November with the aim of delivering another sizeable amount of luminosity in this configuration. On the whole, the LHC has been behaving very well recently, ensuring good machine availability. On Monday evening, however, a faulty level sensor in the cooling towe...
Understanding the T2 traffic in CMS during Run-1
T, Wildish
2015-01-01
In the run-up to Run-1, CMS was operating its facilities according to the MONARC model, where data-transfers were strictly hierarchical in nature. Direct transfers between Tier-2 nodes were excluded, being perceived as operationally intensive and risky in an era where the network was expected to be a major source of errors. By the end of Run-1, wide-area networks were more capable and stable than originally anticipated. The original data-placement model was relaxed, and traffic was allowed between Tier-2 nodes. Tier-2 to Tier-2 traffic in 2012 already exceeded the amount of Tier-2 to Tier-1 traffic, so it clearly has the potential to become important in the future. Moreover, while Tier-2 to Tier-1 traffic is mostly upload of Monte Carlo data, the Tier-2 to Tier-2 traffic represents data moved in direct response to requests from the physics analysis community. As such, problems or delays there are more likely to have a direct impact on the user community. Tier-2 to Tier-2 traffic may also traverse parts of the WAN ...
Prospects of LHC Higgs Physics at the end of Run III
Chen, Xin; The ATLAS collaboration
2017-01-01
The document is prepared for the LCWS2016 conference proceedings. The expected status of Higgs physics at the end of Run-3 is presented. The current Run-2 status is briefly reviewed, and the expected Higgs reach after the HL-LHC period is also summarized for some channels.
Geometric Templates for Improved Tracking Performance in Monte Carlo Codes
Nease, Brian R.; Millman, David L.; Griesheimer, David P.; Gill, Daniel F.
2014-06-01
One of the most fundamental parts of a Monte Carlo code is its geometry kernel. This kernel not only affects particle tracking (i.e., run-time performance), but also shapes how users will input models and collect results for later analyses. A new framework based on geometric templates is proposed that optimizes performance (in terms of tracking speed and memory usage) and simplifies user input for large scale models. While some aspects of this approach currently exist in different Monte Carlo codes, the optimization aspect has not been investigated or applied. If Monte Carlo codes are to be realistically used for full core analysis and design, this type of optimization will be necessary. This paper describes the new approach and the implementation of two template types in MC21: a repeated ellipse template and a box template. Several different models are tested to highlight the performance gains that can be achieved using these templates. Though the exact gains are naturally problem dependent, results show that runtime and memory usage can be significantly reduced when using templates, even as problems reach realistic model sizes.
Density matrix quantum Monte Carlo
Blunt, N S; Spencer, J S; Foulkes, W M C
2013-01-01
This paper describes a quantum Monte Carlo method capable of sampling the full density matrix of a many-particle system, thus granting access to arbitrary reduced density matrices and allowing expectation values of complicated non-local operators to be evaluated easily. The direct sampling of the density matrix also raises the possibility of calculating previously inaccessible entanglement measures. The algorithm closely resembles the recently introduced full configuration interaction quantum Monte Carlo method, but works all the way from infinite to zero temperature. We explain the theory underlying the method, describe the algorithm, and introduce an importance-sampling procedure to improve the stochastic efficiency. To demonstrate the potential of our approach, the energy and staggered magnetization of the isotropic antiferromagnetic Heisenberg model on small lattices and the concurrence of one-dimensional spin rings are compared to exact or well-established results. Finally, the nature of the sign problem...
Efficient kinetic Monte Carlo simulation
Schulze, Tim P.
2008-02-01
This paper concerns kinetic Monte Carlo (KMC) algorithms that have a single-event execution time independent of the system size. Two methods are presented—one that combines the use of inverted-list data structures with rejection Monte Carlo and a second that combines inverted lists with the Marsaglia-Norman-Cannon algorithm. The resulting algorithms apply to models with rates that are determined by the local environment but are otherwise arbitrary, time-dependent and spatially heterogeneous. While especially useful for crystal growth simulation, the algorithms are presented from the point of view that KMC is the numerical task of simulating a single realization of a Markov process, allowing application to a broad range of areas where heterogeneous random walks are the dominant simulation cost.
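The rejection ingredient above can be sketched in a few lines: pick a site uniformly, accept it with probability rate/r_max, and advance time by an exponential drawn at the bound n·r_max. Each attempt (accepted or not) then costs O(1), independent of system size. This is textbook rejection KMC, not the paper's inverted-list variants, and the rate table is invented for illustration.

```python
import random

def rejection_kmc(rates, t_end, seed=0):
    """Rejection kinetic Monte Carlo: attempt events at the bounding
    rate n*r_max; pick a site uniformly and accept with probability
    rate_i / r_max, so site i fires at its true rate rate_i.
    Returns the accepted-event count per site up to time t_end."""
    rng = random.Random(seed)
    n, r_max = len(rates), max(rates)
    t, events = 0.0, [0] * n
    while True:
        t += rng.expovariate(n * r_max)   # inter-attempt time at the bound
        if t > t_end:
            return events
        i = rng.randrange(n)
        if rng.random() < rates[i] / r_max:   # accept proportional to rate
            events[i] += 1

counts = rejection_kmc([1.0, 2.0, 4.0], t_end=2000.0)
```

Because site i fires at effective rate n·r_max · (1/n) · (rate_i/r_max) = rate_i, the event counts come out proportional to the rates; the price of the O(1) step is wasted attempts when rates are very heterogeneous, which is what the inverted-list machinery in the paper mitigates.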
Monte Carlo modelling of TRIGA research reactor
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-10-01
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively support the various fields of basic nuclear research, manpower training, and production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparison with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents all components of the core in detail, with essentially no physical approximation. Continuous energy cross-section data from recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking against the TRIGA experiments. Core excess reactivity, total and integral control rod worths, as well as power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.
Ferrauti, Alexander; Bergermann, Matthias; Fernandez-Fernandez, Jaime
2010-10-01
The purpose of this study was to investigate the effects of a concurrent strength and endurance training program on running performance and running economy of middle-aged runners during their marathon preparation. Twenty-two (8 women and 14 men) recreational runners (mean ± SD: age 40.0 ± 11.7 years; body mass index 22.6 ± 2.1 kg·m⁻²) were separated into 2 groups (n = 11 each; combined endurance running and strength training program [ES]: 9 men, 2 women; endurance running only [E]: 7 men, 4 women). Both completed an 8-week intervention period that consisted of either endurance training (E: 276 ± 108 minutes running per week) or a combined endurance and strength training program (ES: 240 ± 121 minutes running plus 2 strength training sessions per week [120 minutes]). Strength training focused on the trunk (strength endurance program) and leg muscles (high-intensity program). Before and after the intervention, subjects completed an incremental treadmill run and maximal isometric strength tests. The initial values for VO2peak (ES: 52.0 ± 6.1 vs. E: 51.1 ± 7.5 ml·kg⁻¹·min⁻¹) and anaerobic threshold (ES: 3.5 ± 0.4 vs. E: 3.4 ± 0.5 m·s⁻¹) were comparable in both groups. A significant time × intervention effect was found for maximal isometric force of knee extension (ES: from 4.6 ± 1.4 to 6.2 ± 1.0 N·kg⁻¹, p ...); stride frequency also remained unchanged. The results suggest no benefit of 8 weeks of concurrent strength training for running economy and coordination of recreational marathon runners despite a clear improvement in leg strength, possibly because of an insufficient sample size or too short an intervention period.
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes the multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL⁻³) for a single level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).
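The uniform-grid multilevel idea this work builds on can be sketched as a toy estimator for E[X_T] of geometric Brownian motion: each level adds the expected difference between a fine and a coarse Euler path driven by the same Brownian increments. All parameter values below are illustrative, and this is the plain (non-adaptive) method, not the paper's adaptive variant.

```python
import math
import random

def euler_gbm(T, n, rng, mu=0.05, sigma=0.2, x0=1.0):
    """Forward Euler for dX = mu*X dt + sigma*X dW with n steps.
    Returns terminal values of the fine path and of a coarse path with
    n/2 steps built from the same Brownian increments (the coupling that
    makes the level corrections have small variance)."""
    dt = T / n
    xf = xc = x0
    dw_pair = 0.0
    for k in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))
        xf += mu * xf * dt + sigma * xf * dw
        dw_pair += dw
        if k % 2 == 1:          # every two fine steps = one coarse step
            xc += mu * xc * (2 * dt) + sigma * xc * dw_pair
            dw_pair = 0.0
    return xf, xc

def mlmc(T=1.0, L=5, samples=4000, seed=1):
    """Telescoping sum E[P_0] + sum_l E[P_l - P_(l-1)] over L+1 levels."""
    rng = random.Random(seed)
    est = 0.0
    for level in range(L + 1):
        n = 2 ** (level + 1)
        acc = 0.0
        for _ in range(samples):
            xf, xc = euler_gbm(T, n, rng)
            acc += xf if level == 0 else xf - xc
        est += acc / samples
    return est

estimate = mlmc()   # exact answer is exp(mu*T) ≈ 1.051
```

In the full method the sample count per level is chosen from the estimated variances rather than held constant, which is where the cost savings quoted in the abstract come from.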
LHCb silicon detectors: the Run 1 to Run 2 transition and first experience of Run 2
Rinnert, Kurt
2015-01-01
LHCb is a dedicated experiment to study New Physics in the decays of heavy hadrons at the Large Hadron Collider (LHC) at CERN. The detector includes a high precision tracking system consisting of a silicon-strip vertex detector (VELO) surrounding the pp interaction region, a large- area silicon-strip detector located upstream of a dipole magnet (TT), and three stations of silicon- strip detectors (IT) and straw drift tubes placed downstream (OT). The operational transition of the silicon detectors VELO, TT and IT from LHC Run 1 to Run 2 and first Run 2 experiences will be presented. During the long shutdown of the LHC the silicon detectors have been maintained in a safe state and operated regularly to validate changes in the control infrastructure, new operational procedures, updates to the alarm systems and monitoring software. In addition, there have been some infrastructure related challenges due to maintenance performed in the vicinity of the silicon detectors that will be discussed. The LHCb silicon dete...
ATLAS computing challenges before the next LHC run
Barberis, D; The ATLAS collaboration
2014-01-01
ATLAS software and computing is in a period of intensive evolution. The current long shutdown presents an opportunity to assimilate lessons from the very successful Run 1 (2009-2013) and to prepare for the substantially increased computing requirements for Run 2 (from spring 2015). Run 2 will bring a near doubling of the energy and the data rate, high event pile-up levels, and higher event complexity from detector upgrades, meaning the number and complexity of events to be analyzed will increase dramatically. At the same time operational loads must be reduced through greater automation, a wider array of opportunistic resources must be supported, costly storage must be used with greater efficiency, a sophisticated new analysis model must be integrated, and concurrency features of new processors must be exploited. This paper surveys the distributed computing aspects of the upgrade program and the plans for 2014 to exercise the new capabilities in a large scale Data Challenge.
Barefoot running: does it prevent injuries?
Murphy, Kelly; Curry, Emily J; Matzkin, Elizabeth G
2013-11-01
Endurance running has evolved over the course of millions of years and it is now one of the most popular sports today. However, the risk of stress injury in distance runners is high because of the repetitive ground impact forces exerted. These injuries are not only detrimental to the runner, but also place a burden on the medical community. Preventative measures are essential to decrease the risk of injury within the sport. Common running injuries include patellofemoral pain syndrome, tibial stress fractures, plantar fasciitis, and Achilles tendonitis. Barefoot running, as opposed to shod running (with shoes), has recently received significant attention in both the media and the market place for the potential to promote the healing process, increase performance, and decrease injury rates. However, there is controversy over the use of barefoot running to decrease the overall risk of injury secondary to individual differences in lower extremity alignment, gait patterns, and running biomechanics. While barefoot running may benefit certain types of individuals, differences in running stance and individual biomechanics may actually increase injury risk when transitioning to barefoot running. The purpose of this article is to review the currently available clinical evidence on barefoot running and its effectiveness for preventing injury in the runner. Based on a review of current literature, barefoot running is not a substantiated preventative running measure to reduce injury rates in runners. However, barefoot running utility should be assessed on an athlete-specific basis to determine whether barefoot running will be beneficial.
40 CFR 258.26 - Run-on/run-off control systems.
2010-07-01
40 CFR Protection of Environment, pt. 258 (2010-07-01), CRITERIA FOR MUNICIPAL SOLID WASTE LANDFILLS, Operating Criteria, § 258.26 Run-on/run-off control systems: (a) Owners or operators of all MSWLF units must design, construct, and maintain: (1) A run-on control system...
Fatigue associated with prolonged graded running.
Giandolini, Marlene; Vernillo, Gianluca; Samozino, Pierre; Horvais, Nicolas; Edwards, W Brent; Morin, Jean-Benoît; Millet, Guillaume Y
2016-10-01
Scientific experiments on running mainly consider level running. However, the magnitude and etiology of fatigue depend on the exercise under consideration, particularly the predominant type of contraction, which differs between level, uphill, and downhill running. The purpose of this review is to comprehensively summarize the neurophysiological and biomechanical changes due to fatigue in graded running. When comparing prolonged hilly running (i.e., a combination of uphill and downhill running) to level running, it is found that (1) the general shape of the neuromuscular fatigue-exercise duration curve as well as the etiology of fatigue in knee extensor and plantar flexor muscles are similar and (2) the biomechanical consequences are also relatively comparable, suggesting that duration rather than elevation change affects neuromuscular function and running patterns. However, 'pure' uphill or downhill running has several fatigue-related intrinsic features compared with level running. Downhill running induces severe lower limb tissue damage, indirectly evidenced by massive increases in plasma creatine kinase/myoglobin concentration or inflammatory markers. In addition, low-frequency fatigue (i.e., excitation-contraction coupling failure) is systematically observed after downhill running, although it has also been found in high-intensity uphill running for different reasons. Indeed, low-frequency fatigue in downhill running is attributed to mechanical stress at the sarcoplasmic reticulum/T-tubule interface, while inorganic phosphate accumulation probably plays a central role in intense uphill running. Other fatigue-related specificities of graded running, such as strategies to minimize the deleterious effects of downhill running on muscle function, differences in energy cost versus heat storage, or muscle activity changes in downhill, level, and uphill running, are also discussed.
Ferrate, Andres
2010-01-01
Catch Google Wave, the revolutionary Internet protocol and web service that lets you communicate and collaborate in realtime. With this book, you'll understand how Google Wave integrates email, instant messaging (IM), wiki, and social networking functionality into a powerful and extensible platform. You'll also learn how to use its features, customize its functions, and build sophisticated extensions with Google Wave's open APIs and network protocol. Written for everyone -- from non-techies to ninja coders -- Google Wave: Up and Running provides a complete tour of this complex platform. You'
ATLAS Collaboration; The ATLAS collaboration
2016-01-01
While the Standard Model is in good shape, especially after the Higgs boson discovery, many questions beyond the SM remain. The ATLAS detector is performing about 50 Exotics searches addressing these questions. This talk discusses some of them, using datasets collected during the 2015-2016 LHC run, from 3 fb^-1 to 18 fb^-1 of proton-proton collisions at 13 TeV centre-of-mass energy. Results on searches for resonances decaying into vector bosons or fermions, for vector-like quarks, for dark matter, and for other new phenomena using these data will be presented.
Pilgrim, Mark
2010-01-01
If you don't know about the new features available in HTML5, now's the time to find out. This book provides practical information about how and why the latest version of this markup language will significantly change the way you develop for the Web. HTML5 is still evolving, yet browsers such as Safari, Mozilla, Opera, and Chrome already support many of its features -- and mobile browsers are even farther ahead. HTML5: Up & Running carefully guides you through the important changes in this version with lots of hands-on examples, including markup, graphics, and screenshots. You'll learn how to
Variable Joint Elasticities in Running
Peter, Stephan; Grimmer, Sten; Lipfert, Susanne W.; Seyfarth, Andre
In this paper we investigate how spring-like leg behavior in human running is represented at joint level. We assume linear torsion springs in the joints and between the knee and the ankle joint. Using experimental data of the leg dynamics we compute how the spring parameters (stiffness and rest angles) change during gait cycle. We found that during contact the joints reveal elasticity with strongly changing parameters and compare the changes of different parameters for different spring arrangements. The results may help to design and improve biologically inspired spring mechanisms with adjustable parameters.
2001-01-01
Over forty years ago, the PS train entered service to steer the magnets of the accelerator into place... ... a service that was resumed last Tuesday. Left to right: Raymond Brown (CERN), Claude Tholomier (D.B.S.), Marcel Genolin (CERN), Gérard Saumade (D.B.S.), Ingo Ruehl (CERN), Olivier Carlier (D.B.S.), Patrick Poisot (D.B.S.), Christian Recour (D.B.S.). It is more than ten years since people at CERN heard the rumbling of the old PS train's steel wheels. Last Tuesday, the locomotive came back into service to be tested. It is nothing like the monstrous steel engines still running on conventional railways -just a small electric battery-driven vehicle employed on installing the magnets for the PS accelerator more than 40 years ago. To do so, it used the tracks that run round the accelerator. In fact, it is the grandfather of the LEP monorail. After PS was commissioned in 1959, the little train was used more and more rarely. This is because magnets never break down, or hardly ever! In fact, the loc...
Quantifying Monte Carlo uncertainty in ensemble Kalman filter
Energy Technology Data Exchange (ETDEWEB)
Thulin, Kristian; Naevdal, Geir; Skaug, Hans Julius; Aanonsen, Sigurd Ivar
2009-01-15
This report presents results obtained during Kristian Thulin's PhD study, and is a slightly modified form of a paper submitted to SPE Journal. Kristian Thulin did most of his portion of the work while a PhD student at CIPR, University of Bergen. The ensemble Kalman filter (EnKF) is currently considered one of the most promising methods for conditioning reservoir simulation models to production data. The EnKF is a sequential Monte Carlo method based on a low rank approximation of the system covariance matrix. The posterior probability distribution of model variables may be estimated from the updated ensemble, but because of the low rank covariance approximation, the updated ensemble members become correlated samples from the posterior distribution. We suggest using multiple EnKF runs, each with smaller ensemble size, to obtain truly independent samples from the posterior distribution. This allows a point-wise confidence interval for the posterior cumulative distribution function (CDF) to be constructed. We present a methodology for finding an optimal combination of ensemble batch size (n) and number of EnKF runs (m) while keeping the total number of ensemble members (m x n) constant. The optimal combination of n and m is found by minimizing the integrated mean square error (MSE) for the CDFs, and we choose to define an EnKF run with 10,000 ensemble members as having zero Monte Carlo error. The methodology is tested on a simplistic, synthetic 2D model, but should be applicable also to larger, more realistic models. (author). 12 refs., figs., tabs.
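The multiple-runs idea above can be illustrated with a toy stand-in: run m independent sampling batches of size n, compute the empirical CDF at a point in each, and build a pointwise confidence interval from their spread. The standard normal here is a placeholder for the EnKF posterior, and all sizes are illustrative.

```python
import random
import statistics

def empirical_cdf(samples, x):
    """Fraction of samples at or below x."""
    return sum(s <= x for s in samples) / len(samples)

def cdf_confidence(m=20, n=500, x=0.0, seed=7):
    """Pointwise ~95% confidence interval for F(x) from m independent
    batches of n draws each (here: standard normal draws, so the true
    value at x = 0 is exactly 0.5)."""
    rng = random.Random(seed)
    estimates = [empirical_cdf([rng.gauss(0.0, 1.0) for _ in range(n)], x)
                 for _ in range(m)]
    mean = statistics.mean(estimates)
    sem = statistics.stdev(estimates) / m ** 0.5
    return mean - 1.96 * sem, mean + 1.96 * sem

lo, hi = cdf_confidence()
```

The report's trade-off is visible here: for a fixed budget m × n, larger m tightens the interval estimate of the Monte Carlo error while larger n reduces the per-batch bias.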
Run-up on Offshore Windturbine Foundations
DEFF Research Database (Denmark)
De Vos, Leen; Larsen, Brian Juul; Frigaard, Peter
For the present report a test programme has been performed to determine the run-up on offshore wind turbine foundations.
Running Parallel Discrete Event Simulators on Sierra
Energy Technology Data Exchange (ETDEWEB)
Barnes, P. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jefferson, D. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-12-03
In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.
Is Running Bad for Your Knees?
https://medlineplus.gov/news/fullstory_162903.html — Study suggests it may ... THURSDAY, Jan. 5, 2017 (HealthDay News) -- Everybody believes running can leave you sore and swollen, right? Well, ...
LCG MCDB—a knowledgebase of Monte-Carlo simulated events
Belov, S.; Dudko, L.; Galkin, E.; Gusev, A.; Pokorski, W.; Sherstnev, A.
2008-02-01
) generators to prepare the events. For example, the same MC samples of Standard Model (SM) processes can be employed for investigations either in SM analyses (as a signal) or in searches for new phenomena in Beyond Standard Model analyses (as a background). If the samples are made publicly available and equipped with corresponding and comprehensive documentation, this can speed up cross checks of the samples themselves and of the physical models applied. Some event samples require a lot of computing resources to prepare, so a central storage of the samples prevents possible waste of researcher time and computing resources that would otherwise be spent preparing the same events many times. Solution method: Creation of a special knowledgebase (MCDB) designed to keep event samples for the LHC experimental and phenomenological community. The knowledgebase is realized as a separate web server (http://mcdb.cern.ch). All event samples are kept on tapes at CERN. Documentation describing the events is the main content of MCDB. Users can browse the knowledgebase, read and comment on articles (documentation), and download event samples. Authors can upload new event samples, create new articles, and edit their own articles. Restrictions: The software is adapted to solve the problems described in the article, and there are no additional restrictions. Unusual features: The software provides a framework to store and document large files with a flexible authentication and authorization system. Different external storages with large capacity can be used to keep the files. The web content management system provides all of the necessary interfaces for the authors of the files, end users and administrators. Running time: Real-time operations. References: [1] The main LCG MCDB server, http://mcdb.cern.ch/. [2] P. Bartalini, L. Dudko, A. Kryukov, I.V. Selyuzhenkov, A. Sherstnev, A. Vologdin, LCG Monte-Carlo data base, hep-ph/0404241. [3] J.P. Baud, B. Couturier, C. Curran, J.D. Durand, E. Knezo, S. Occhetti ...
Robotic Bipedal Running: Increasing disturbance rejection
Karssen, J.G.D.
2013-01-01
The goal of the research presented in this thesis is to increase the understanding of the human running gait. The understanding of the human running gait is essential for the development of devices, such as prostheses and orthoses, that enable disabled people to run or that enable able people to inc
Wave Run-Up on Rubble Breakwaters
DEFF Research Database (Denmark)
Van de Walle, Bjorn; De Rouck, Julien; Troch, Peter
2005-01-01
Seven sets of data for wave run-up on a rubble mound breakwater were combined and re-analysed, with full-scale, large-scale and small-scale model test results being taken into account. The dimensionless wave run-up value Ru-2%/Hm0 was considered, where Ru-2% is the wave run-up height exceeded by...
Head injury from a bungee run.
Singh, Pankaj; Convery, Fiona; Watt, Michael; Fulton, Ailsa; McKinstry, Steven; Flannery, Thomas
2012-04-01
An adaptation of bungee jumping, 'bungee running', involves participants attempting to run as far as they can whilst connected to an elastic rope which is anchored to a fixed point. Usually considered a safe recreational activity, we report a potentially life-threatening head injury following a bungee running accident.
Running Patterns of Highly Skilled Distance Runners.
Dunetts, Michael J.; Dillman, Charles J.
The biomechanical elements inherent in the running styles of Olympic-level athletes were examined in order to obtain a range of parameter values for specific running velocities. Forty-eight athletes participated in middle and long distance running events that were filmed and later analyzed to determine the relationship between the physical…
Barefoot running survey: Evidence from the field
Directory of Open Access Journals (Sweden)
David Hryvniak
2014-06-01
Conclusion: Prior studies have found that barefoot running often changes biomechanics compared to shod running with a hypothesized relationship of decreased injuries. This paper reports the result of a survey of 509 runners. The results suggest that a large percentage of this sample of runners experienced benefits or no serious harm from transitioning to barefoot or minimal shoe running.
Running with technology: Where are we heading?
DEFF Research Database (Denmark)
Jensen, Mads Møller; Mueller, Florian 'Floyd'
2014-01-01
Running has become popular in recent years, and numerous runners utilize wearable technologies in order to improve their run training. This paper investigates the development and trends in technologies used for run training, and describes how these are changing from solely focusing...
Monte Carlo approach to turbulence
Energy Technology Data Exchange (ETDEWEB)
Dueben, P.; Homeier, D.; Muenster, G. [Muenster Univ. (Germany). Inst. fuer Theoretische Physik; Jansen, K. [DESY, Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Mesterhazy, D. [Humboldt Univ., Berlin (Germany). Inst. fuer Physik
2009-11-15
The behavior of the one-dimensional random-force-driven Burgers equation is investigated in the path integral formalism on a discrete space-time lattice. We show that by means of Monte Carlo methods one may evaluate observables, such as structure functions, as ensemble averages over different field realizations. The regularization of shock solutions to the zero-viscosity limit (Hopf-equation) eventually leads to constraints on lattice parameters required for the stability of the simulations. Insight into the formation of localized structures (shocks) and their dynamics is obtained. (orig.)
Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
NRMC - A GPU code for N-Reverse Monte Carlo modeling of fluids in confined media
Sánchez-Gil, Vicente; Noya, Eva G.; Lomba, Enrique
2017-08-01
NRMC is a parallel code for performing N-Reverse Monte Carlo modeling of fluids in confined media [V. Sánchez-Gil, E.G. Noya, E. Lomba, J. Chem. Phys. 140 (2014) 024504]. This method is an extension of the usual Reverse Monte Carlo method to obtain structural models of confined fluids compatible with experimental diffraction patterns, specifically designed to overcome the problem of slow diffusion that can appear under conditions of tight confinement. Most of the computational time in N-Reverse Monte Carlo modeling is spent in the evaluation of the structure factor for each trial configuration, a calculation that can be easily parallelized. Implementation of the structure factor evaluation in NVIDIA® CUDA so that the code can be run on GPUs leads to a speed up of up to two orders of magnitude.
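The Reverse Monte Carlo acceptance logic underlying methods like this can be sketched briefly: the "energy" is the squared misfit between the model's structure factor and the experimental one. This is a generic RMC sketch under assumed conventions (misfit scaled by an uncertainty sigma, worsening moves accepted with a Boltzmann-like probability), not NRMC's actual implementation.

```python
import math
import random

def chi2(s_model, s_target, sigma=0.01):
    """Squared misfit between model and experimental structure factors,
    scaled by an assumed uniform experimental uncertainty sigma."""
    return sum((a - b) ** 2 for a, b in zip(s_model, s_target)) / sigma ** 2

def rmc_accept(chi2_old, chi2_new, rng):
    """RMC criterion: moves that improve agreement with the data are
    always accepted; worsening moves survive with probability
    exp(-delta_chi2 / 2), which keeps the configuration from freezing
    into a single local minimum."""
    if chi2_new <= chi2_old:
        return True
    return rng.random() < math.exp(-(chi2_new - chi2_old) / 2.0)
```

The abstract's point about cost is visible here: chi2 touches every point of the structure factor on every trial move, which is exactly the part NRMC offloads to the GPU.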
Implementation of 3D Lattice Monte Carlo Simulation on a Cluster of Symmetric Multiprocessors
Institute of Scientific and Technical Information of China (English)
雷咏梅 (Lei Yongmei); 蒋英 (Jiang Ying); et al.
2002-01-01
This paper presents a new approach to parallelize 3D lattice Monte Carlo algorithms used in the numerical simulation of polymers on ZiQiang 2000, a cluster of symmetric multiprocessors (SMPs). The combined load for cell and energy calculations over the time step is balanced together to form a single spatial decomposition. Basic aspects and strategies of running Monte Carlo calculations on parallel computers are studied. Different steps involved in porting the software to a parallel architecture based on ZiQiang 2000 running under Linux and MPI are described briefly. It is found that parallelization becomes more advantageous when either the lattice is very large or the model contains many cells and chains.
Approaching Chemical Accuracy with Quantum Monte Carlo
Petruzielo, Frank R.; Toulouse, Julien; Umrigar, C. J.
2012-01-01
A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreem...
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko
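One building block common to the resampled and sequential Monte Carlo methods mentioned above is the interaction (resampling) step, which can be sketched compactly. This is a generic multinomial resampler under standard conventions, not any particular method from the book.

```python
import bisect
import random

def resample(particles, weights, rng):
    """Multinomial resampling, the interaction step of sequential Monte
    Carlo: high-weight particles are duplicated, low-weight ones die,
    and the population size stays fixed."""
    total = sum(weights)
    cum, s = [], 0.0
    for w in weights:
        s += w / total
        cum.append(s)
    # bisect_left finds the first cumulative weight >= u; the min() guards
    # against floating-point round-off at the top of the cumulative sum.
    return [particles[min(bisect.bisect_left(cum, rng.random()), len(cum) - 1)]
            for _ in range(len(particles))]

rng = random.Random(4)
survivors = resample(["a", "b", "c"], [0.0, 0.0, 1.0], rng)
```

After resampling, all weights are reset to uniform and the particles evolve independently until the next interaction, which is what makes these algorithms well suited to parallel computation.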
Scalable Metropolis Monte Carlo for simulation of hard shapes
Anderson, Joshua A.; Eric Irrgang, M.; Glotzer, Sharon C.
2016-07-01
We design and implement a scalable hard particle Monte Carlo simulation toolkit (HPMC), and release it open source as part of HOOMD-blue. HPMC runs in parallel on many CPUs and many GPUs using domain decomposition. We employ BVH trees instead of cell lists on the CPU for fast performance, especially with large particle size disparity, and optimize inner loops with SIMD vector intrinsics on the CPU. Our GPU kernel proposes many trial moves in parallel on a checkerboard and uses a block-level queue to redistribute work among threads and avoid divergence. HPMC supports a wide variety of shape classes, including spheres/disks, unions of spheres, convex polygons, convex spheropolygons, concave polygons, ellipsoids/ellipses, convex polyhedra, convex spheropolyhedra, spheres cut by planes, and concave polyhedra. NVT and NPT ensembles can be run in 2D or 3D triclinic boxes. Additional integration schemes permit Frenkel-Ladd free energy computations and implicit depletant simulations. In a benchmark system of a fluid of 4096 pentagons, HPMC performs 10 million sweeps in 10 min on 96 CPU cores on XSEDE Comet. The same simulation would take 7.6 h in serial. HPMC also scales to large system sizes, and the same benchmark with 16.8 million particles runs in 1.4 h on 2048 GPUs on OLCF Titan.
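The core move loop of hard particle Monte Carlo is simple enough to sketch: with hard interactions there is no energy scale, so a trial displacement is accepted exactly when it creates no overlap. This is a serial 2D hard-disk toy under illustrative parameters, not HPMC's parallel checkerboard scheme.

```python
import random

def overlaps(pos, i, trial, radius, box):
    """True if disk i placed at `trial` overlaps any other disk
    (minimum-image convention in a periodic square box)."""
    for j, (xj, yj) in enumerate(pos):
        if j == i:
            continue
        dx = trial[0] - xj
        dy = trial[1] - yj
        dx -= box * round(dx / box)
        dy -= box * round(dy / box)
        if dx * dx + dy * dy < (2 * radius) ** 2:
            return True
    return False

def sweep(pos, radius, box, delta, rng):
    """One Metropolis sweep: each disk attempts one uniform displacement,
    accepted iff it creates no overlap."""
    accepted = 0
    for i, (x, y) in enumerate(pos):
        trial = ((x + rng.uniform(-delta, delta)) % box,
                 (y + rng.uniform(-delta, delta)) % box)
        if not overlaps(pos, i, trial, radius, box):
            pos[i] = trial
            accepted += 1
    return accepted

rng = random.Random(2)
box, radius = 10.0, 0.5
# Dilute start: 16 disks on a grid in a 10 x 10 periodic box.
pos = [(1.0 + 2.5 * (k % 4), 1.0 + 2.5 * (k // 4)) for k in range(16)]
accepted = sum(sweep(pos, radius, box, 0.3, rng) for _ in range(50))
```

The naive all-pairs overlap check here is O(N) per move; this is precisely the cost that HPMC's BVH trees and cell lists reduce to near O(1).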
Biomechanics and analysis of running gait.
Dugan, Sheila A; Bhat, Krishna P
2005-08-01
Physical activity, including running, is important to general health by way of prevention of chronic illnesses and their precursors. To keep runners healthy, it is paramount that one has sound knowledge of the biomechanics of running and assessment of running gait. Moreover, improving performance in competitive runners is based on sound training and rehabilitation practices that are rooted firmly in biomechanical principles. This article summarizes the biomechanics of running and the means with which one can evaluate running gait. The gait assessment techniques for collecting and analyzing kinetic and kinematic data can provide insights into injury prevention and treatment and performance enhancement.
Are multiple runs better than one?
Energy Technology Data Exchange (ETDEWEB)
Cantu-Paz, E
2001-01-04
This paper investigates whether it is better to use a certain constant amount of computational resources in a single run with a large population, or in multiple runs with smaller populations. The paper presents the primary tradeoffs involved in this problem and identifies the conditions under which there is an advantage to use multiple small runs. The paper uses an existing model that relates the quality of the solutions reached by a GA with its population size. The results suggest that in most cases a single run with the largest population possible reaches a better solution than multiple isolated runs. The findings are validated with experiments on functions of varying difficulty.
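The question above can be posed as a toy experiment: a minimal generational GA on OneMax (tournament selection, uniform crossover, single-bit mutation), comparing one run with a large population against several runs with small populations at the same total budget. This is an illustrative sketch, not the paper's model; all parameter values are arbitrary.

```python
import random

def onemax(bits):
    """Fitness: number of 1-bits; the optimum for 40 bits is 40."""
    return sum(bits)

def ga(pop_size, n_bits=40, gens=60, rng=None):
    """Minimal generational GA: tournament selection of size 2,
    uniform crossover, occasional single-bit mutation.
    Returns the best fitness in the final population."""
    rng = rng or random.Random()
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a = max(rng.sample(pop, 2), key=onemax)
            b = max(rng.sample(pop, 2), key=onemax)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            if rng.random() < 0.1:
                k = rng.randrange(n_bits)
                child[k] ^= 1
            nxt.append(child)
        pop = nxt
    return max(map(onemax, pop))

rng = random.Random(3)
single = ga(pop_size=60, rng=rng)                        # one big run
multi = max(ga(pop_size=15, rng=rng) for _ in range(4))  # same budget, 4 small runs
```

Consistent with the paper's conclusion, on easy unimodal problems like OneMax the single large run typically matches or beats the best of the isolated small runs, because the small populations lose diversity before converging.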
Monte Carlo Treatment Planning for Advanced Radiotherapy
DEFF Research Database (Denmark)
Cronholm, Rickard
and validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii), and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc.) to a Monte Carlo input file (iii). A protocol ... previous algorithms, since it uses delineations of structures in order to include and/or exclude certain media in various anatomical regions. This method has the potential to reduce anatomically irrelevant media assignment. In-house MATLAB scripts translating the treatment plan parameters to Monte Carlo ...
1-D EQUILIBRIUM DISCRETE DIFFUSION MONTE CARLO
Energy Technology Data Exchange (ETDEWEB)
T. EVANS; ET AL
2000-08-01
We present a new hybrid Monte Carlo method for 1-D equilibrium diffusion problems in which the radiation field coexists with matter in local thermodynamic equilibrium. This method, the Equilibrium Discrete Diffusion Monte Carlo (EqDDMC) method, combines Monte Carlo particles with spatially discrete diffusion solutions. We verify the EqDDMC method with computational results from three slab problems. The EqDDMC method represents an incremental step toward applying this hybrid methodology to non-equilibrium diffusion, where it could be simultaneously coupled to Monte Carlo transport.
The research program of the Liquid Scintillation Detector (LSD) in the Mont Blanc Laboratory
Dadykin, V. L.; Yakushev, V. F.; Korchagin, P. V.; Korchagin, V. B.; Malgin, A. S.; Ryassny, F. G.; Ryazhskaya, O. G.; Talochkin, V. P.; Zatsepin, G. T.; Badino, G.
1985-01-01
A massive (90 tons) liquid scintillation detector (LSD) has been running since October 1984 in the Mont Blanc Laboratory at a depth of 5,200 hg/sq cm of standard rock. The research program of the experiment covers a variety of topics in particle physics and astrophysics. The performance of the detector and the main fields of research are presented, and the preliminary results are discussed.
GPUMCD: a new GPU-oriented Monte Carlo dose calculation platform
Hissoiny, Sami; Ozell, Benoît; Després, Philippe
2011-01-01
Purpose: Monte Carlo methods are considered the gold standard for dosimetric computations in radiotherapy. Their execution time is, however, still an obstacle to the routine use of Monte Carlo packages in a clinical setting. To address this problem, a completely new Monte Carlo dose calculation package for voxelized geometries, designed from the ground up for the GPU, is proposed: GPUMCD. Method: GPUMCD implements a coupled photon-electron Monte Carlo simulation for energies in the range 0.01 MeV to 20 MeV. An analogue simulation of photon interactions is used and a Class II condensed history method has been implemented for the simulation of electrons. A new GPU random number generator, some divergence reduction methods as well as other optimization strategies are also described. GPUMCD was run on an NVIDIA GTX480 while single-threaded implementations of EGSnrc and DPM were run on an Intel Core i7 860. Results: Dosimetric results obtained with GPUMCD were compared to EGSnrc. In all but one test case, 98% o...
Error in Monte Carlo, quasi-error in Quasi-Monte Carlo
Kleiss, R. H. P.; Lazopoulos, A.
2006-01-01
While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction o...
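The contrast between the two error behaviours is easy to demonstrate numerically. The sketch below assumes a 2-D Halton sequence as the quasi-random point set and a smooth illustrative integrand; it is not the error estimator the authors propose.

```python
import random

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in `base`."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def f(x, y):
    return x * y  # true integral over the unit square is 0.25

n = 4096
# Quasi-Monte Carlo: 2-D Halton points (coprime bases 2 and 3)
qmc = sum(f(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)) / n
# Plain Monte Carlo: pseudorandom points, error shrinking as 1/sqrt(n)
rng = random.Random(0)
mc = sum(f(rng.random(), rng.random()) for _ in range(n)) / n
print(abs(qmc - 0.25), abs(mc - 0.25))
```

For smooth integrands the quasi-random error typically shrinks much faster than the 1/sqrt(n) Monte Carlo rate, but, as the abstract notes, the independence-based variance estimator says nothing useful about it.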
What we can learn about running from barefoot running: an evolutionary medical perspective.
Lieberman, Daniel E
2012-04-01
Barefoot running, which was how people ran for millions of years, provides an opportunity to study how natural selection adapted the human body to run. Because humans evolved to run barefoot, a barefoot running style that minimizes impact peaks and provides increased proprioception and foot strength is hypothesized to help avoid injury, regardless of whether one is wearing shoes.
A Globally Convergent Algorithm for the Run-to-Run Control of Systems with Sector Nonlinearities
François, Grégory; Srinivasan, Balasubrahmanya; Bonvin, Dominique
2011-01-01
Run-to-run control is a technique that exploits the repetitive nature of processes to iteratively adjust the inputs and drive the run-end outputs to their reference values. It can be used to control both static and finite-time dynamic systems. Although the run-end outputs of dynamic systems result from the integration of process dynamics during the run, the relationship between the input parameters p (fixed at the beginning of the run) and the run-end outputs z (available at the end of t...
Daytime Running Lights. Public Consultation
Energy Technology Data Exchange (ETDEWEB)
NONE
2009-12-15
The Road Safety Authority is considering the policy options available to promote the use of Daytime Running Lights (DRL), including the possibility of mandating the use of DRL on all vehicles. An EC Directive would make DRL mandatory for new vehicles from 2011 onwards and by 2024 it is predicted that due to the natural replacement of the national fleet, almost all vehicles would be equipped with DRL. The RSA is inviting views on introducing DRL measures earlier, whereby all road vehicles would be required to use either dipped head lights during hours of daylight or dedicated DRL from next year onwards. The use of DRL has been found to enhance the visibility of vehicles, thereby increasing road safety by reducing the number and severity of collisions. This paper explores the benefits of DRL and the implications for all road users including pedestrians, cyclists and motorcyclists. In order to ensure a comprehensive consideration of all the issues, the Road Safety Authority is seeking the views and advice of interested parties.
Padulo, Johnny; Powell, Douglas; Milia, Raffaele; Ardigò, Luca Paolo
2013-01-01
The biomechanical management of the bioenergetics of runners when running uphill was investigated. Several metabolic and mechanical variables were studied simultaneously to shed light on the locomotor strategy used by humans for effective locomotion. The studied variables were: heart rate, heart rate variability, oxygen uptake, blood lactate, metabolic cost, kinematics, ground reaction force and muscular activity. 18 high-level competitive male runners ran at 70% VO2max on different uphill slope conditions: 0%, 2% and 7%. Modifications were significant in almost all variables studied, and were more pronounced with increasing incline. Step frequency/length and ground reaction force are adjusted to cope with both the task of uphill progression and the available (limited) metabolic power. From 0% to 7% slope, step frequency, ground reaction force and metabolic cost increased concurrently by 4%, 12% and 53%, respectively (with a 4% step length decrease as well). It is hypothesised that this biomechanical management is enabled by environment-body communication performed by means of specific muscular activity. PMID:23874850
CERN Bulletin
2010-01-01
LHCf, one of the three smaller experiments at the LHC, has completed its first run. The detectors were removed last week and the analysis of data is continuing. The first results will be ready by the end of the year. One of the two LHCf detectors during the removal operations inside the LHC tunnel. LHCf is made up of two independent detectors located in the tunnel 140 m either side of the ATLAS collision point. The experiment studies the secondary particles created during the head-on collisions in the LHC because they are similar to those created in a cosmic ray shower produced when a cosmic particle hits the Earth’s atmosphere. The focus of the experiment is to compare the various shower models used to estimate the primary energy of ultra-high-energy cosmic rays. The energy of proton-proton collisions at the LHC will be equivalent to a cosmic ray of 10^17 eV hitting the atmosphere, very close to the highest energies observed in the sky. “We have now completed the fir...
Diphoton Excess and Running Couplings
Bae, Kyu Jung; Hamaguchi, Koichi; Moroi, Takeo
2016-01-01
The recently observed diphoton excess at the LHC may suggest the existence of a singlet (pseudo-) scalar particle with a mass of 750 GeV which couples to gluons and photons. Assuming that the couplings to gluons and photons originate from loops of fermions and/or scalars charged under the Standard Model gauge groups, we show that there is a model-independent upper bound on the cross section $\sigma(pp\to S\to \gamma\gamma)$ as a function of the cutoff scale $\Lambda$ and the masses of the fermions and scalars in the loop. Such a bound comes from the fact that the contribution of each particle to the diphoton event amplitude is proportional to its contribution to the one-loop $\beta$ functions of the gauge couplings. We also investigate the perturbativity of running Yukawa couplings in models with fermion loops, and show the upper bounds on $\sigma(pp\to S\to \gamma\gamma)$ for explicit models.
Impact Accelerations of Barefoot and Shod Running.
Thompson, M; Seegmiller, J; McGowan, C P
2016-05-01
During the ground contact phase of running, the body's mass is rapidly decelerated resulting in forces that propagate through the musculoskeletal system. The repetitive attenuation of these impact forces is thought to contribute to overuse injuries. Modern running shoes are designed to reduce impact forces, with the goal to minimize running related overuse injuries. Additionally, the fore/mid foot strike pattern that is adopted by most individuals when running barefoot may reduce impact force transmission. The aim of the present study was to compare the effects of the barefoot running form (fore/mid foot strike & decreased stride length) and running shoes on running kinetics and impact accelerations. 10 healthy, physically active, heel strike runners ran in 3 conditions: shod, barefoot and barefoot while heel striking, during which 3-dimensional motion analysis, ground reaction force and accelerometer data were collected. Shod running was associated with increased ground reaction force and impact peak magnitudes, but decreased impact accelerations, suggesting that the midsole of running shoes helps to attenuate impact forces. Barefoot running exhibited a similar decrease in impact accelerations, as well as decreased impact peak magnitude, which appears to be due to a decrease in stride length and/or a more plantarflexed position at ground contact.
Determinants Of Savings Behavior In Pakistan: Long Run - Short Run Association And Causality
Ahmad Fawad
2015-01-01
The existing studies on private savings have mostly investigated the long run and short run association of different variables with private savings, whereas no known study has investigated both long run and short run causality of variables against private savings by using data of Pakistan. The current study used time series data of Pakistan over the period of 1972 to 2012 and employed a long run cointegration test, a first normalized equation for long run association, and a vector error correction model fo...
Langevin Monte Carlo filtering for target tracking
Iglesias Garcia, Fernando; Bocquel, Melanie; Driessen, Hans
2015-01-01
This paper introduces the Langevin Monte Carlo Filter (LMCF), a particle filter with a Markov chain Monte Carlo algorithm which draws proposals by simulating Hamiltonian dynamics. This approach is well suited to non-linear filtering problems in high dimensional state spaces where the bootstrap filte
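The Hamiltonian-dynamics proposal underlying such a filter can be sketched in isolation. The following is a minimal 1-D Hamiltonian Monte Carlo transition for a standard normal target, with illustrative step size and trajectory length; it is not the LMCF itself, whose proposals act on a filtering distribution in high dimension.

```python
import math
import random

rng = random.Random(42)

def neg_log_density(x):
    return 0.5 * x * x  # standard normal target, up to a constant

def grad(x):
    return x  # gradient of the negative log-density

def hmc_step(x, step=0.2, n_leapfrog=10):
    """One Hamiltonian Monte Carlo transition for a 1-D target."""
    p = rng.gauss(0.0, 1.0)                 # sample auxiliary momentum
    h0 = neg_log_density(x) + 0.5 * p * p   # initial Hamiltonian
    q, m = x, p
    m -= 0.5 * step * grad(q)               # leapfrog: first half step
    for _ in range(n_leapfrog - 1):
        q += step * m
        m -= step * grad(q)
    q += step * m
    m -= 0.5 * step * grad(q)               # leapfrog: final half step
    h1 = neg_log_density(q) + 0.5 * m * m
    # Metropolis correction for the discretization error
    if rng.random() < math.exp(min(0.0, h0 - h1)):
        return q
    return x

x, samples = 3.0, []
for i in range(5000):
    x = hmc_step(x)
    if i >= 1000:          # discard burn-in
        samples.append(x)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))
```

Because the proposal follows (approximate) Hamiltonian flow, it can move far across the state space while keeping a high acceptance rate, which is what makes it attractive in the high-dimensional settings the paper targets.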
An introduction to Monte Carlo methods
Walter, J. -C.; Barkema, G. T.
2015-01-01
Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo sim
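As a minimal illustration of these principles, the sketch below uses Metropolis sampling to estimate the mean energy of a particle in a harmonic potential at temperature kT = 1, for which equipartition predicts <E> = kT/2; the move size and sample counts are illustrative.

```python
import math
import random

rng = random.Random(0)
kT = 1.0

def energy(x):
    return 0.5 * x * x  # harmonic potential; equipartition gives <E> = kT/2

x, e_sum, n = 0.0, 0.0, 0
for step in range(200_000):
    x_new = x + rng.uniform(-1.0, 1.0)      # propose a local move
    dE = energy(x_new) - energy(x)
    # Metropolis acceptance: always accept downhill, uphill with exp(-dE/kT)
    if dE <= 0 or rng.random() < math.exp(-dE / kT):
        x = x_new
    if step >= 20_000:                      # discard burn-in
        e_sum += energy(x)
        n += 1
print(round(e_sum / n, 2))
```

The chain generates a representative ensemble of configurations from the Boltzmann distribution, so the thermodynamic average is obtained without solving the system analytically.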
LHC Report: Run 1 – the final flurry
Mike Lamont for the LHC team
2013-01-01
The proton-lead run ended early on the morning of Sunday, 10 February. The run can be considered an unqualified success and a testament to the painstaking preparation by the ion team. It was followed by a few short days of proton-proton collisions at intermediate energy, after which the final physics beams of what is now being called Run 1 (2009 – 2013) were dumped at 07:24 on Thursday, 14 February. The five weeks of operations originally scheduled for 2013 had two main objectives: the delivery of 30 inverse nanobarns with proton-lead collisions; and around 5 inverse picobarns of proton-proton collisions at a beam energy of 1.38 TeV. Both of these objectives were met. As described in previous reports, the proton-lead run has gone remarkably well for a completely novel operational mode. However, there were some issues following the switch of beam direction on Friday, 1 February. In this exercise the ions become the clockwise beam and the experiments received lead-proton instead of ...
Brown, James W., Comp; And Others
The 1967 Monte Corona School Library Workshop for Leadership Personnel, seventh in a series of summer workshops, focuses on school library programs and services; particularly as these relate to a cross-media approach to curriculum implementation. This workshop is designed primarily for school library and audio-visual education leadership personnel…
Replica exchange Monte Carlo applied to hard spheres.
Odriozola, Gerardo
2009-10-14
In this work a replica exchange Monte Carlo scheme which considers an extended isobaric-isothermal ensemble with respect to pressure is applied to study hard spheres (HSs). The idea behind the proposal is to expand volume instead of increasing temperature to let crowded systems characterized by dominant repulsive interactions unblock, and so to produce sampling from disjoint configurations. The method produces, in a single parallel run, the complete HS equation of state. Thus, the first-order fluid-solid transition is captured. The obtained results agree well with previous calculations. This approach seems particularly useful for treating purely entropy-driven systems such as hard-body and nonadditive hard mixtures, where temperature plays a trivial role.
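The exchange move at the heart of the scheme can be sketched with the more familiar temperature ladder (the paper instead extends the ensemble in pressure, but the unblocking mechanism is the same). The double-well potential and all parameters below are illustrative.

```python
import math
import random

rng = random.Random(7)

def energy(x):
    return (x * x - 1.0) ** 2  # double well with minima at x = +/-1

temps = [0.05, 0.2, 0.8]       # one replica per temperature
xs = [1.0 for _ in temps]      # all replicas start in the right-hand well
visited_left = False

for sweep in range(20_000):
    # ordinary Metropolis move in every replica
    for i, T in enumerate(temps):
        x_new = xs[i] + rng.uniform(-0.3, 0.3)
        dE = energy(x_new) - energy(xs[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            xs[i] = x_new
    # attempt to exchange a random neighbouring pair of replicas;
    # accept with min(1, exp[(1/Ti - 1/Tj)(Ei - Ej)])
    i = rng.randrange(len(temps) - 1)
    d = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (energy(xs[i + 1]) - energy(xs[i]))
    if d <= 0 or rng.random() < math.exp(-d):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]
    if xs[0] < -0.5:
        visited_left = True    # the cold replica crossed the barrier
print(visited_left)
```

At T = 0.05 alone the cold replica would essentially never cross the barrier; configurations that crossed in the hot replica migrate down the ladder through accepted exchanges, which is the same unblocking effect the pressure-expanded ensemble achieves for crowded hard-sphere systems.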
Improved version of the PHOBOS Glauber Monte Carlo
Loizides, C; Steinberg, P
2014-01-01
Glauber models are used to calculate geometric quantities in the initial state of heavy ion collisions, such as impact parameter, number of participating nucleons and initial eccentricity. Experimental heavy-ion collaborations, in particular at RHIC and the LHC, use Glauber Model calculations for various geometric observables. In this document, we describe the assumptions inherent to the approach, and provide an updated implementation (v2) of the Monte Carlo based Glauber Model calculation, which originally was used by the PHOBOS collaboration. The main improvements w.r.t. the earlier version (arXiv:0805.4411) are the inclusion of tritium, Helium-3, and Uranium, as well as the treatment of deformed nuclei and Glauber-Gribov fluctuations of the proton in p+A collisions. A users' guide (updated to reflect changes in v2) is provided for running various calculations.
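A stripped-down version of the sampling logic can convey the idea. The sketch below is not the PHOBOS code: it replaces the Woods-Saxon profile with a uniform hard-sphere nucleus and uses an illustrative inelastic nucleon-nucleon cross-section to count participants at a given impact parameter.

```python
import math
import random

rng = random.Random(5)

A = 208                      # mass number (lead)
R = 1.2 * A ** (1 / 3)       # nuclear radius in fm (hard-sphere toy model,
                             # not the Woods-Saxon profile of the real code)
sigma_nn = 6.4               # illustrative inelastic NN cross-section, fm^2
d_max2 = sigma_nn / math.pi  # squared max transverse distance for a collision

def sample_nucleus():
    """Nucleon positions drawn uniformly from a sphere of radius R."""
    nucleons = []
    while len(nucleons) < A:
        x, y, z = (rng.uniform(-R, R) for _ in range(3))
        if x * x + y * y + z * z <= R * R:
            nucleons.append((x, y))  # only transverse coordinates matter
    return nucleons

def npart(b):
    """Number of participating nucleons in one event at impact parameter b."""
    na = sample_nucleus()
    nb = [(x + b, y) for x, y in sample_nucleus()]
    hit_a = sum(any((ax - bx) ** 2 + (ay - by) ** 2 < d_max2 for bx, by in nb)
                for ax, ay in na)
    hit_b = sum(any((ax - bx) ** 2 + (ay - by) ** 2 < d_max2 for ax, ay in na)
                for bx, by in nb)
    return hit_a + hit_b

central = npart(0.0)      # head-on collision: nearly all nucleons participate
peripheral = npart(12.0)  # grazing collision: only the overlap zone
print(central, peripheral)
```

Averaging such event-by-event counts over many sampled impact parameters is how the real implementation maps geometry onto observables like N_part and eccentricity.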
Dynamic Partitioning of GATE Monte-Carlo Simulations on EGEE
Camarasu-Pop, S; Benoit-Cattin, Hugues; Glatard, Tristan; Sarrut, David; Camarasu-Pop, Sorina
2010-01-01
The EGEE Grid offers the necessary infrastructure and resources for reducing the running time of particle-tracking Monte-Carlo applications like GATE. However, efforts are required to achieve reliable and efficient execution and to provide execution frameworks to end users. This paper presents results obtained by porting the GATE software to the EGEE Grid, our ultimate goal being to provide reliable, user-friendly and fast execution of GATE to radiation therapy researchers. To address these requirements, we propose a new parallelization scheme based on dynamic partitioning, and its implementation in two different frameworks using pilot jobs and workflows. Results show that pilot jobs bring a strong improvement w.r.t. regular gLite submission, that the proposed dynamic partitioning algorithm further reduces execution time by a factor of two, and that the genericity and user-friendliness offered by the workflow implementation do not introduce significant overhead.
The MC21 Monte Carlo Transport Code
Energy Technology Data Exchange (ETDEWEB)
Sutton TM, Donovan TJ, Trumbull TH, Dobreff PS, Caro E, Griesheimer DP, Tyburski LJ, Carpenter DC, Joo H
2007-01-09
MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or "tool of last resort" and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities.
Quantum Monte Carlo: Faster, More Reliable, And More Accurate
Anderson, Amos Gerald
2010-06-01
combination of Generalized Valence Bond wavefunctions, improved correlation functions, and stabilized weighting techniques for calculations run on graphics cards, represents a new way for using Quantum Monte Carlo to study arbitrarily sized molecules.
Xenon instability study of large core Monte Carlo calculations
Energy Technology Data Exchange (ETDEWEB)
Bogdanova, E.V. [National Research Nuclear University 'MEPhI', Moscow (Russian Federation); Gorodkov, S.S.
2016-09-15
One of the goals of neutronic calculations of large cores may be a self-consistent distribution of equilibrium xenon through the reactor core. In deterministic calculations such self-consistency is relatively simply achieved with the help of additional outer iterations over xenon, which can increase the solution run time several times. But in stochastic calculations of large cores such an increase is utterly undesirable, since even without these outer iterations the calculation demands modeling billions of histories, which in the case of a complicated large core may take about a day of work on 100 processors. In addition, the unavoidable statistical uncertainty plays the role of a transient process that excites xenon oscillations. In this work the rise of such oscillations, and the way of overcoming them with the help of a hybrid stochastic/deterministic calculation, is studied. It is proposed first to make a single static Monte Carlo calculation of the given core and to obtain multi-group mesh cell characteristics for future use in an operative code. This code will evaluate the xenon distribution through the core, which will be equilibrium for the deterministic solution and substantially close to the equilibrium Monte Carlo solution that would otherwise be paid for with an enormous computing cost.
A pure-sampling quantum Monte Carlo algorithm.
Ospadov, Egor; Rothstein, Stuart M
2015-01-14
The objective of pure-sampling quantum Monte Carlo is to calculate physical properties that are independent of the importance sampling function being employed in the calculation, save for the mismatch of its nodal hypersurface with that of the exact wave function. To achieve this objective, we report a pure-sampling algorithm that combines features of forward walking methods of pure-sampling and reptation quantum Monte Carlo (RQMC). The new algorithm accurately samples properties from the mixed and pure distributions simultaneously in runs performed at a single set of time-steps, over which extrapolation to zero time-step is performed. In a detailed comparison, we found RQMC to be less efficient. It requires different sets of time-steps to accurately determine the energy and other properties, such as the dipole moment. We implement our algorithm by systematically increasing an algorithmic parameter until the properties converge to statistically equivalent values. As a proof in principle, we calculated the fixed-node energy, static α polarizability, and other one-electron expectation values for the ground-states of LiH and water molecules. These quantities are free from importance sampling bias, population control bias, time-step bias, extrapolation-model bias, and the finite-field approximation. We found excellent agreement with the accepted values for the energy and a variety of other properties for those systems.
Monte Carlo Criticality Methods and Analysis Capabilities in SCALE
Energy Technology Data Exchange (ETDEWEB)
Goluoglu, Sedat [ORNL; Petrie Jr, Lester M [ORNL; Dunn, Michael E [ORNL; Hollenbach, Daniel F [ORNL; Rearden, Bradley T [ORNL
2011-01-01
This paper describes the Monte Carlo codes KENO V.a and KENO-VI in SCALE that are primarily used to calculate multiplication factors and flux distributions of fissile systems. Both codes allow explicit geometric representation of the target systems and are used internationally for safety analyses involving fissile materials. KENO V.a has limiting geometric rules such as no intersections and no rotations. These limitations make KENO V.a execute very efficiently and run very fast. On the other hand, KENO-VI allows very complex geometric modeling. Both KENO codes can utilize either continuous-energy or multigroup cross-section data and have been thoroughly verified and validated with ENDF libraries through ENDF/B-VII.0, which has been first distributed with SCALE 6. Development of the Monte Carlo solution technique and solution methodology as applied in both KENO codes is explained in this paper. Available options and proper application of the options and techniques are also discussed. Finally, performance of the codes is demonstrated using published benchmark problems.
Energy Technology Data Exchange (ETDEWEB)
Mukhopadhyay, Nitai D. [Department of Biostatistics, Virginia Commonwealth University, Richmond, VA 23298 (United States); Sampson, Andrew J. [Department of Radiation Oncology, Virginia Commonwealth University, Richmond, VA 23298 (United States); Deniz, Daniel; Alm Carlsson, Gudrun [Department of Radiation Physics, Faculty of Health Sciences, Linkoeping University, SE 581 85 (Sweden); Williamson, Jeffrey [Department of Radiation Oncology, Virginia Commonwealth University, Richmond, VA 23298 (United States); Malusek, Alexandr, E-mail: malusek@ujf.cas.cz [Department of Radiation Physics, Faculty of Health Sciences, Linkoeping University, SE 581 85 (Sweden); Department of Radiation Dosimetry, Nuclear Physics Institute AS CR v.v.i., Na Truhlarce 39/64, 180 86 Prague (Czech Republic)
2012-01-15
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap based algorithm was used to simulate the probability distribution of the efficiency gain estimates and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed.
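The bootstrap construction of the shortest 95% interval can be sketched as follows, using synthetic per-history scores in place of the paper's brachytherapy tallies (the data, sample sizes, and the gain definition as a variance ratio at equal sample counts are all illustrative assumptions).

```python
import random

rng = random.Random(123)

# Synthetic per-history scores standing in for a correlated-sampling run
# and a conventional run (illustrative data, not from the paper)
corr = [rng.gauss(1.0, 0.1) for _ in range(400)]
conv = [rng.gauss(1.0, 0.3) for _ in range(400)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def gain(a, b):
    # efficiency gain of method a over method b at equal sample counts:
    # the ratio of their per-history variances
    return variance(b) / variance(a)

# Bootstrap the sampling distribution of the gain estimate
boots = sorted(
    gain([rng.choice(corr) for _ in corr], [rng.choice(conv) for _ in conv])
    for _ in range(2000)
)

# Shortest interval containing 95% of the bootstrap replicates
k = int(0.95 * len(boots))
best = min(range(len(boots) - k), key=lambda i: boots[i + k] - boots[i])
ci = (boots[best], boots[best + k])
print(round(ci[0], 2), round(ci[1], 2))
```

Scanning for the narrowest window rather than taking the 2.5%/97.5% quantiles is what makes the interval "shortest", which matters precisely when the distribution of the estimate is skewed, as the paper reports.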
Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods
NeuroData; Paninski, L
2015-01-01
Vogelstein JT, Paninski L. Spike Inference from Calcium Imaging using Sequential Monte Carlo Methods. Statistical and Applied Mathematical Sciences Institute (SAMSI) Program on Sequential Monte Carlo Methods, 2008
The Effect of Training in Minimalist Running Shoes on Running Economy.
Ridge, Sarah T; Standifird, Tyler; Rivera, Jessica; Johnson, A Wayne; Mitchell, Ulrike; Hunter, Iain
2015-09-01
The purpose of this study was to examine the effect of minimalist running shoes on oxygen uptake during running before and after a 10-week transition from traditional to minimalist running shoes. Twenty-five recreational runners (no previous experience in minimalist running shoes) participated in submaximal VO2 testing at a self-selected pace while wearing traditional and minimalist running shoes. Ten of the 25 runners gradually transitioned to minimalist running shoes over 10 weeks (experimental group), while the other 15 maintained their typical training regimen (control group). All participants repeated submaximal VO2 testing at the end of 10 weeks. Testing included a 3 minute warm-up, 3 minutes of running in the first pair of shoes, and 3 minutes of running in the second pair of shoes. Shoe order was randomized. Average oxygen uptake was calculated during the last minute of running in each condition. The average change from pre- to post-training for the control group during testing in traditional and minimalist shoes was an improvement of 3.1 ± 15.2% and 2.8 ± 16.2%, respectively. The average change from pre- to post-training for the experimental group during testing in traditional and minimalist shoes was an improvement of 8.4 ± 7.2% and 10.4 ± 6.9%, respectively. Data were analyzed using a 2-way repeated measures ANOVA. There were no significant interaction effects, but the overall improvement in running economy across time (6.15%) was significant (p = 0.015). Running in minimalist running shoes improves running economy in experienced, traditionally shod runners, but not significantly more than when running in traditional running shoes. Improvement in running economy in both groups, regardless of shoe type, may have been due to compliance with training over the 10-week study period and/or familiarity with testing procedures. Key points: Running in minimalist footwear did not result in a change in running economy compared to running in traditional footwear.
The macro response Monte Carlo method for electron transport
Energy Technology Data Exchange (ETDEWEB)
Svatos, M M
1998-09-01
The main goal of this thesis was to prove the feasibility of basing electron depth dose calculations in a phantom on first-principles single scatter physics, in an amount of time that is equal to or better than current electron Monte Carlo methods. The Macro Response Monte Carlo (MRMC) method achieves run times that are on the order of conventional electron transport methods such as condensed history, with the potential to be much faster. This is possible because MRMC is a Local-to-Global method, meaning the problem is broken down into two separate transport calculations. The first stage is a local, in this case single scatter, calculation, which generates probability distribution functions (PDFs) to describe the electron's energy, position and trajectory after leaving the local geometry, a small sphere or "kugel". A number of local kugel calculations were run for calcium and carbon, creating a library of kugel data sets over a range of incident energies (0.25 MeV - 8 MeV) and sizes (0.025 cm to 0.1 cm in radius). The second transport stage is a global calculation, where steps that conform to the size of the kugels in the library are taken through the global geometry. For each step, the appropriate PDFs from the MRMC library are sampled to determine the electron's new energy, position and trajectory. The electron is immediately advanced to the end of the step and then chooses another kugel to sample, which continues until transport is completed. The MRMC global stepping code was benchmarked as a series of subroutines inside of the Peregrine Monte Carlo code. It was compared to Peregrine's class II condensed history electron transport package, EGS4, and MCNP for depth dose in simple phantoms having density inhomogeneities. Since the kugels completed in the library were of relatively small size, the zoning of the phantoms was scaled down from a clinical size, so that the energy deposition algorithms for spreading dose across 5-10 zones per kugel could
Energy Technology Data Exchange (ETDEWEB)
Hong, Tianzhen; Buhl, Fred; Haves, Philip
2008-09-20
EnergyPlus is a new-generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations, integrating building components in sub-hourly time steps. However, EnergyPlus runs much slower than the current generation of simulation programs, which has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify key issues and challenges of speeding up EnergyPlus: studying the historical trends of EnergyPlus run time based on the advancement of computers and code improvements to EnergyPlus, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most run time. This paper provides recommendations to improve EnergyPlus run time from the modeler's perspective and on adequate computing platforms. Suggestions of software code and architecture changes to improve EnergyPlus run time based on the code profiling results are also discussed.
Training errors and running related injuries
DEFF Research Database (Denmark)
Nielsen, Rasmus Østergaard; Buist, Ida; Sørensen, Henrik
2012-01-01
The purpose of this systematic review was to examine the link between training characteristics (volume, duration, frequency, and intensity) and running related injuries.
Minimum Wage Effects in the Longer Run
Neumark, David; Nizalova, Olena
2007-01-01
Exposure to minimum wages at young ages could lead to adverse longer-run effects via decreased labor market experience and tenure, and diminished education and training, while beneficial longer-run effects could arise if minimum wages increase skill acquisition. Evidence suggests that as individuals reach their late 20s, they earn less the longer…
Impact of Running Away on Girls' Pregnancy
Thrane, Lisa E.; Chen, Xiaojin
2012-01-01
This study assessed the impact of running away on pregnancy in the subsequent year among U.S. adolescents. We also investigated interactions between running away and sexual assault, romance, and school disengagement. Pregnancy among females between 11 and 17 years (n = 6100) was examined utilizing the Longitudinal Study of Adolescent Health (Add…
Running biomechanics: shorter heels, better economy.
Scholz, M N; Bobbert, M F; van Soest, A J; Clark, J R; van Heerden, J
2008-10-01
Better running economy (i.e. a lower rate of energy consumption at a given speed) is correlated with superior distance running performance. There is substantial variation in running economy, even among elite runners. This variation might be due to variation in the storage and reutilization of elastic energy in tendons. Using a simple musculoskeletal model, it was predicted that the amount of energy stored in a tendon during a given movement depends more critically on moment arm than on mechanical properties of the tendon, with the amount of stored energy increasing as the moment arm gets smaller. Assuming a link between elastic energy reutilization and overall metabolic cost of running, a smaller moment arm should therefore be associated with superior running economy. This prediction was confirmed experimentally in a group of 15 highly trained runners. The moment arm of the Achilles tendon was determined from standardized photographs of the ankle, using the position of anatomical landmarks. Running economy was measured as the rate of metabolic energy consumption during level treadmill running at a speed of 16 km h⁻¹. A strong correlation was found between the moment arm of the Achilles tendon and running economy. Smaller muscle moment arms correlated with lower rates of metabolic energy consumption (r² = 0.75, P < 0.001).
Biomechanics of Distance Running: A Longitudinal Study
Nelson, Richard C.; Gregor, Robert J.
1976-01-01
Training for distance running over a long period produces meaningful changes in the running mechanics of experienced runners, as revealed in this longitudinal study of the biomechanical components of stride length, stride rate, stride time, and support and nonsupport time. (MB)
2010-07-01
40 CFR Protection of Environment, ... Pollution from Locomotives and Locomotive Engines, Test Procedures, § 92.126 Test run (2010-07-01). (a) The following steps... water from the pretest value, the test is void. (7)(i) For bag samples, as soon as possible transfer...
Teaching Bank Runs with Classroom Experiments
Balkenborg, Dieter; Kaplan, Todd; Miller, Timothy
2011-01-01
Once relegated to cinema or history lectures, bank runs have become a modern phenomenon that captures the interest of students. In this article, the authors explain a simple classroom experiment based on the Diamond-Dybvig model (1983) to demonstrate how a bank run--a seemingly irrational event--can occur rationally. They then present possible…
2012-01-01
Two views of t-shirts with "Run for 32" written on them. The "Run for 32" race team, sponsored by TechSideline.com, participated in the SunTrust Rock 'n' Roll Half-Marathon, September 2, 2007. The shirt is inscribed with the names of the victims.
Concepts for fast large scale Monte Carlo production for the ATLAS experiment
Debenedetti, C; The ATLAS collaboration
2013-01-01
The huge success of Run 1 of the LHC would not have been possible without detailed detector simulation of the experiments. The outstanding performance of the accelerator, with a delivered integrated luminosity of 25 fb-1, has created an unprecedented demand for large simulated event samples. This has stretched the possibilities of the experiments due to the constraints of their computing infrastructure and available resources. Modern, concurrent computing techniques optimized for new processor hardware are being exploited to boost future computing resources, but even the most optimistic scenarios predict that additional action needs to be taken to guarantee sufficient Monte Carlo production statistics for high quality physics results during Run 2. In recent years, the ATLAS collaboration has put dedicated effort into the development of a new Integrated Simulation Framework (ISF) that allows running full and fast simulation approaches in parallel and even within one event. We present the main concepts of the ISF, which a...
Monte Carlo approaches to light nuclei
Energy Technology Data Exchange (ETDEWEB)
Carlson, J.
1990-01-01
Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of {sup 16}O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.
Monte carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method, termed Comb-like frame Monte Carlo, is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.
Lattice gauge theories and Monte Carlo simulations
Rebbi, Claudio
1983-01-01
This volume is the most up-to-date review on Lattice Gauge Theories and Monte Carlo Simulations. It consists of two parts. Part one is an introductory lecture on lattice gauge theories in general, Monte Carlo techniques, and the results to date. Part two consists of important original papers in this field. These selected reprints cover the following topics: Lattice Gauge Theories, General Formalism and Expansion Techniques, Monte Carlo Simulations, Phase Structures, Observables in Pure Gauge Theories, Systems with Bosonic Matter Fields, and Simulation of Systems with Fermions.
Quantum Monte Carlo for minimum energy structures
Wagner, Lucas K
2010-01-01
We present an efficient method to find minimum energy structures using energy estimates from accurate quantum Monte Carlo calculations. This method involves a stochastic process formed from the stochastic energy estimates from Monte Carlo that can be averaged to find precise structural minima while using inexpensive calculations with moderate statistical uncertainty. We demonstrate the applicability of the algorithm by minimizing the energy of the H2O-OH- complex and showing that the structural minima from quantum Monte Carlo calculations affect the qualitative behavior of the potential energy surface substantially.
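As a toy analogue of the idea above (locating a structural minimum precisely by averaging a stochastic process driven by cheap, noisy energy estimates), the sketch below averages the iterates of a stochastic descent on a one-dimensional surrogate energy surface. The surface, noise level, and step size are invented for illustration; this is not the paper's actual algorithm.

```python
import random

def noisy_energy_gradient(x, sigma=0.5):
    # Gradient of a surrogate energy surface E(x) = (x - 1.2)^2 plus Gaussian
    # noise, mimicking a Monte Carlo estimate with statistical uncertainty.
    return 2.0 * (x - 1.2) + random.gauss(0.0, sigma)

def averaged_descent(x0=0.0, step=0.05, iters=4000):
    x, trace = x0, []
    for _ in range(iters):
        x -= step * noisy_energy_gradient(x)
        trace.append(x)
    # Averaging the noisy iterates pins down the minimum far more precisely
    # than any single iterate, despite the moderate statistical uncertainty.
    tail = trace[iters // 2:]
    return sum(tail) / len(tail)

random.seed(0)
print(f"estimated minimum near x = {averaged_descent():.2f}")
```

The design point is the same one the abstract makes: individual evaluations stay cheap and noisy, and precision comes from averaging the stochastic process rather than from expensive low-noise evaluations.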
Orthopaedic Perspective on Barefoot and Minimalist Running.
Roth, Jonathan; Neumann, Julie; Tao, Matthew
2016-03-01
In recent years, there has been a movement toward barefoot and minimalist running. Advocates assert that a lack of cushion and support promotes a forefoot or midfoot strike rather than a rearfoot strike, decreasing the impact transient and stress on the hip and knee. Although the change in gait is theorized to decrease injury risk, this concept has not yet been fully elucidated. However, research has shown diminished symptoms of chronic exertional compartment syndrome and anterior knee pain after a transition to minimalist running. Skeptics are concerned that, because of the effects of the natural environment and the lack of a standardized transition program, barefoot running could lead to additional, unforeseen injuries. Studies have shown that, with the transition to minimalist running, there is increased stress on the foot and ankle and risk of repetitive stress injuries. Nonetheless, despite the large gap of evidence-based knowledge on minimalist running, the potential benefits warrant further research and consideration.
Middle cerebral artery blood velocity during running
DEFF Research Database (Denmark)
Lyngeraa, Tobias; Pedersen, Lars Møller; Mantoni, T
2013-01-01
Transcranial Doppler-determined middle cerebral artery (MCA) blood flow velocity, photoplethysmographic finger BP, and step frequency were measured continuously during three consecutive 5-min intervals of treadmill running at increasing running intensities. Data were analysed in the time and frequency domains. BP data for seven subjects and MCA velocity data for eight subjects, respectively, were excluded from analysis because of insufficient signal quality. Running increased mean arterial pressure and mean MCA velocity and induced rhythmic oscillations in BP and in MCA velocity corresponding to the difference between step rate and heart rate (HR) frequencies. During running, rhythmic oscillations in arterial BP induced by interference between HR and step frequency impact on cerebral blood velocity. For the exercise as a whole, average MCA velocity becomes elevated. These results suggest that running not only induces an increase in regional cerebral blood flow...
A Study on Selection Performance for High Energy Photons in CMS towards the 13 TeV LHC Run
Scerri, Dale
2014-01-01
The 13 TeV run of the LHC is expected to begin in early 2015, with a higher number of interactions per bunch crossing and a reduced bunch crossing time spacing. These harsher conditions are expected to affect photon selection performance, especially for high energy photons involved in BSM processes such as graviton decay. This work summarizes a preliminary study of this performance using 13 TeV Monte Carlo Randall-Sundrum graviton samples with different pileup scenarios corresponding to the coming 13 TeV run conditions. In particular, the selection efficiency and the energy resolution are investigated.
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
Rocker shoe, minimalist shoe, and standard running shoe : A comparison of running economy
Sobhani, Sobhan; Bredeweg, Steven; Dekker, Rienk; Kluitenberg, Bas; van den Heuvel, Edwin; Hijmans, Juha; Postema, Klaas
2014-01-01
Objectives: Running with rocker shoes is believed to prevent lower limb injuries. However, it is not clear how running in these shoes affects energy expenditure. The purpose of this study was, therefore, to assess the effects of rocker shoes on running economy in comparison with standard and minimalist shoes.
A Runs-Test Algorithm: Contingent Reinforcement and Response Run Structures
Hachiga, Yosuke; Sakagami, Takayuki
2010-01-01
Four rats' choices between two levers were differentially reinforced using a runs-test algorithm. On each trial, a runs-test score was calculated based on the last 20 choices. In Experiment 1, the onset of stimulus lights cued when the runs score was smaller than criterion. Following cuing, the correct choice was occasionally reinforced with food,…
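A Wald-Wolfowitz runs-test score over a 20-choice window, of the kind the algorithm described above could compute on each trial, might look like the sketch below. Only the scoring is shown; the criterion comparison, cue lights, and reinforcement contingencies of the experiment are not modeled.

```python
import math

def runs_z_score(choices):
    """Wald-Wolfowitz runs-test z-score for a binary choice sequence
    (e.g. the last 20 left/right lever presses, coded 0/1)."""
    n1 = sum(choices)
    n2 = len(choices) - n1
    if n1 == 0 or n2 == 0:
        return 0.0  # a single-lever window carries no runs information
    # A "run" is a maximal block of identical choices.
    runs = 1 + sum(a != b for a, b in zip(choices, choices[1:]))
    mean = 2.0 * n1 * n2 / len(choices) + 1.0
    var = (mean - 1.0) * (mean - 2.0) / (len(choices) - 1.0)
    return (runs - mean) / math.sqrt(var)

alternating = [0, 1] * 10        # 20 runs: far more than chance predicts
blocked = [0] * 10 + [1] * 10    # 2 runs: far fewer than chance predicts
print(round(runs_z_score(alternating), 2), round(runs_z_score(blocked), 2))
# → 4.14 -4.14
```

A score near zero indicates choice sequences consistent with randomness; large positive or negative scores flag systematic alternation or perseveration, which is what makes the statistic usable as a reinforcement criterion.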
Film of the year - the animated film "Mont Blanc" / Verni Leivak
Leivak, Verni, 1966-
2002-01-01
The Estonian Film Journalists' Union awarded the title of best film of 2001 to Priit Tender's animated film "Mont Blanc" (Eesti Joonisfilm, 2001). Also covered are the film critics' preferences among the films shown in cinemas and on television in 2001.
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
To Monte Carlo despite the accidents / Aare Arula
Arula, Aare
2007-01-01
See also Tehnika dlja Vsehh no. 3, pp. 26-27. Karl Siitan and his crew, who set off from Tallinn for the Monte Carlo Rally on 26 January 1937, were met by adventures that nearly cost them their lives.
Monte Carlo simulations for plasma physics
Energy Technology Data Exchange (ETDEWEB)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X. [National Inst. for Fusion Science, Toki, Gifu (Japan)
2000-07-01
Plasma behaviours are very complicated and their analysis is generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast particle transport associated with heating and the generation of the radial electric field. It is further applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
Predator trapping on Monte Vista NWR
US Fish and Wildlife Service, Department of the Interior — This letter is summarizing the status of predator trapping on Monte Vista National Wildlife refuge in light of the referendum passes in the State of Colorado banning...
Quantum Monte Carlo Calculations of Light Nuclei
Pieper, Steven C
2007-01-01
During the last 15 years, there has been much progress in defining the nuclear Hamiltonian and applying quantum Monte Carlo methods to the calculation of light nuclei. I describe both aspects of this work and some recent results.
Improved Monte Carlo Renormalization Group Method
Gupta, R.; Wilson, K. G.; Umrigar, C.
1985-01-01
An extensive program to analyze critical systems using an Improved Monte Carlo Renormalization Group Method (IMCRG) being undertaken at LANL and Cornell is described. Here we first briefly review the method and then list some of the topics being investigated.
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
Monte Vista NWR Water Use Report- 1964
US Fish and Wildlife Service, Department of the Interior — This report summarizes water use at Monte Vista NWR for 1964. The document includes summaries of 1964 water use, 1965 water program recommendations, and proposed...
Smart detectors for Monte Carlo radiative transfer
Baes, Maarten
2008-01-01
Many optimization techniques have been invented to reduce the noise that is inherent in Monte Carlo radiative transfer simulations. As the typical detectors used in Monte Carlo simulations do not take into account all the information contained in the impacting photon packages, there is still room to optimize this detection process and the corresponding estimate of the surface brightness distributions. We want to investigate how all the information contained in the distribution of impacting photon packages can be optimally used to decrease the noise in the surface brightness distributions and hence to increase the efficiency of Monte Carlo radiative transfer simulations. We demonstrate that the estimate of the surface brightness distribution in a Monte Carlo radiative transfer simulation is similar to the estimate of the density distribution in an SPH simulation. Based on this similarity, a recipe is constructed for smart detectors that take full advantage of the exact location of the impact of the photon pack...
Quantum Monte Carlo approaches for correlated systems
Becca, Federico
2017-01-01
Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and a detailed presentation of stochastic samplings, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...
Pheasant hunting on the Monte Vista NWR
US Fish and Wildlife Service, Department of the Interior — This letter to the Alamosa/Monte Vista NWR Refuge Manager discusses the need to alter management of pheasants in the area to halt the continued decline in population...
Aasta film - joonisfilm "Mont Blanc" / Verni Leivak
Leivak, Verni, 1966-
2002-01-01
Eesti Filmiajakirjanike Ühing andis aasta 2001 parima filmi tiitli Priit Tenderi joonisfilmile "Mont Blanc" : Eesti Joonisfilm 2001.Ka filmikriitikute eelistused kinodes ja televisioonis 2001. aastal näidatud filmide osas
Bartalini, P.; Kryukov, A.; Selyuzhenkov, Ilya V.; Sherstnev, A.; Vologdin, A.
2004-01-01
We present the Monte-Carlo events Data Base (MCDB) project and its development plans. MCDB facilitates communication between authors of Monte-Carlo generators and experimental users. It also provides convenient book-keeping and easy access to generator-level samples. The first release of MCDB is now operational for the CMS collaboration. In this paper we review the main ideas behind MCDB and discuss future plans to develop this Data Base further within the CERN LCG framework.
Monte Carlo Algorithms for Linear Problems
DIMOV, Ivan
2000-01-01
MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for the functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the followin...
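The basic idea quoted above (a statistical estimate of a functional as the mathematical expectation of a chance variable) can be shown with a minimal sketch that estimates the integral of exp(x) over [0, 1], together with its standard error:

```python
import math
import random

def mc_integral(f, n):
    """Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1): the desired
    functional expressed as the expectation of a chance variable."""
    samples = [f(random.random()) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, math.sqrt(var / n)  # estimate and its standard error

random.seed(42)
est, err = mc_integral(math.exp, 100_000)
print(f"{est:.3f} +/- {err:.3f}")  # true value: e - 1 = 1.71828...
```

The standard error shrinks as n^(-1/2) regardless of dimension, which is exactly the property that makes such estimators attractive for the linear problems the abstract discusses.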
The Feynman Path Goes Monte Carlo
Sauer, Tilman
2001-01-01
Path integral Monte Carlo (PIMC) simulations have become an important tool for the investigation of the statistical mechanics of quantum systems. I discuss some of the history of applying the Monte Carlo method to non-relativistic quantum systems in path-integral representation. The feasibility of the method was established in principle by the early eighties, and a number of algorithmic improvements have been introduced in the last two decades.
Monte Carlo Hamiltonian:Inverse Potential
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; CHENG Xiao-Ni; Helmut KRÖGER
2004-01-01
The Monte Carlo Hamiltonian method developed recently makes it possible to investigate the ground state and low-lying excited states of a quantum system, using a Monte Carlo (MC) algorithm with importance sampling. However, the conventional MC algorithm has some difficulties when applied to inverse potentials. We propose to use an effective potential and an extrapolation method to solve the problem. We present examples from the hydrogen system.
Self-consistent kinetic lattice Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Horsfield, A.; Dunham, S.; Fujitani, Hideaki
1999-07-01
The authors present a brief description of a formalism for modeling point defect diffusion in crystalline systems using a Monte Carlo technique. The main approximations required to construct a practical scheme are briefly discussed, with special emphasis on the proper treatment of charged dopants and defects. This is followed by tight binding calculations of the diffusion barrier heights for charged vacancies. Finally, an application of the kinetic lattice Monte Carlo method to vacancy diffusion is presented.
Energetics of running: a new perspective.
Kram, R; Taylor, C R
1990-07-19
The amount of energy used to run a mile is nearly the same whether it is run at top speed or at a leisurely pace (although it is used more rapidly at the higher speed). This puzzling independence of energy cost and speed is found generally among running animals, although, on a per gram basis, cost is much higher for smaller animals. Running involves little work against the environment; work is done by muscles and tendons to lift and accelerate the body and limbs. Some of the work is recovered from muscle-tendon springs without metabolic cost and work rate does not parallel metabolic rate with either speed or size. Regardless of the amount of work muscles do, they must be activated and develop force to support the weight of the body. Load-carrying experiments have shown that the cost of supporting an extra newton of load is the same as the weight-specific cost of running. Size differences in cost are proportional to stride frequency at equivalent speeds, suggesting that the time available for developing force is important in determining cost. We report a simple inverse relationship between the rate of energy used for running and the time the foot applies force to the ground during each stride. These results support the hypothesis that it is primarily the cost of supporting the animal's weight and the time course of generating this force that determines the cost of running.
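The inverse relationship reported above can be summarized as metabolic rate ≈ c × (body weight) / (foot-ground contact time). The sketch below only illustrates that scaling; the cost coefficient and contact times are round numbers chosen for the example, not values from the study.

```python
# Hypothetical illustration of the "cost of generating force" relation:
# metabolic rate ~ cost_coeff * body weight / foot-ground contact time.
# The coefficient and the contact times are invented round numbers.
def metabolic_rate_w(mass_kg, contact_time_s, cost_coeff=0.18):
    weight_n = mass_kg * 9.81  # body weight in newtons
    return cost_coeff * weight_n / contact_time_s  # watts

slow = metabolic_rate_w(70.0, contact_time_s=0.30)  # long ground contact
fast = metabolic_rate_w(70.0, contact_time_s=0.15)  # contact time halved...
print(round(fast / slow, 1))  # ...so the predicted rate doubles: 2.0
```

This captures the paper's qualitative claim: smaller animals and faster speeds shorten the time available to generate supportive force, raising the weight-specific cost.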
Running Economy from a Muscle Energetics Perspective
Directory of Open Access Journals (Sweden)
Jared R. Fletcher
2017-06-01
The economy of running has traditionally been quantified from the mass-specific oxygen uptake; however, because fuel substrate usage varies with exercise intensity, it is more accurate to express running economy in units of metabolic energy. Fundamentally, an understanding of the major factors that influence the energy cost of running (Erun) can be obtained with this approach. Erun is determined by the energy needed for skeletal muscle contraction. Here, we approach the study of Erun from that perspective. The amount of energy needed for skeletal muscle contraction is dependent on the force, duration, shortening, shortening velocity, and length of the muscle. These factors therefore dictate the energy cost of running. It is understood that some determinants of the energy cost of running are not trainable: environmental factors, surface characteristics, and certain anthropometric features. Other factors affecting Erun are altered by training: other anthropometric features, muscle and tendon properties, and running mechanics. Here, the key features that dictate the energy cost during distance running are reviewed in the context of skeletal muscle energetics.
Middle cerebral artery blood velocity during running.
Lyngeraa, T S; Pedersen, L M; Mantoni, T; Belhage, B; Rasmussen, L S; van Lieshout, J J; Pott, F C
2013-02-01
Running induces characteristic fluctuations in blood pressure (BP) of unknown consequence for organ blood flow. We hypothesized that running-induced BP oscillations are transferred to the cerebral vasculature. In 15 healthy volunteers, transcranial Doppler-determined middle cerebral artery (MCA) blood flow velocity, photoplethysmographic finger BP, and step frequency were measured continuously during three consecutive 5-min intervals of treadmill running at increasing running intensities. Data were analysed in the time and frequency domains. BP data for seven subjects and MCA velocity data for eight subjects, respectively, were excluded from analysis because of insufficient signal quality. Running increased mean arterial pressure and mean MCA velocity and induced rhythmic oscillations in BP and in MCA velocity corresponding to the difference between step rate and heart rate (HR) frequencies. During running, rhythmic oscillations in arterial BP induced by interference between HR and step frequency impact on cerebral blood velocity. For the exercise as a whole, average MCA velocity becomes elevated. These results suggest that running not only induces an increase in regional cerebral blood flow but also challenges cerebral autoregulation. © 2012 John Wiley & Sons A/S.
The First 24 Years of Reverse Monte Carlo Modelling, Budapest, Hungary, 20-22 September 2012
Keen, David A.; Pusztai, László
2013-11-01
Contents include: ...-ray scattering and modeling studies (L Hawelek, A Brodka, J C Dore, V Honkimaki and A Burian); Local structure correlations in plastic cyclohexane - a reverse Monte Carlo study (Nicholas P Funnell, Martin T Dove, Andrew L Goodwin, Simon Parsons and Matthew G Tucker); Neutron powder diffraction and molecular dynamics study of superionic SrBr2 (S Hull, S T Norberg, S G Eriksson and C E Mohn); Atomic order and cluster energetics of a 17 wt% Si-based glass versus the liquid phase (G S E Antipas, L Temleitner, K Karalis, L Pusztai and A Xenidis); Total scattering analysis of cation coordination and vacancy pair distribution in Yb substituted δ-Bi2O3 (G S E Antipas, L Temleitner, K Karalis, L Pusztai and A Xenidis); Modification of the sampling algorithm for reverse Monte Carlo modeling with an insufficient data set (Satoshi Sato and Kenji Maruyama); The origin of diffuse scattering in crystalline carbon tetraiodide (Temleitner and L Pusztai); Silver environment and covalent network rearrangement in GeS3-Ag glasses (L Rátkai, I Kaban, T Wágner, J Kolár, S Valková, Iva Voleská, B Beuneu and P Jóvári); Reverse Monte Carlo study of spherical sample under non-periodic boundary conditions: the structure of Ru nanoparticles based on x-ray diffraction data (Orsolya Gereben and Valeri Petkov); Total neutron scattering investigation of the structure of a cobalt gallium oxide spinel prepared by solvothermal oxidation of gallium metal (Helen Y Playford, Alex C Hannon, Matthew G Tucker, Martin R Lees and Richard I Walton); The structure of water in solutions containing di- and trivalent cations by empirical potential structure refinement (Daniel T Bowron and Sofia Díaz Moreno); The proton conducting electrolyte BaTi0.5In0.5O2.75: determination of the deuteron site and its local environment (Stefan T Norberg, Seikh M H Rahman, Stephen Hull, Christopher S Knee and Sten G Eriksson); Acidic properties of aqueous phosphoric acid solutions: a microscopic view (I Harsányi, L Pusztai, P Jóvári and B Beuneu); Comparison of the atomic level...
Error in Monte Carlo, quasi-error in Quasi-Monte Carlo
Kleiss, R H
2006-01-01
While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction of an estimator of stochastic nature, based on the ensemble of pointsets with a particular discrepancy value. We investigate the consequences of this choice and give some first empirical results on the suggested estimators.
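The error behavior at issue above can be illustrated by comparing plain Monte Carlo points with a low-discrepancy (quasi-random) point set on a simple one-dimensional integral. The van der Corput construction below is a standard textbook sequence, not the stochastic estimator proposed in the paper.

```python
import random

def van_der_corput(n, base=2):
    """First n points of the base-2 van der Corput low-discrepancy sequence
    (the radical inverse of the integers 1..n)."""
    points = []
    for i in range(1, n + 1):
        x, denom = 0.0, 1.0
        while i > 0:
            denom *= base
            i, digit = divmod(i, base)
            x += digit / denom
        points.append(x)
    return points

def estimate(points, f=lambda x: x * x):
    # Sample-mean estimator of the integral of f over [0, 1]; true value 1/3.
    return sum(map(f, points)) / len(points)

random.seed(0)
n, true_value = 4096, 1.0 / 3.0
mc_err = abs(estimate([random.random() for _ in range(n)]) - true_value)
qmc_err = abs(estimate(van_der_corput(n)) - true_value)
print(f"MC error {mc_err:.2e}, QMC error {qmc_err:.2e}")
```

The quasi-random error here is typically far below the pseudorandom one, yet the naive n^(-1/2) error formula, which assumes independent points, says nothing about it; this is precisely the gap the paper's discrepancy-based estimator aims to fill.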
The physiology of deep-water running.
Reilly, Thomas; Dowzer, Clare N; Cable, N T
2003-12-01
Deep-water running is performed in the deep end of a swimming pool, normally with the aid of a flotation vest. The method is used for purposes of preventing injury and promoting recovery from strenuous exercise and as a form of supplementary training for cardiovascular fitness. Both stroke volume and cardiac output increase during water immersion: an increase in blood volume largely offsets the cardiac decelerating reflex at rest. At submaximal exercise intensities, blood lactate responses to exercise during deep-water running are elevated in comparison to treadmill running at a given oxygen uptake (VO2). While VO2, minute ventilation and heart rate are decreased under maximal exercise conditions in the water, deep-water running nevertheless can be justified as providing an adequate stimulus for cardiovascular training. Responses to training programmes have confirmed the efficacy of deep-water running, although positive responses are most evident when measured in a water-based test. Aerobic performance is maintained with deep-water running for up to 6 weeks in trained endurance athletes; sedentary individuals benefit more than athletes in improving maximal oxygen uptake. There is some limited evidence of improvement in anaerobic measures and in upper body strength in individuals engaging in deep-water running. A reduction in spinal loading constitutes a role for deep-water running in the prevention of injury, while an alleviation of muscle soreness confirms its value in recovery training. Further research into the applications of deep-water running to exercise therapy and athletes' training is recommended.
Implications of a Running Dark Photon Coupling
Davoudiasl, Hooman
2015-01-01
For an "invisible" dark photon $Z_d$ that dominantly decays into dark states, the running of its fine structure constant $\alpha_d$ with momentum transfer $q > m_{Z_d}$ could be significant. A similar running in the kinetic mixing parameter $\varepsilon^2$ can be induced through its dependence on $\alpha_d(q)$. The running of couplings could potentially be detected in "dark matter beam" experiments, for which theoretical considerations imply $\alpha_d(m_{Z_d}) \lesssim 0.5$.
Gravitational Baryogenesis in Running Vacuum models
Oikonomou, V K; Nunes, Rafael C
2016-01-01
We study the gravitational baryogenesis mechanism for generating baryon asymmetry in the context of running vacuum models. Regardless of whether these models can produce a viable cosmological evolution, we demonstrate that they produce a non-zero baryon-to-entropy ratio even if the Universe is filled with conformal matter. This is a notable difference between running vacuum gravitational baryogenesis and the Einstein-Hilbert case, since in the latter the predicted baryon-to-entropy ratio is zero. We consider two running vacuum models and show that the resulting baryon-to-entropy ratio is compatible with the observational data.
Reconstruction and Calibration of Small Radius Jets in the ATLAS Experiment for LHC Run 2
Loch, Peter; The ATLAS collaboration
2017-01-01
Small radius jets with R = 0.4 are standard tools in ATLAS for physics analysis. They are calibrated using a sequence of Monte Carlo simulation-derived calibrations and corrections, followed by in-situ calibrations based on the transverse momentum balance between the probed jets and well-measured reference signals. In this talk the inputs to jet reconstruction in LHC Run 2, comprising calorimeter cell clusters, reconstructed charged-particle tracks, and particle flow objects, are discussed together with the jet energy calibration scheme. Selected results from the performance of the procedure and the associated systematic uncertainties are presented.
Measurement and Monte Carlo Calculation of Waste Drum Filled With Radioactive Aqueous Solution
Institute of Scientific and Technical Information of China (English)
XU; Li-jun; ZHANG; Wei-dong; YE; Hong-sheng; LIN; Min; CHEN; Xi-lin; GUO; Xiao-qing
2012-01-01
Theoretically, the best calibration source for a gamma scan system (SGS) is a waste drum filled with a uniform distribution of medium and radioactive nuclides. In reality, however, waste drums are usually full of solid substances, which are difficult to prepare in a completely uniformly distributed state. To reduce the measurement uncertainty of the radioactivity of waste drums prepared using the shell-source method, a waste drum filled with radioactive aqueous solution was prepared. Its radioactivity was then measured by an SGS device and calculated using the Monte Carlo method to verify the exact geometric model, which
Acceleration of the Monte Carlo EM Algorithm
Institute of Scientific and Technical Information of China (English)
罗季
2008-01-01
The EM algorithm is a widely used data augmentation algorithm for estimating posterior modes, but deriving a closed-form expression for the integral in its E-step is sometimes difficult or even impossible, which limits its applicability. The Monte Carlo EM algorithm solves this problem well by evaluating the E-step integral with Monte Carlo simulation, greatly broadening its applicability. However, both the EM algorithm and the Monte Carlo EM algorithm converge only linearly, at a rate governed by the fraction of missing information; when the proportion of missing data is high, convergence is very slow. The Newton-Raphson algorithm, by contrast, converges quadratically in a neighborhood of the posterior mode. This paper proposes an accelerated Monte Carlo EM algorithm that combines the Monte Carlo EM algorithm with the Newton-Raphson algorithm: the E-step is still carried out by Monte Carlo simulation, and the algorithm is proved to converge quadratically near the posterior mode. The method thus retains the advantages of the Monte Carlo EM algorithm while improving its convergence rate. Numerical examples comparing the accelerated algorithm with the EM and Monte Carlo EM algorithms further demonstrate its good properties.
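To make the Monte Carlo E-step concrete, here is a toy MCEM run on a model simple enough to check by hand: an exponential rate with right-censored observations. The model, names, and parameter values are illustrative assumptions, not taken from the paper (which additionally combines MCEM with a Newton-Raphson step for acceleration):

```python
import random

def mcem_exponential(obs, cens, c, iters=50, m=2000, seed=1):
    """Toy Monte Carlo EM for an exponential rate with right-censoring at c.

    E-step: approximate E[x | x > c] by averaging m draws from the
    conditional distribution (c + Exp(rate), by memorylessness).
    M-step: closed-form rate update from the completed data.
    """
    rng = random.Random(seed)
    rate = 1.0
    n = len(obs) + cens
    for _ in range(iters):
        # Monte Carlo E-step: expected value of a censored observation
        imputed = sum(c + rng.expovariate(rate) for _ in range(m)) / m
        rate = n / (sum(obs) + cens * imputed)  # M-step
    return rate

# Simulated data with true rate 2.0, censored at c = 1.0.
data_rng = random.Random(2)
xs = [data_rng.expovariate(2.0) for _ in range(500)]
obs = [x for x in xs if x < 1.0]
cens = len(xs) - len(obs)
rate_hat = mcem_exponential(obs, cens, 1.0)
```

Each iteration here contracts linearly, which is the slowness the paper addresses by substituting a Newton-Raphson update near the mode.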
ALFA detector upgrade before LHC Run 2
Vorobel, Vit; The ATLAS collaboration
2016-01-01
The operation experience with the ATLAS ALFA detectors in the LHC environment during the Run1 period has shown significant beam-induced heating. Subsequent comprehensive studies revealed that heating effects could be disastrous in the case of the larger beam intensities foreseen for higher luminosities in the LHC Run2. During the first LHC long shutdown (LS1) all ALFA detectors were removed from the LHC tunnel and their covers, the Roman Pots, underwent a geometry upgrade to minimize the impedance losses. It will be shown that this modification, together with a system improving the internal heat transfer and an air cooling system, significantly shifted the temperatures of the ALFA detectors away from the critical limits throughout the LHC Run2. The ALFA trigger system was also considerably upgraded to keep the measured data safely inside the Run2 ATLAS latency budget and to minimize dead time. The needed hardware changes of the trigger system will be presented in the second part of the talk.
Common running musculoskeletal injuries among recreational half ...
African Journals Online (AJOL)
Data were collected from runners (N=200) who officially ran half-marathon road ... Department of Sport Science, School of Physiotherapy, Sport Science and ... Van Mechelen W. Running injuries: A review of the epidemiological literature.
Energy Technology Data Exchange (ETDEWEB)
Connolly, R. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dawson, C. [Brookhaven National Lab. (BNL), Upton, NY (United States); Jao, S. [Brookhaven National Lab. (BNL), Upton, NY (United States); Schoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tepikian, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)
2016-08-05
Three problems with the eIPMs were corrected during the 2015 summer shutdown. These involved ac coupling and 'negative profiles', a detector 'dead zone' created by biasing, and gain control on the ramp. With respect to Run 16, problems dealt with included gain depletion on the horizontal MCP and rf pickup on the profile signals; it was found that the MCP was severely damaged over part of the aperture. Various corrective measures were applied, and some results of these measures obtained during Run 16 are shown. At the end of Run 16 there was a three-day beam run to study polarized proton beams in the AGS. Attempts to minimize beam injection errors, which increase emittance, by using the eIPMs to measure the contribution of injection mismatch to the AGS output beam emittance are recounted.
ALFA detector before LHC Run 2
Vorobel, Vit; The ATLAS collaboration
2016-01-01
The operation experience with the ATLAS ALFA detectors in the LHC environment during the Run1 period has shown significant beam-induced heating. Subsequent comprehensive studies revealed that heating effects could be disastrous in the case of the larger beam intensities foreseen for higher luminosities in the LHC Run2. During the first LHC long shutdown (LS1) all ALFA detectors were removed from the LHC tunnel and their covers, the Roman Pots, underwent a geometry upgrade to minimize the impedance losses. It will be shown that this modification, together with a system improving the internal heat transfer and an air cooling system, significantly shifted the temperatures of the ALFA detectors away from the critical limits throughout the LHC Run2. The ALFA trigger system was also considerably upgraded to keep the measured data safely inside the Run2 ATLAS latency budget and to minimize dead time. The needed hardware changes of the trigger system are also described.
U.S. Geological Survey, Department of the Interior — The data are input data files to run the forest simulation model Landis-II for Isle Royale National Park. Files include: a) Initial_Comm, which includes the location...
On the Assessment of Monte Carlo Error in Simulation-Based Statistical Analyses.
Koehler, Elizabeth; Brown, Elizabeth; Haneuse, Sebastien J-P A
2009-05-01
Statistical experiments, more commonly referred to as Monte Carlo or simulation studies, are used to study the behavior of statistical methods and measures under controlled situations. Whereas recent computing and methodological advances have permitted increased efficiency in the simulation process, known as variance reduction, such experiments remain limited by their finite nature and hence are subject to uncertainty; when a simulation is run more than once, different results are obtained. However, virtually no emphasis has been placed on reporting the uncertainty, referred to here as Monte Carlo error, associated with simulation results in the published literature, or on justifying the number of replications used. These deserve broader consideration. Here we present a series of simple and practical methods for estimating Monte Carlo error as well as determining the number of replications required to achieve a desired level of accuracy. The issues and methods are demonstrated with two simple examples, one evaluating operating characteristics of the maximum likelihood estimator for the parameters in logistic regression and the other in the context of using the bootstrap to obtain 95% confidence intervals. The results suggest that in many settings, Monte Carlo error may be more substantial than traditionally thought.
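The two basic calculations the authors advocate, reporting the Monte Carlo standard error of a simulation study and choosing the number of replications for a target accuracy, fit in a few lines. A sketch with our own function names, not the paper's:

```python
import math

def mc_error(estimates):
    """Mean and Monte Carlo standard error of per-replication estimates."""
    r = len(estimates)
    mean = sum(estimates) / r
    var = sum((e - mean) ** 2 for e in estimates) / (r - 1)
    return mean, math.sqrt(var / r)

def replications_needed(sd, target_se):
    """Smallest R with sd / sqrt(R) <= target_se (sd from a pilot run)."""
    return math.ceil((sd / target_se) ** 2)
```

For example, a pilot standard deviation of 1.0 and a target Monte Carlo standard error of 0.01 call for 10,000 replications, often far more than the number commonly reported.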
Monte Carlo simulation for kinetic chemotaxis model: An application to the traveling population wave
Yasuda, Shugo
2017-02-01
A Monte Carlo simulation of chemotactic bacteria is developed on the basis of the kinetic model and is applied to a one-dimensional traveling population wave in a microchannel. In this simulation, the Monte Carlo method, which calculates the run-and-tumble motions of bacteria, is coupled with a finite volume method to calculate the macroscopic transport of the chemical cues in the environment. The simulation method can successfully reproduce the traveling population wave of bacteria that was observed experimentally and reveal the microscopic dynamics of bacteria coupled with the macroscopic transport of the chemical cues and bacteria population density. The results obtained by the Monte Carlo method are also compared with the asymptotic solution derived from the kinetic chemotaxis equation in the continuum limit, where the Knudsen number, defined as the ratio of the mean free path of a bacterium to the characteristic length of the system, vanishes. The validity of the Monte Carlo method in the asymptotic behavior for small Knudsen numbers is numerically verified.
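The run-and-tumble kernel of such a simulation is simple. Below is a minimal 1D walker with a constant tumble rate; in the paper the tumble rate is modulated by the chemoattractant field, which is itself solved with a finite volume method. All parameter values here are illustrative:

```python
import random

def run_and_tumble_1d(steps, dt=0.01, speed=1.0, tumble_rate=1.0, seed=0):
    """1D run-and-tumble walker: ballistic runs at a fixed speed,
    interrupted by Poisson tumble events that re-draw the direction
    (a coin flip in 1D). Constant tumble rate; a chemotaxis model
    would make tumble_rate depend on the local chemical gradient.
    """
    rng = random.Random(seed)
    x, direction = 0.0, 1
    for _ in range(steps):
        if rng.random() < tumble_rate * dt:  # Poisson tumble event
            direction = rng.choice((-1, 1))
        x += direction * speed * dt  # run phase
    return x
```

On time scales long compared with 1/tumble_rate, this process diffuses, which is how the continuum (small-Knudsen-number) limit mentioned above arises.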
Run 1 Legacy Performance : electrons/photons
Damazio, D O; The ATLAS collaboration
2014-01-01
In this talk, the Run 1 legacy performance of the electron and photon reconstruction and identification in the ATLAS and CMS experiments will be described, as well as the associated systematic uncertainties. The two speakers should try to highlight the differences in performance between the two experiments, and explain what worked better or worse than planned, as well as the lessons for Run 2.
Performance of boosted object and jet substructure techniques in Run 1 and 2 ATLAS data
Schramm, Steven; The ATLAS collaboration
2016-01-01
Hadronic decays of heavy particles with momenta much larger than their mass result in their decay products being reconstructable as a single large-radius jet. The study of the substructure of these jets allows the separation of these boosted decays from more common jets from light quarks and gluons. Several techniques have been developed by the phenomenology and experimental communities to identify jets coming from hadronic decays of boosted top quarks, W, Z and Higgs bosons. The performance of several such techniques has been studied in ATLAS using fully simulated Monte Carlo events, and validated on data using pure samples of top quarks, W bosons from top decays, and dijet events. Results of these studies will be presented for Run 1 as well as Run 2 of the LHC.
Evolution of ATLAS conditions data and its management for LHC run-2
Boehler, Michael; The ATLAS collaboration; Gallas, Elizabeth; Formica, Andrea; Borodin, Mikhail
2015-01-01
The ATLAS detector consists of several sub-detector systems. Both data taking and Monte Carlo (MC) simulation rely on an accurate description of the detector conditions from every sub system, such as calibration constants, different scenarios of pile-up and noise conditions, size and position of the beam spot, etc. In order to guarantee database availability for critical online applications during data-taking, two database systems, one for online access and another one for all other database access have been implemented. The long shutdown period has provided the opportunity to review and improve the run-1 system: revise workflows, include new and innovative monitoring and maintenance tools and implement a new database instance for run-2 conditions data. The detector conditions are organized by tag identification strings and managed independently from the different sub-detector experts. The individual tags are then collected and associated into a global conditions tag, assuring synchronization of various sub-d...
Evolution of ATLAS conditions data and its management for LHC Run-2
Boehler, Michael; Formica, Andrea; Gallas, Elizabeth; Radescu, Voica
2015-01-01
The ATLAS detector at the LHC consists of several sub-detector systems. Both data taking and Monte Carlo (MC) simulation rely on an accurate description of the detector conditions from every subsystem, such as calibration constants, different scenarios of pile-up and noise conditions, size and position of the beam spot, etc. In order to guarantee database availability for critical online applications during data-taking, two database systems, one for online access and another one for all other database access have been implemented. The long shutdown period has provided the opportunity to review and improve the Run-1 system: revise workflows, include new and innovative monitoring and maintenance tools and implement a new database instance for Run-2 conditions data. The detector conditions are organized by tag identification strings and managed independently from the different sub-detector experts. The individual tags are then collected and associated into a global conditions tag, assuring synchronization of var...
On the distribution and swim pressure of run-and-tumble particles in confinement
Ezhilan, Barath; Saintillan, David
2015-01-01
The spatial and orientational distribution in a dilute active suspension of non-Brownian run-and-tumble spherical swimmers confined between two planar hard walls is calculated theoretically. Using a kinetic model based on coupled bulk/surface probability density functions, we demonstrate the existence of a concentration wall boundary layer with thickness scaling with the run length, the absence of polarization throughout the channel, and the presence of sharp discontinuities in the bulk orientation distribution in the neighborhood of orientations parallel to the wall in the near-wall region. Our model is also applied to calculate the swim pressure in the system, which approaches the previously proposed ideal-gas behavior in wide channels but is found to decrease in narrow channels as a result of confinement. Monte-Carlo simulations are also performed for validation and show excellent quantitative agreement with our theoretical predictions.
Metadata aided run selection at ATLAS
Buckingham, R. M.; Gallas, E. J.; C-L Tseng, J.; Viegas, F.; Vinek, E.; ATLAS Collaboration
2011-12-01
Management of the large volume of data collected by any large scale scientific experiment requires the collection of coherent metadata quantities, which can be used by reconstruction or analysis programs and/or user interfaces, to pinpoint collections of data needed for specific purposes. In the ATLAS experiment at the LHC, we have collected metadata from systems storing non-event-wise data (Conditions) into a relational database. The Conditions metadata (COMA) database tables not only contain conditions known at the time of event recording, but also allow for the addition of conditions data collected as a result of later analysis of the data (such as improved measurements of beam conditions or assessments of data quality). A new web based interface called "runBrowser" makes these Conditions Metadata available as a Run based selection service. runBrowser, based on PHP and JavaScript, uses jQuery to present selection criteria and report results. It not only facilitates data selection by conditions attributes, but also gives the user information at each stage about the relationship between the conditions chosen and the remaining conditions criteria available. When a set of COMA selections are complete, runBrowser produces a human readable report as well as an XML file in a standardized ATLAS format. This XML can be saved for later use or refinement in a future runBrowser session, shared with physics/detector groups, or used as input to ELSSI (event level Metadata browser) or other ATLAS run or event processing services.
Calcaneal loading during walking and running
Giddings, V. L.; Beaupre, G. S.; Whalen, R. T.; Carter, D. R.
2000-01-01
PURPOSE: This study of the foot uses experimentally measured kinematic and kinetic data with a numerical model to evaluate in vivo calcaneal stresses during walking and running. METHODS: External ground reaction forces (GRF) and kinematic data were measured during walking and running using cineradiography and force plate measurements. A contact-coupled finite element model of the foot was developed to assess the forces acting on the calcaneus during gait. RESULTS: We found that the calculated force-time profiles of the joint contact, ligament, and Achilles tendon forces varied with the time-history curve of the moment about the ankle joint. The model predicted peak talocalcaneal and calcaneocuboid joint loads of 5.4 and 4.2 body weights (BW) during walking and 11.1 and 7.9 BW during running. The maximum predicted Achilles tendon forces were 3.9 and 7.7 BW for walking and running. CONCLUSIONS: Large magnitude forces and calcaneal stresses are generated late in the stance phase, with maximum loads occurring at approximately 70% of the stance phase during walking and at approximately 60% of the stance phase during running, for the gait velocities analyzed. The trajectories of the principal stresses, during both walking and running, corresponded to each other and qualitatively to the calcaneal trabecular architecture.
Running With an Elastic Lower Limb Exoskeleton.
Cherry, Michael S; Kota, Sridhar; Young, Aaron; Ferris, Daniel P
2016-06-01
Although there have been many lower limb robotic exoskeletons that have been tested for human walking, few devices have been tested for assisting running. It is possible that a pseudo-passive elastic exoskeleton could benefit human running without the addition of electrical motors due to the spring-like behavior of the human leg. We developed an elastic lower limb exoskeleton that added stiffness in parallel with the entire lower limb. Six healthy, young subjects ran on a treadmill at 2.3 m/s with and without the exoskeleton. Although the exoskeleton was designed to provide ~50% of normal leg stiffness during running, it only provided 24% of leg stiffness during testing. The difference in added leg stiffness was primarily due to soft tissue compression and harness compliance decreasing exoskeleton displacement during stance. As a result, the exoskeleton only supported about 7% of the peak vertical ground reaction force. There was a significant increase in metabolic cost when running with the exoskeleton compared with running without the exoskeleton (ANOVA, P < 0.05). The primary limitations of elastic exoskeletons for human running are human-machine interface compliance and the extra lower limb inertia from the exoskeleton.
Institute of Scientific and Technical Information of China (English)
雷咏梅; 蒋英; 冯捷
2002-01-01
This paper presents a new approach to parallelizing 3D lattice Monte Carlo algorithms used in the numerical simulation of polymers on ZiQiang 2000, a cluster of symmetric multiprocessors (SMPs). The combined load for cell and energy calculations over the time step is balanced together to form a single spatial decomposition. Basic aspects and strategies of running Monte Carlo calculations on parallel computers are studied. The different steps involved in porting the software to a parallel architecture based on ZiQiang 2000 running under Linux and MPI are described briefly. It is found that parallelization becomes more advantageous when either the lattice is very large or the model contains many cells and chains.
Mechanical spring technology improves running economy in endurance runners
Riess, Kenneth James
2014-01-01
In recent years there has been an increase in participation in timed running events. With this increase, the desire of individuals to run their best has motivated the running-shoe industry to make design changes to traditional running footwear in an effort to improve running economy (RE) and decrease running times. One such design change has been to incorporate mechanical springs (MS) into the midsole of the running shoe. Evaluation of this technology has yet to be performed. This study...
Running kinematics and shock absorption do not change after brief exhaustive running.
Abt, John P; Sell, Timothy C; Chu, Yungchien; Lovalekar, Mita; Burdett, Ray G; Lephart, Scott M
2011-06-01
Because of the nature of running, the forces encountered require a proper coordination of joint action of the lower extremity to dissipate the ground reaction forces and accelerations through the kinetic chain. Running-related muscle fatigue may reduce the shock absorbing capacity of the lower extremity and alter running kinematics. The purpose of this study was to determine if a bout of exhaustive running at a physiologically determined high intensity, changes running kinematics, impact accelerations, and alters shock attenuating capabilities. It was hypothesized that as a result of fatigue induced by an exhaustive run, running kinematics, impact accelerations at the head and shank, acceleration reduction, and shock attenuation would change. A within-subject, repeated-measures design was used for this study. Twelve healthy, competitive male and female distance runners participated. Subjects performed 2 testing sessions consisting of a VO2max treadmill protocol to determine the heart rate at ventilatory threshold and a fatigue-inducing running bout at the identified ventilatory threshold heart rate. Kinematic data included knee flexion, pronation, time to maximum knee flexion, and time to maximum pronation. Acceleration data included shank acceleration, head acceleration, and shock attenuation. No significant differences resulted for the kinematic or acceleration variables. Although the results of this study do not support the original hypotheses, the influence of running fatigue on kinematics and accelerations remains inconclusive. Future research is necessary to examine fatigue-induced changes in running kinematics and accelerations and to determine the threshold at which point the changes may occur.
Hartnett, Michael; Ren, Lei
2013-04-01
This paper describes the application of Ensemble Optimal Interpolation (EnOI) with Monte Carlo (MC) simulation to surface current forecasting. The Environmental Fluid Dynamics Code (EFDC) is run for 7 days with initial and boundary conditions. For the assimilation process, Direct Insertion (DI), Optimal Interpolation (OI) and Ensemble Optimal Interpolation (EnOI) approaches are applied from t = 5.0 d, and wind forcing is switched off during the updating process. For Optimal Interpolation, the background error covariance is estimated from the first run using an empirical correlation function, while for Ensemble Optimal Interpolation, the background error covariance is calculated from an ensemble of first runs; the optimal ensemble size is determined by comparing different assimilations. Different strategies are proposed to obtain the measurement error covariance; the optimal measurement error covariance gives the smallest forecast error. Different kinds of pseudo-measurements are produced from Monte Carlo simulation by adding different types of perturbations, which obey certain distributions. A series of experiments with distinct perturbations is carried out to show the improvement in simulating the stochastic process. Three types of reference points (inside the assimilation area, outside the assimilation area, and on the boundary) are analyzed to show the improvement of the assimilation process and its influence after assimilation. This study also investigates the impact of the updating interval on the assimilation process; a felicitous updating interval is chosen by comparison. To compare the improvement of Ensemble Optimal Interpolation over Direct Insertion and Optimal Interpolation, the RMS error and data assimilation skill are calculated.
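The EnOI analysis step itself is a single linear update, x_a = x_b + K(y - H x_b), with the background error covariance B estimated once from a static ensemble of model states. A sketch in generic notation; the variable names are ours, and a real system would add localization and covariance tuning:

```python
import numpy as np

def enoi_update(x_b, ensemble, H, y, r_var, alpha=1.0):
    """EnOI analysis step: x_a = x_b + K (y - H x_b).

    B is estimated from ensemble anomalies (columns = members) and
    scaled by alpha; R is taken diagonal with variance r_var.
    """
    X = ensemble - ensemble.mean(axis=1, keepdims=True)  # anomalies
    B = alpha * (X @ X.T) / (X.shape[1] - 1)             # background cov.
    R = r_var * np.eye(len(y))                           # obs. error cov.
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)         # gain matrix
    return x_b + K @ (y - H @ x_b)
```

Because B is static, the cross-covariances in B spread an observed-variable innovation to unobserved state variables, which is how assimilating currents at a few points corrects the surrounding field.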
Jiang, Yi-fan; Chen, Chang-shui; Liu, Xiao-mei; Liu, Rong-ting; Liu, Song-hao
2015-04-01
To explore the characteristics of light propagation along the Pericardium Meridian and its surrounding areas at the human wrist, an optical experiment and the Monte Carlo method were used. An experiment was carried out to obtain the distribution of diffuse light on the Pericardium Meridian line and its surrounding areas at the wrist, and a simplified model based on the anatomical structure was then proposed to simulate light transport within the same area using the Monte Carlo method. The experimental results were in strong accordance with the Monte Carlo simulation: light propagation along the Pericardium Meridian had an advantage over its surrounding areas at the wrist. This advantage of light transport along the Pericardium Meridian line was related to the components and structure of the tissue, as well as to the anatomical structure of the area through which the Pericardium Meridian line runs.
Stolarski, R. S.; Butler, D. M.; Rundel, R. D.
1977-01-01
A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
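The propagation scheme is generic: sample each rate coefficient from an assumed uncertainty distribution, rerun the model, and read percentiles off the output distribution. A sketch with a stand-in model; the log-normal choice, the function names, and all numbers are our illustrative assumptions, not the paper's:

```python
import math
import random

def propagate(model, nominal, sigmas, n=2000, seed=0):
    """Monte Carlo uncertainty propagation.

    Each rate k_i is sampled log-normally around its nominal value
    (sigma_i is the standard deviation of ln k_i), the model is rerun,
    and the ~16th/84th percentiles of the output are returned as an
    approximate 1-sigma band.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        rates = [k * math.exp(rng.gauss(0.0, s))
                 for k, s in zip(nominal, sigmas)]
        outputs.append(model(rates))
    outputs.sort()
    return outputs[int(0.16 * n)], outputs[int(0.84 * n)]

# Toy "model": a ratio of two rates, each with 10% log-uncertainty.
lo, hi = propagate(lambda r: r[0] / r[1], [1.0, 1.0], [0.1, 0.1])
```

Reporting the band as multiplicative factors on the nominal output, as the abstract does ("a factor of 1.69 on the high side"), follows naturally from the log-normal sampling.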
GRS' research on clay rock in the Mont Terri underground laboratory
Energy Technology Data Exchange (ETDEWEB)
Wieczorek, Klaus; Czaikowski, Oliver [Gesellschaft fuer Anlagen- und Reaktorsicherheit gGmbH, Braunschweig (Germany)
2016-07-15
For constructing a nuclear waste repository and for ensuring the safety requirements are met over very long time periods, thorough knowledge about the safety-relevant processes occurring in the coupled system of waste containers, engineered barriers, and the host rock is indispensable. For respectively targeted research work, the Mont Terri rock laboratory is a unique facility where repository research is performed in a clay rock environment. It is run by 16 international partners, and a great variety of questions are investigated. Some of the work which GRS as one of the Mont Terri partners is involved in is presented in this article. The focus is on thermal, hydraulic and mechanical behaviour of host rock and/or engineered barriers.
Yasuda, Shugo
2015-01-01
A Monte Carlo simulation for chemotactic bacteria is developed on the basis of kinetic modeling, i.e., the Boltzmann transport equation, and applied to the one-dimensional traveling population wave in a microchannel. In this method, the Monte Carlo method, which calculates the run-and-tumble motions of bacteria, is coupled with a finite volume method to solve the macroscopic transport of the chemical cues in the field. The simulation method can successfully reproduce the traveling population wave of bacteria which was observed experimentally. The microscopic dynamics of bacteria, e.g., the velocity autocorrelation function and velocity distribution function of bacteria, are also investigated. It is found that the bacteria which form the traveling population wave create quasi-periodic motions as well as a migratory movement along with the traveling population wave. Simulations are also performed while varying the sensitivity and modulation parameters in the response function of bacteria. It is found th...
Energy Technology Data Exchange (ETDEWEB)
T.J. Urbatsch; T.M. Evans
2006-02-15
We have released Version 2 of Milagro, an object-oriented C++ code that performs radiative transfer using Fleck and Cummings' Implicit Monte Carlo method. Milagro, a part of the Jayenne program, is a stand-alone driver code used as a methods research vehicle and to verify its underlying classes. These underlying classes are used to construct Implicit Monte Carlo packages for external customers. Milagro-2 represents a design overhaul that allows better parallelism and extensibility. New features in Milagro-2 include verified momentum deposition, restart capability, graphics capability, exact energy conservation, and improved load balancing and parallel efficiency. A users' guide also describes how to configure, make, and run Milagro-2.
Quasi-Monte Carlo methods for lattice systems: A first look
Jansen, K.; Leovey, H.; Ammon, A.; Griewank, A.; Müller-Preussker, M.
2014-03-01
Carlo, and especially Markov chain Monte Carlo methods like the Metropolis or the hybrid Monte Carlo algorithm, have been used to calculate approximate solutions of the path integral. These algorithms often lead to the undesired effect of autocorrelation in the samples of observables and suffer in any case from the slow asymptotic error behavior proportional to N^(-1/2), where N is the number of samples. Solution method: This program applies the quasi-Monte Carlo approach and the reweighting technique (respectively the weighted uniform sampling method) to generate uncorrelated samples of observables of the anharmonic oscillator with an improved asymptotic error behavior. Unusual features: The application of the quasi-Monte Carlo approach is quite revolutionary in the field of lattice field theories. Running time: The running time depends directly on the number of samples N and the number of dimensions d. On modern computers a run with up to N = 2^16 = 65536 (including 9 replica runs) and d = 100 should not take much longer than one minute.
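A quasi-Monte Carlo estimate differs from the Markov chain approach only in where the points come from. A one-dimensional illustration using the base-2 van der Corput low-discrepancy sequence, chosen as a deliberately minimal stand-in for the higher-dimensional rule sets used for actual lattice systems:

```python
def van_der_corput(n, base=2):
    """n-th element of the base-b van der Corput low-discrepancy sequence,
    obtained by reversing the base-b digits of n about the radix point."""
    q, denom = 0.0, 1.0
    while n:
        n, rem = divmod(n, base)
        denom *= base
        q += rem / denom
    return q

def qmc_integrate(f, n):
    """Quasi-Monte Carlo estimate of the integral of f over [0, 1]:
    an equal-weight average over deterministic low-discrepancy points."""
    return sum(f(van_der_corput(i)) for i in range(1, n + 1)) / n
```

For smooth integrands the error of this average decays roughly like log(N)/N rather than the N^(-1/2) of independent sampling, which is the improvement the abstract refers to.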
Massively parallel Monte Carlo for many-particle simulations on GPUs
Anderson, Joshua A; Grubb, Thomas L; Engel, Michael; Glotzer, Sharon C
2013-01-01
Current trends in parallel processors call for the design of efficient massively parallel algorithms for scientific computing. Parallel algorithms for Monte Carlo simulations of thermodynamic ensembles of particles have received little attention because of the inherent serial nature of the statistical sampling. In this paper, we present a massively parallel method that obeys detailed balance and implement it for a system of hard disks on the GPU. We reproduce results of serial high-precision Monte Carlo runs to verify the method. This is a good test case because the hard disk equation of state over the range where the liquid transforms into the solid is particularly sensitive to small deviations away from the balance conditions. On a GeForce GTX 680, our GPU implementation executes 95 times faster than on a single Intel Xeon E5540 CPU core, enabling 17 times better performance per dollar and cutting energy usage by a factor of 10.
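The serial baseline that such parallel implementations are verified against can be sketched as follows. This is a minimal illustrative hard-disk Metropolis sampler, not the GPU checkerboard algorithm of the paper; box size, density and move width are arbitrary choices:

```python
import random

def overlaps(pos, i, trial, sigma, L):
    """Does disk i placed at `trial` overlap any other disk? (periodic box, side L)"""
    for j, (xj, yj) in enumerate(pos):
        if j == i:
            continue
        dx = (trial[0] - xj + L / 2) % L - L / 2   # minimum-image convention
        dy = (trial[1] - yj + L / 2) % L - L / 2
        if dx * dx + dy * dy < sigma * sigma:
            return True
    return False

def sweep(pos, sigma, L, delta, rng):
    """One Metropolis sweep: for hard disks, any non-overlapping move is accepted."""
    accepted = 0
    for i in range(len(pos)):
        x, y = pos[i]
        trial = ((x + rng.uniform(-delta, delta)) % L,
                 (y + rng.uniform(-delta, delta)) % L)
        if not overlaps(pos, i, trial, sigma, L):
            pos[i] = trial
            accepted += 1
    return accepted

rng = random.Random(1)
L, sigma = 10.0, 1.0
# dilute starting configuration: 16 disks on a square lattice
pos = [(1.25 + 2.5 * ix, 1.25 + 2.5 * iy) for ix in range(4) for iy in range(4)]
acc = sum(sweep(pos, sigma, L, 0.3, rng) for _ in range(100))
```

The no-overlap condition is an invariant of the chain, which is exactly the kind of property a parallel implementation must preserve while updating many disks at once.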
O'Hagan, Anthony; Stevenson, Matt; Madan, Jason
2007-10-01
Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involve running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
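The core two-level idea, separating the parameter-uncertainty component of output variance from patient-level sampling noise using the algebra of analysis of variance, can be sketched as follows. The model, `simulate_run`, and all numbers are hypothetical stand-ins, not the authors' formulae:

```python
import random
import statistics

def simulate_run(theta, n_patients, sigma_pat, rng):
    """One PSA run: a single parameter draw applied to n simulated patients."""
    return [theta + rng.gauss(0.0, sigma_pat) for _ in range(n_patients)]

def psa_anova(n_runs, n_patients, tau, sigma_pat, seed=0):
    """Two-level ANOVA decomposition of output variance:
    between-run variance ~ tau^2 + sigma_pat^2 / n_patients,
    mean within-run variance ~ sigma_pat^2."""
    rng = random.Random(seed)
    means, withins = [], []
    for _ in range(n_runs):
        theta = rng.gauss(0.0, tau)              # parameter uncertainty
        y = simulate_run(theta, n_patients, sigma_pat, rng)
        means.append(statistics.fmean(y))
        withins.append(statistics.variance(y))
    between = statistics.variance(means)
    within = statistics.fmean(withins)
    tau2_hat = between - within / n_patients     # parameter-variance estimate
    return tau2_hat, within

tau2_hat, within_hat = psa_anova(n_runs=200, n_patients=50, tau=1.0, sigma_pat=2.0)
```

Subtracting the patient-noise contribution from the between-run variance recovers the parameter variance even when each run uses only a modest number of simulated patients.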
Energy Technology Data Exchange (ETDEWEB)
Bruscino, Nello; Cristinziani, Markus; Ghneimat, Mazuza; Heer, Sebastian; Kostyukhin, Vadim; Machefer, Evan; Mijovic, Liza; Yau Wong, Kaven [Physikalisches Institut, Universitaet Bonn (Germany)
2015-07-01
Predictions from several Monte-Carlo generators are compared for t anti t production. The predictions are also compared to the data taken by ATLAS in 2011 at √s = 7 TeV. The focus is on observables sensitive to additional parton radiation: jet multiplicities and gap-fraction observables. Generators that have been used for ATLAS analyses of the data collected in the first LHC proton physics run, as well as new generators that will be used in the upcoming LHC run, are included. The goal of the work is to collect information and studies for discussions between the communities of the ATLAS and CMS experiments and colleagues from theory.
Approaching Chemical Accuracy with Quantum Monte Carlo
Petruzielo, F R; Umrigar, C J
2012-01-01
A quantum Monte Carlo study of the atomization energies for the G2 set of molecules is presented. Basis size dependence of diffusion Monte Carlo atomization energies is studied with a single determinant Slater-Jastrow trial wavefunction formed from Hartree-Fock orbitals. With the largest basis set, the mean absolute deviation from experimental atomization energies for the G2 set is 3.0 kcal/mol. Optimizing the orbitals within variational Monte Carlo improves the agreement between diffusion Monte Carlo and experiment, reducing the mean absolute deviation to 2.1 kcal/mol. Moving beyond a single determinant Slater-Jastrow trial wavefunction, diffusion Monte Carlo with a small complete active space Slater-Jastrow trial wavefunction results in near chemical accuracy. In this case, the mean absolute deviation from experimental atomization energies is 1.2 kcal/mol. It is shown from calculations on systems containing phosphorus that the accuracy can be further improved by employing a larger active space.
Biomechanics of sprint running. A review.
Mero, A; Komi, P V; Gregor, R J
1992-06-01
Understanding of biomechanical factors in sprint running is useful because of their critical value to performance. Some variables measured in distance running are also important in sprint running. Significant factors include: reaction time, technique, electromyographic (EMG) activity, force production, neural factors and muscle structure. Although various methodologies have been used, results are clear and conclusions can be made. The reaction time of good athletes is short, but it does not correlate with performance levels. Sprint technique has been well analysed during the acceleration, constant-velocity and deceleration phases of the velocity curve. At the beginning of the sprint run, it is important to produce great force/power and generate high velocity in the block and acceleration phases. During the constant-speed phase, the events immediately before and during the braking phase are important in increasing explosive force/power and efficiency of movement in the propulsion phase. There are no research results available regarding force production in the sprint-deceleration phase. The EMG activity pattern of the main sprint muscles is described in the literature, but there is a need for research with highly skilled sprinters to better understand the simultaneous operation of many muscles. Skeletal muscle fibre characteristics are related to the selection of talent and the training-induced effects in sprint running. Efficient sprint running requires an optimal combination of the examined biomechanical variables and external factors such as footwear, ground and air resistance. Further research work is needed, especially in the areas of the nervous system, the muscles, and force and power production during sprint running. By combining these with measurements of sprinting economy and efficiency, more knowledge can be gained in the near future.
The ATLAS Tau Trigger Performance during LHC Run 1 and Prospects for Run 2
Mitani, T; The ATLAS collaboration
2016-01-01
The ATLAS tau trigger is designed to select hadronic decays of tau leptons. The tau lepton plays an important role in Standard Model (SM) physics, such as in Higgs boson decays. It is also important in beyond-the-SM (BSM) scenarios, such as supersymmetry and exotic models, as tau leptons are often produced preferentially there. During the 2010-2012 LHC run (Run 1), the tau trigger operated successfully, leading to several rewarding results such as evidence for $H\rightarrow \tau\tau$. From the 2015 LHC run (Run 2), the LHC is upgraded and the number of overlapping interactions per bunch crossing (pile-up) is expected to increase by a factor of two. It will be challenging to control trigger rates while keeping interesting physics events. This paper summarizes the tau trigger performance in Run 1 and its prospects for Run 2.
MONTE-CARLO BURNUP CALCULATION UNCERTAINTY QUANTIFICATION AND PROPAGATION DETERMINATION
Energy Technology Data Exchange (ETDEWEB)
Nichols, T.; Sternat, M.; Charlton, W.
2011-05-08
MONTEBURNS is a Monte-Carlo depletion routine utilizing MCNP and ORIGEN 2.2. Uncertainties exist in the MCNP transport calculation, but this information is neither passed to the depletion calculation in ORIGEN nor saved. To quantify this transport uncertainty and determine how it propagates between burnup steps, a statistical analysis of multiple repeated depletion runs is performed. The reactor model chosen is the Oak Ridge Research Reactor (ORR) in a single-assembly, infinite-lattice configuration. This model was burned for a 25.5-day cycle broken down into three steps. The output isotopics as well as the effective multiplication factor (k-effective) were tabulated, and histograms were created at each burnup step using the Scott method to determine the bin width. It was expected that the gram-quantity and k-effective histograms would be normally distributed since they were produced from a Monte-Carlo routine, but some of the results are not. The standard deviation at each burnup step was consistent between fission-product isotopes as expected, while the uranium isotopes produced some unique results. The variation in the quantity of uranium was small enough that, from the reaction-rate MCNP tally, round-off error occurred, producing a set of repeated results with slight variation. Statistical analyses were performed using the χ² test against a normal distribution for several isotopes and for the k-effective results. While the isotope results failed to reject the null hypothesis of normality, the χ² statistic grew through the steps in the k-effective test, and the null hypothesis was rejected in the later steps. These results suggest that, for a high-accuracy solution, MCNP cell material quantities of less than 100 grams and larger kcode parameters are needed to minimize uncertainty propagation and round-off effects.
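The histogramming step can be sketched as follows: repeated run outputs are binned with Scott's rule and compared to a fitted normal distribution via a χ² statistic. The k-effective values below are synthetic stand-ins, not ORR results:

```python
import math
import random

def scott_bins(data):
    """Scott's rule bin width: h = 3.49 * s * n^(-1/3)."""
    n = len(data)
    mean = sum(data) / n
    s = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    h = 3.49 * s * n ** (-1.0 / 3.0)
    lo = min(data)
    k = max(1, math.ceil((max(data) - lo) / h))
    return lo, h, k

def normal_cdf(x, mu, s):
    return 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))

def chi2_vs_normal(data):
    """Chi-square statistic of the Scott-rule histogram against a fitted normal."""
    n = len(data)
    mu = sum(data) / n
    s = (sum((x - mu) ** 2 for x in data) / (n - 1)) ** 0.5
    lo, h, k = scott_bins(data)
    counts = [0] * k
    for x in data:
        counts[min(int((x - lo) / h), k - 1)] += 1
    chi2 = 0.0
    for i in range(k):
        p = normal_cdf(lo + (i + 1) * h, mu, s) - normal_cdf(lo + i * h, mu, s)
        expected = n * p
        if expected > 0:
            chi2 += (counts[i] - expected) ** 2 / expected
    return chi2, k

# synthetic stand-in for k-effective from 300 repeated runs
rng = random.Random(42)
keff = [rng.gauss(1.002, 0.0005) for _ in range(300)]
chi2, k = chi2_vs_normal(keff)
```

For genuinely normal data the statistic stays near the number of bins; a value growing step by step, as reported for k-effective, signals departure from normality.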
GATE Monte Carlo simulation in a cloud computing environment
Rowedder, Blake Austin
The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and rise in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
The design of the run Clever randomized trial
DEFF Research Database (Denmark)
Ramskov, Daniel; Nielsen, Rasmus Oestergaard; Sørensen, Henrik
2016-01-01
evidence-based running schedules to minimize the risk of injury. The existing literature on running volume, running intensity and the development of injuries shows conflicting results. This may be related to previously applied study designs, the methods used to quantify the performed running… and the statistical analysis of the collected data. The aim of the Run Clever trial is to investigate whether a focus on running intensity compared with a focus on running volume in a running schedule influences the overall injury risk differently. METHODS/DESIGN: The Run Clever trial is a randomized trial with a 24-week…
Energy Technology Data Exchange (ETDEWEB)
Vergnaud, T.; Nimal, J.C. (CEA Centre d' Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette (France))
1990-01-01
The three-dimensional polykinetic Monte Carlo particle transport code TRIPOLI has been under development in the French Shielding Laboratory at Saclay since 1965. TRIPOLI-1 began to run in 1970 and became TRIPOLI-2 in 1978; since then its capabilities have been improved and many studies have been performed. TRIPOLI can treat stationary or time-dependent problems in shielding and in neutronics. Some examples of solved problems are presented to demonstrate the many possibilities of the system. (author).
Energy Technology Data Exchange (ETDEWEB)
Graf, Peter A.; Stewart, Gordon; Lackner, Matthew; Dykes, Katherine; Veers, Paul
2016-05-01
Long-term fatigue loads for floating offshore wind turbines are hard to estimate because they require the evaluation of the integral of a highly nonlinear function over a wide variety of wind and wave conditions. Current design standards involve scanning over a uniform rectangular grid of metocean inputs (e.g., wind speed and direction and wave height and period), which becomes intractable in high dimensions as the number of required evaluations grows exponentially with dimension. Monte Carlo integration offers a potentially efficient alternative because it has theoretical convergence proportional to the inverse of the square root of the number of samples, independent of dimension. In this paper, we first report on the integration of the aeroelastic code FAST into NREL's systems engineering tool, WISDEM, and the development of a high-throughput pipeline capable of sampling from arbitrary distributions, running FAST on a large scale, and postprocessing the results into estimates of fatigue loads. Second, we use this tool to run a variety of studies comparing grid-based and Monte Carlo-based approaches to calculating long-term fatigue loads. We observe that for more than a few dimensions, the Monte Carlo approach can represent a large improvement in computational efficiency, but that as nonlinearity increases, the effectiveness of Monte Carlo is correspondingly reduced. The present work sets the stage for future research focusing on using advanced statistical methods for analysis of wind turbine fatigue as well as extreme loads.
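The grid-versus-Monte-Carlo comparison can be sketched in miniature: both estimate the mean of a nonlinear response over uniformly distributed inputs, but the grid cost grows as m^d while the Monte Carlo error shrinks as 1/√N independent of d. The `damage` function is a hypothetical stand-in, not a FAST load calculation:

```python
import random
from itertools import product

def damage(x):
    """Hypothetical nonlinear load response of the metocean inputs."""
    return sum(xi ** 3 for xi in x) + x[0] * x[-1]

def grid_mean(f, d, m):
    """Tensor-product midpoint grid on [0,1]^d: cost is m**d evaluations."""
    pts = [(i + 0.5) / m for i in range(m)]
    vals = [f(list(p)) for p in product(pts, repeat=d)]
    return sum(vals) / len(vals)

def mc_mean(f, d, n, seed=0):
    """Monte Carlo estimate: error ~ 1/sqrt(n), independent of d."""
    rng = random.Random(seed)
    return sum(f([rng.random() for _ in range(d)]) for _ in range(n)) / n

d = 4
g_est = grid_mean(damage, d, 5)      # 5**4 = 625 evaluations
m_est = mc_mean(damage, d, 625)      # same evaluation budget
# exact mean over uniform inputs for this toy response: d/4 + 1/4 = 1.25
```

At d = 4 the budgets match; doubling the dimension would multiply the grid cost by 5^4 while leaving the Monte Carlo error essentially unchanged for the same n.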
Short-run and long-run effect of oil consumption on economic growth: ECM model
Directory of Open Access Journals (Sweden)
Sofyan Syahnur
2014-04-01
The aim of this study is to investigate the effect of oil consumption on the economic growth of Aceh in the long run and short run by using an Error Correction Model (ECM) over the period 1985–2008, before the fall in world commodity prices. Four types of oil consumption are considered: avtur (jet fuel), gasoline, kerosene and diesel. The data are collected from the Central Bureau of Statistics of Aceh (BPS Aceh). The results show a positive effect on economic growth in Aceh only for diesel consumption, in both the short run and the long run.
Random Numbers and Monte Carlo Methods
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
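The Metropolis algorithm mentioned above can be sketched for a one-particle toy problem: sampling the Boltzmann distribution of the illustrative potential V(x) = x² and estimating the thermal average ⟨x²⟩ (potential, temperature and step size are arbitrary choices, not taken from the book):

```python
import math
import random

def metropolis(n_steps, beta=1.0, step=1.0, seed=0):
    """Metropolis sampling of the Boltzmann weight p(x) ~ exp(-beta*V(x))
    for the toy potential V(x) = x^2."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # accept with probability min(1, exp(-beta * (V_new - V_old)))
        if rng.random() < math.exp(-beta * (x_new ** 2 - x ** 2)):
            x = x_new
        samples.append(x)
    return samples

samples = metropolis(50_000)
# thermal average <x^2>; the exact value for beta = 1 is 0.5
x2_mean = sum(s * s for s in samples) / len(samples)
```

Note that the current position is appended even on rejection; dropping rejected steps would bias the sampled distribution.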
SMCTC: Sequential Monte Carlo in C++
Directory of Open Access Journals (Sweden)
Adam M. Johansen
2009-04-01
Sequential Monte Carlo methods are a very general class of Monte Carlo methods for sampling from sequences of distributions. Simple examples of these algorithms are used very widely in the tracking and signal processing literature. Recent developments illustrate that these techniques have much more general applicability, and can be applied very effectively to statistical inference problems. Unfortunately, these methods are often perceived as being computationally expensive and difficult to implement. This article seeks to address both of these problems. A C++ template class library for the efficient and convenient implementation of very general Sequential Monte Carlo algorithms is presented. Two example applications are provided: a simple particle filter for illustrative purposes and a state-of-the-art algorithm for rare event estimation.
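The "simple particle filter" use case can be sketched in a few lines, in Python rather than C++ and for an illustrative linear-Gaussian model; this shows the generic bootstrap filter, not the SMCTC API:

```python
import math
import random

def particle_filter(ys, n=500, a=0.9, q=1.0, r=0.5, seed=0):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r):
    propagate particles through the dynamics, weight by the observation
    likelihood, form the estimate, then resample."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in ys:
        xs = [a * x + rng.gauss(0.0, math.sqrt(q)) for x in xs]   # predict
        ws = [math.exp(-(y - x) ** 2 / (2.0 * r)) for x in xs]    # weight
        total = sum(ws)
        estimates.append(sum(w * x for w, x in zip(ws, xs)) / total)
        xs = rng.choices(xs, weights=ws, k=n)                     # resample
    return estimates

# synthetic trajectory to filter
rng = random.Random(1)
x, true_xs, ys = 0.0, [], []
for _ in range(50):
    x = 0.9 * x + rng.gauss(0.0, 1.0)
    true_xs.append(x)
    ys.append(x + rng.gauss(0.0, math.sqrt(0.5)))
est = particle_filter(ys)
rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(est, true_xs)) / len(est))
```

The resampling step is what distinguishes sequential Monte Carlo from plain importance sampling: it prunes low-weight particles before the weights degenerate.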
Shell model the Monte Carlo way
Energy Technology Data Exchange (ETDEWEB)
Ormand, W.E.
1995-03-01
The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.
Quantum Monte Carlo with variable spins.
Melton, Cody A; Bennett, M Chandler; Mitas, Lubos
2016-06-28
We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo, we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn2 molecules, as well as the electron affinities of the 6p row elements in close agreement with experiments.
Quantum Monte Carlo with Variable Spins
Melton, Cody A; Mitas, Lubos
2016-01-01
We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo (FPSODMC), we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn2 molecules, as well as the electron affinities of the 6p row elements in close agreement with experiments.
CosmoPMC: Cosmology Population Monte Carlo
Kilbinger, Martin; Cappe, Olivier; Cardoso, Jean-Francois; Fort, Gersende; Prunet, Simon; Robert, Christian P; Wraith, Darren
2011-01-01
We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the wall-clock time required for sampling scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Monte-Carlo Markov chain (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.
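The adaptive importance sampling idea behind PMC can be sketched for a one-dimensional toy target: at each iteration the Gaussian proposal's mean and spread are refitted from the normalized importance weights. This illustrates the generic PMC scheme, not CosmoPMC's implementation; the target below is a hypothetical stand-in for a posterior:

```python
import math
import random

SQRT_2PI = math.sqrt(2.0 * math.pi)

def target(x):
    """Unnormalized stand-in 'posterior': Gaussian with mean 3, sd 1."""
    return math.exp(-0.5 * (x - 3.0) ** 2)

def pmc(n_iter=5, n_samples=2000, seed=0):
    """Population Monte Carlo: sample from the current Gaussian proposal,
    compute self-normalized weights w = target/proposal, then refit the
    proposal's mean and spread from the weighted sample."""
    rng = random.Random(seed)
    mu, sd = 0.0, 5.0                       # deliberately poor starting proposal
    for _ in range(n_iter):
        xs = [rng.gauss(mu, sd) for _ in range(n_samples)]
        qs = [math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * SQRT_2PI) for x in xs]
        ws = [target(x) / q for x, q in zip(xs, qs)]
        total = sum(ws)
        ws = [w / total for w in ws]        # self-normalized weights
        mu = sum(w * x for w, x in zip(ws, xs))
        sd = math.sqrt(sum(w * (x - mu) ** 2 for w, x in zip(ws, xs)))
    return mu, sd

mu, sd = pmc()
```

After a few iterations the proposal collapses onto the target, which is what keeps the importance weights well behaved and the evidence estimate stable.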
Quantum speedup of Monte Carlo methods.
Montanaro, Ashley
2015-09-08
Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.
Adiabatic optimization versus diffusion Monte Carlo methods
Jarret, Michael; Jordan, Stephen P.; Lackey, Brad
2016-10-01
Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1 and L2 normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k-SAT at k = 2, 3, 4.
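A minimal diffusion Monte Carlo of the kind analyzed here, walkers diffusing and then branching with weight exp(-(V - E_T)Δt), can be sketched for the harmonic oscillator, whose known ground-state energy E0 = 1/2 makes it a convenient check. This is a generic DMC sketch with illustrative parameters, not the paper's Substochastic Monte Carlo algorithm:

```python
import math
import random

def dmc_harmonic(n_walkers=400, n_steps=1500, dt=0.02, seed=0):
    """Minimal diffusion Monte Carlo for V(x) = x^2/2 (hbar = m = 1).
    Walkers take Gaussian diffusion steps, then branch with weight
    exp(-(V - E_T)*dt); E_T is steered to hold the population steady,
    so its running average converges to the ground-state energy E0 = 0.5."""
    rng = random.Random(seed)
    walkers = [rng.uniform(-1.0, 1.0) for _ in range(n_walkers)]
    e_t, trace = 0.4, []
    for step in range(n_steps):
        new = []
        for x in walkers:
            x += rng.gauss(0.0, math.sqrt(dt))            # free diffusion
            w = math.exp(-(0.5 * x * x - e_t) * dt)       # branching weight
            for _ in range(int(w + rng.random())):        # stochastic rounding
                new.append(x)
        walkers = new or [0.0]                            # guard against extinction
        e_t += 0.1 * math.log(n_walkers / len(walkers))   # population control
        if step >= 500:                                   # discard equilibration
            trace.append(e_t)
    return sum(trace) / len(trace)

e0 = dmc_harmonic()
```

The walker population samples the ground-state amplitude in the L1 sense, which is precisely the normalization mismatch the paper identifies as the source of the obstructions.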
Self-learning Monte Carlo method
Liu, Junwei; Qi, Yang; Meng, Zi Yang; Fu, Liang
2017-01-01
Monte Carlo simulation is an unbiased numerical tool for studying classical and quantum many-body systems. One of its bottlenecks is the lack of a general and efficient update algorithm for large systems close to a phase transition, where local updates perform badly. In this Rapid Communication, we propose a general-purpose Monte Carlo method, dubbed self-learning Monte Carlo (SLMC), in which an efficient update algorithm is first learned from training data generated in trial simulations and then used to speed up the actual simulation. We demonstrate the efficiency of SLMC in a spin model at the phase transition point, achieving a 10-20 times speedup.
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for master's or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment in the Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Cross-training and periodization in running.
Brennan, D K; Wilder, R P
1996-01-01
Understanding the principles of cross-training and periodization will assist the coach and team physician in designing training programs that maximize performance while minimizing risk of injury. Cross-training is defined as simultaneous training for two or more sports, or the use of multiple modes of training to enhance performance in one particular sport. This manuscript reviews the benefits of three commonly used forms of cross-training (deep water running, cycling and swimming) for running training and performance. Periodization refers to the process of designing a progressive and appropriate training plan in order to optimize performance yet minimize injury related to overtraining. The main structural components of periodization are macrocycles, mesocycles and microcycles. Physiological determinants of distance running performance, including VO2 max, lactate threshold and running economy, are presented as key components for the design of endurance training programs. Training intensity can be prescribed or monitored using running speed, heart rate, and rating of perceived exertion (RPE). The clinician must often make recommendations regarding the appropriate level of training or offer an alternative. By understanding the principles of cross-training and periodization, the clinician can assist the coach or athlete in preventing injury as well as in attaining peak performance.
Institute of Scientific and Technical Information of China (English)
李剑慧; 臧斌宇; 吴蓉; 朱传琪
2002-01-01
Parallelizing compilers have made great progress in recent years. However, there still remains a gap between the current ability of parallelizing compilers and their final goals. In order to achieve the maximum parallelism, run-time techniques have been used in parallelizing compilers during the last few years. First, this paper presents a basic run-time privatization method. The definition of run-time dead code is given and its side effect is discussed. To eliminate the imprecision caused by run-time dead code, backward data-flow information must be used. The Proteus Test, which can use backward information at run time, is then presented to exploit more dynamic parallelism. Also, a variation of the Proteus Test, the Advanced Proteus Test, is offered to achieve partial parallelism. The Proteus Test was implemented in the parallelizing compiler AFT. At the end of this paper the program fpppp.f of the SPEC95fp benchmark is taken as an example to show the effectiveness of the Proteus Test.
The Run-2 ATLAS Trigger System
Ruiz Martínez, A.; ATLAS Collaboration
2016-10-01
The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009 and 2013 at centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 trigger and a software-based high level trigger (HLT) that reduce the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and higher luminosity, resulting in up to five times higher rates of processes of interest. A brief review is given of the ATLAS trigger system upgrades implemented between Run-1 and Run-2, allowing it to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module and the merging of the previously two-level HLT system into a single event processing farm. A few examples are shown, such as the impressive performance improvements in the HLT trigger algorithms used to identify leptons, hadrons and global event quantities like missing transverse energy. Finally, the status of the commissioning of the trigger system and its performance during the 2015 run are presented.
Exercise economy in skiing and running
Directory of Open Access Journals (Sweden)
Thomas Losnegard
2014-01-01
Substantial inter-individual variations in exercise economy exist even in highly trained endurance athletes. The variation is believed to be determined partly by intrinsic factors. Therefore, in the present study, we compared exercise economy in V2-skating, double poling and uphill running. Ten highly trained male cross-country skiers (23 ± 3 years, 180 ± 6 cm, 75 ± 8 kg, VO2peak running: 76.3 ± 5.6 mL·kg-1·min-1) participated in the study. Exercise economy and VO2peak during treadmill running, ski skating (V2 technique) and double poling were compared based on correlation analysis, with subsequent criteria for interpreting the magnitude of the correlation (r). There was a very large correlation in exercise economy between V2-skating and double poling (r = 0.81) and large correlations between V2-skating and running (r = 0.53) and between double poling and running (r = 0.58). There were trivial to moderate correlations between exercise economy and VO2peak (r = 0.00-0.23), cycle rate (r = 0.03-0.46), body mass (r = -0.09-0.46) and body height (r = 0.11-0.36). In conclusion, the inter-individual variation in exercise economy could only moderately be explained by differences in VO2peak, body mass and body height, and we therefore suggest that other intrinsic factors contribute to the variation in exercise economy between highly trained subjects.
Exercise economy in skiing and running.
Losnegard, Thomas; Schäfer, Daniela; Hallén, Jostein
2014-01-01
Substantial inter-individual variations in exercise economy exist even in highly trained endurance athletes. The variation is believed to be determined partly by intrinsic factors. Therefore, in the present study, we compared exercise economy in V2-skating, double poling, and uphill running. Ten highly trained male cross-country skiers (23 ± 3 years, 180 ± 6 cm, 75 ± 8 kg, VO2peak running: 76.3 ± 5.6 mL·kg(-1)·min(-1)) participated in the study. Exercise economy and VO2peak during treadmill running, ski skating (V2 technique) and double poling were compared based on correlation analysis. There was a very large correlation in exercise economy between V2-skating and double poling (r = 0.81) and large correlations between V2-skating and running (r = 0.53) and double poling and running (r = 0.58). There were trivial to moderate correlations between exercise economy and the intrinsic factors VO2peak (r = 0.00-0.23), cycle rate (r = 0.03-0.46), body mass (r = -0.09-0.46) and body height (r = 0.11-0.36). In conclusion, the inter-individual variation in exercise economy could be explained only moderately by differences in VO2peak, body mass and body height. Apparently other intrinsic factors contribute to the variation in exercise economy between highly trained subjects.
Muon and B-physics trigger of the ATLAS experiment in Run 2
Smirnova, L. N.; Turchikhin, S. M.
2017-09-01
This paper presents an overview of the muon trigger and the B-physics trigger of the ATLAS experiment. The main updates made during the preparation for Run 2 data-taking are outlined. Trigger performance results obtained with new experimental data and modelling are shown.
RHIC performance for FY2011 Au+Au heavy ion run
Energy Technology Data Exchange (ETDEWEB)
Marr, G.; Ahrens, L.; Bai, M.; Beebe-Wang, J.; Blackler, I.; Blaskiewicz, M.; Brennan, J.M.; Brown, K.A.; Bruno, D.; Butler, J.; Carlson, C.; Connolly, R.; D' Ottavio, T.; Drees, K.A.; Fedotov, A.V.; Fischer, W.; Fu, W.; Gardner, C.J.; Gassner, D.M.; Glenn, J.W.; Gu, X.; Harvey, M.; Hayes, T.; Hoff, L.; Huang, H.; Ingrassia, P.F.; Jamilkowski, J.P.; Kling, N.; Lafky, M.; Laster, J.S.; Liu, C.; Luo, Y.; Mapes, M.; Marusic, A.; Mernick, K.; Michnoff, R.J.; Minty, M.G.; Montag, C.; Morris, J.; Naylor, C.; Nemesure, S.; Polizzo, S.; Ptitsyn, V.; Robert-Demolaize, G.; Roser, T.; Sampson, P.; Sandberg, J.; Schoefer, V.; Schultheiss, C.; Severino, F.; Shrey, T.; Smith, K.; Steski, D.; Tepikian, S.; Thieberger, P.; Trbojevic, D.; Tsoupas, N.; Tuozzolo, J.E.; VanKuik, B.; Wang, G.; Wilinski, M.; Zaltsman, A.; Zeno, K.; Zhang, S.Y.
2011-09-04
Following the Fiscal Year (FY) 2010 (Run-10) Relativistic Heavy Ion Collider (RHIC) Au+Au run, RHIC experiment upgrades sought to improve detector capabilities. In turn, accelerator improvements were made to improve the luminosity available to the experiments for this run (Run-11). These improvements included: a redesign of the stochastic cooling systems for improved reliability; a relocation of 'common' RF cavities to alleviate intensity limits due to beam loading; and an improved usage of feedback systems to control orbit, tune and coupling during energy ramps as well as while colliding at top energy. We present an overview of changes to the Collider and review the performance of the collider with respect to instantaneous and integrated luminosity goals. At the conclusion of the FY 2011 polarized proton run, preparations for the heavy ion run proceeded on April 18, with Au+Au collisions continuing through June 28. Our standard operations at 100 GeV/nucleon beam energy were bracketed by two shorter periods of collisions at lower energies (9.8 and 13.5 GeV/nucleon), continuing a previously established program of low and medium energy runs. Table 1 summarizes our history of heavy ion operations at RHIC.
Parallel Markov chain Monte Carlo simulations.
Ren, Ruichao; Orkoulas, G
2007-06-07
With strict detailed balance, parallel Monte Carlo simulation through domain decomposition cannot be validated with conventional Markov chain theory, which describes an intrinsically serial stochastic process. In this work, the parallel version of Markov chain theory and its role in accelerating Monte Carlo simulations via cluster computing is explored. It is shown that sequential updating is the key to improving efficiency in parallel simulations through domain decomposition. A parallel scheme is proposed to reduce interprocessor communication or synchronization, which slows down parallel simulation with increasing number of processors. Parallel simulation results for the two-dimensional lattice gas model show substantial reduction of simulation time for systems of moderate and large size.
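The sequential-updating idea can be sketched in a few lines: sub-domains are updated in alternating phases, so neighbouring domains never update at the same time, which is what preserves validity across domain boundaries. The model, parameters and names below are illustrative assumptions, not taken from the paper.

```python
import math
import random

# Illustrative sketch (not the authors' code): domain decomposition of a
# 1D ideal lattice gas with *sequential* sub-domain updating. Domains are
# updated in alternating phases; in a real parallel code each phase would
# run concurrently on different processors, with a synchronisation barrier
# between phases.
random.seed(0)
L, BETA, MU = 16, 1.0, 0.5            # sites, inverse temperature, chem. potential
occ = [random.randint(0, 1) for _ in range(L)]

def metropolis_flip(i):
    """Attempt to flip the occupancy of site i (energy E = -MU * N)."""
    d_e = -MU * (1 - 2 * occ[i])      # energy change of flipping site i
    if d_e <= 0 or random.random() < math.exp(-BETA * d_e):
        occ[i] ^= 1

domains = [range(0, L // 2), range(L // 2, L)]
for sweep in range(100):
    for phase in (0, 1):              # even-indexed domains, then odd-indexed
        for d, sites in enumerate(domains):
            if d % 2 == phase:
                for i in sites:
                    metropolis_flip(i)

print(sum(occ))                       # total occupation after the run
```

The point of the sketch is the phase schedule, not the physics: no site on a domain boundary is ever updated simultaneously with its neighbour in the adjacent domain.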
Monte Carlo Hamiltonian: Linear Potentials
Institute of Scientific and Technical Information of China (English)
LUO Xiang-Qian; Helmut KROEGER; et al.
2002-01-01
We further study the validity of the Monte Carlo Hamiltonian method. The advantage of the method, in comparison with the standard Monte Carlo Lagrangian approach, is its capability to study the excited states. We consider two quantum mechanical models: a symmetric one, V(x) = |x|/2; and an asymmetric one, V(x) = ∞ for x < 0 and V(x) = x/2 for x ≥ 0. The results for the spectrum, wave functions and thermodynamical observables are in agreement with the analytical or Runge-Kutta calculations.
Monte Carlo dose distributions for radiosurgery
Energy Technology Data Exchange (ETDEWEB)
Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica; Sanchez-Doblado, F. [Sevilla Univ. (Spain). Dept. Fisiologia Medica y Biofisica]|[Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Nunez, L. [Clinica Puerta de Hierro, Madrid (Spain). Servicio de Radiofisica; Arrans, R.; Sanchez-Calzado, J.A.; Errazquin, L. [Hospital Univ. Virgen Macarena, Sevilla (Spain). Servicio de Oncologia Radioterapica; Sanchez-Nieto, B. [Royal Marsden NHS Trust (United Kingdom). Joint Dept. of Physics]|[Inst. of Cancer Research, Sutton, Surrey (United Kingdom)
2001-07-01
The precision of radiosurgery treatment planning systems is limited by the approximations of their algorithms and by their dosimetric input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative, as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distribution of both a planning system and Monte Carlo. Relative shifts have been measured and, furthermore, Dose-Volume Histograms have been calculated for the target and adjacent organs at risk. (orig.)
Monte carlo simulations of organic photovoltaics.
Groves, Chris; Greenham, Neil C
2014-01-01
Monte Carlo simulations are a valuable tool to model the generation, separation, and collection of charges in organic photovoltaics where charges move by hopping in a complex nanostructure and Coulomb interactions between charge carriers are important. We review the Monte Carlo techniques that have been applied to this problem, and describe the results of simulations of the various recombination processes that limit device performance. We show how these processes are influenced by the local physical and energetic structure of the material, providing information that is useful for design of efficient photovoltaic systems.
Monte Carlo simulation of neutron scattering instruments
Energy Technology Data Exchange (ETDEWEB)
Seeger, P.A.
1995-12-31
A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width.
The Rational Hybrid Monte Carlo Algorithm
Clark, M A
2006-01-01
The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, where Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions, etc.). We review the algorithm and some of these benefits, and we compare it against other recent algorithmic developments. We conclude with an update of the Berlin Wall plot comparing the costs of all popular fermion formulations.
The Rational Hybrid Monte Carlo algorithm
Clark, Michael
2006-12-01
The past few years have seen considerable progress in algorithmic development for the generation of gauge fields including the effects of dynamical fermions. The Rational Hybrid Monte Carlo (RHMC) algorithm, where Hybrid Monte Carlo is performed using a rational approximation in place of the usual inverse quark matrix kernel, is one of these developments. This algorithm has been found to be extremely beneficial in many areas of lattice QCD (chiral fermions, finite temperature, Wilson fermions, etc.). We review the algorithm and some of these benefits, and we compare it against other recent algorithmic developments. We conclude with an update of the Berlin Wall plot comparing the costs of all popular fermion formulations.
Jefferson Lab Data Acquisition Run Control System
Energy Technology Data Exchange (ETDEWEB)
Vardan Gyurjyan; Carl Timmer; David Abbott; William Heyes; Edward Jastrzembski; David Lawrence; Elliott Wolin
2004-10-01
A general overview of the Jefferson Lab data acquisition run control system is presented. This run control system is designed to operate the configuration, control, and monitoring of all Jefferson Lab experiments. It controls data-taking activities by coordinating the operation of DAQ sub-systems, online software components and third-party software such as external slow control systems. The main, unique feature which sets this system apart from conventional systems is its incorporation of intelligent agent concepts. Intelligent agents are autonomous programs which interact with each other through certain protocols on a peer-to-peer level. In this case, the protocols and standards used come from the domain-independent Foundation for Intelligent Physical Agents (FIPA), and the implementation used is the Java Agent Development Framework (JADE). A lightweight, XML/RDF-based language was developed to standardize the description of the run control system for configuration purposes.
Energy Technology Data Exchange (ETDEWEB)
Bailey, S.
2001-03-08
The CDF Run I B physics program has been very successful, making numerous measurements over a wide variety of B physics topics. Measurements have included masses and lifetimes; discovery of the B_c; B_s → J/ψφ polarization; B⁰ ↔ B̄⁰ mixing; sin(2β); and rare decay limits. Recent results include a search for Λ_b → Λγ and a study of B⁰ → J/ψ K(*)⁰ π⁺π⁻ decays. The tools and experience developed during Run I are quite valuable as CDF enters Run II.
Applying graphics processor units to Monte Carlo dose calculation in radiation therapy
Directory of Open Access Journals (Sweden)
Bakhtiari M
2010-01-01
Full Text Available We investigate the potential of using a graphics processor unit (GPU) for Monte Carlo (MC)-based radiation dose calculations. The percent depth dose (PDD) of photons in a medium with known absorption and scattering coefficients is computed using an MC simulation running on both a standard CPU and a GPU. We demonstrate that the GPU's capability for massive parallel processing provides a significant acceleration in the MC calculation, and offers a significant advantage for distributed stochastic simulations on a single computer. Harnessing this potential of GPUs will help in the early adoption of MC for routine planning in a clinical environment.
Sharma, Anupam; Long, Lyle N.
2004-10-01
A particle approach using the Direct Simulation Monte Carlo (DSMC) method is used to solve the problem of blast impact with structures. A novel approach to model the solid boundary condition for particle methods is presented. The solver is validated against an analytical solution of the Riemann shock-tube problem and against experiments on the interaction of a planar shock with a square cavity. Blast impact simulations are performed for two model shapes, a box and an I-shaped beam, assuming that the solid body does not deform. The solver uses a domain decomposition technique to run in parallel. The parallel performance of the solver on two Beowulf clusters is also presented.
Testing Lorentz Invariance Emergence in the Ising Model using Monte Carlo simulations
Dias Astros, Maria Isabel
2017-01-01
In the context of Lorentz invariance as an emergent phenomenon at low energy scales for studying quantum gravity, a system composed of two 3D interacting Ising models (one with an anisotropy in one direction) was proposed. Two Monte Carlo simulations were run: one for the 2D Ising model and one for the target model. In both cases the observables (energy, magnetization, heat capacity and magnetic susceptibility) were computed for different lattice sizes, and a Binder cumulant was introduced in order to estimate the critical temperature of the systems. Moreover, the correlation function was calculated for the 2D Ising model.
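A minimal Metropolis sketch of the kind of 2D Ising simulation described above (lattice size, temperature and sweep count are illustrative choices, not the thesis' values):

```python
import math
import random

# Minimal 2D Ising Metropolis sketch with periodic boundary conditions.
random.seed(1)
L, BETA = 8, 0.3                                   # lattice size, inverse temperature
s = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def neighbour_sum(i, j):
    """Sum of the four nearest-neighbour spins of site (i, j)."""
    return (s[(i + 1) % L][j] + s[(i - 1) % L][j] +
            s[i][(j + 1) % L] + s[i][(j - 1) % L])

for sweep in range(200):
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        d_e = 2 * s[i][j] * neighbour_sum(i, j)    # energy cost of flipping
        if d_e <= 0 or random.random() < math.exp(-BETA * d_e):
            s[i][j] *= -1

m = abs(sum(sum(row) for row in s)) / (L * L)      # |magnetization| per site
print(round(m, 3))
```

Heat capacity and susceptibility follow from the fluctuations of energy and magnetization accumulated over many such sweeps, and the Binder cumulant combines moments of m measured at several lattice sizes.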
Tests of General relativity with planetary orbits and Monte Carlo simulations
Fienga, A; Exertier, P; Manche, H; Gastineau, M
2014-01-01
Based on the newly developed planetary ephemerides INPOP13c, determinations of acceptable intervals of General Relativity violation are obtained by simultaneously considering the PPN parameters $\beta$ and $\gamma$, the flattening of the Sun $J_{2}^\odot$ and the time variation of the gravitational mass of the Sun $\mu$, using Monte Carlo simulations coupled with a basic genetic algorithm. Possible time variations of the gravitational constant G are also deduced. The choice of indicators for the goodness-of-fit of each run is discussed, and limits consistent with general relativity are obtained simultaneously.
Fast sequential Monte Carlo methods for counting and optimization
Rubinstein, Reuven Y; Vaisman, Radislav
2013-01-01
A comprehensive account of the theory and application of Monte Carlo methods. Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the...
Footwear Decreases Gait Asymmetry during Running.
Directory of Open Access Journals (Sweden)
Stefan Hoerzer
Full Text Available Previous research on elderly people has suggested that footwear may improve neuromuscular control of motion. If footwear does in fact improve neuromuscular control, then such an influence might already be present in young, healthy adults. A feature that is often used to assess neuromuscular control of motion is the level of gait asymmetry. The objectives of the study were (a) to develop a comprehensive asymmetry index (CAI) that is capable of detecting gait asymmetry changes caused by external boundary conditions such as footwear, and (b) to use the CAI to investigate whether footwear influences gait asymmetry during running in a healthy, young cohort. Kinematic and kinetic data were collected for both legs of 15 subjects performing five barefoot and five shod over-ground running trials. Thirty continuous gait variables including ground reaction forces and variables of the hip, knee, and ankle joints were computed for each leg. For each individual, the differences between the variables for the right and left leg were calculated. Using this data, a principal component analysis was conducted to obtain the CAI. This study had two main outcomes. First, a sensitivity analysis suggested that the CAI had an improved sensitivity for detecting changes in gait asymmetry caused by external boundary conditions. The CAI may, therefore, have important clinical applications such as monitoring the progress of neuromuscular diseases (e.g. stroke or cerebral palsy). Second, the mean CAI for shod running (131.2 ± 48.5; mean ± standard deviation) was significantly lower (p = 0.041) than the CAI for barefoot running (155.7 ± 39.5). This finding suggests that in healthy, young adults gait asymmetry is reduced when running in shoes compared to running barefoot, which may be a result of improved neuromuscular control caused by changes in the afferent sensory feedback.
Health related aspects of PA & sport/running
Dr. Johan de Jong
2015-01-01
The lecture presents an overview of the positive but also the negative health-related aspects of running. A deeper insight is offered into running, especially mass running events.
The design of the run Clever randomized trial
DEFF Research Database (Denmark)
Ramskov, Daniel; Nielsen, Rasmus Oestergaard; Sørensen, Henrik
2016-01-01
BACKGROUND: Injury incidence and prevalence in running populations have been investigated and documented in several studies. However, knowledge about injury etiology and prevention is needed. Training errors in running are modifiable risk factors, and people engaged in recreational running need evidence-based running schedules to minimize the risk of injury. The existing literature on running volume and running intensity and the development of injuries shows conflicting results. This may be related to previously applied study designs, the methods used to quantify the performed running and the statistical analysis of the collected data. The aim of the Run Clever trial is to investigate if a focus on running intensity compared with a focus on running volume in a running schedule influences the overall injury risk differently. METHODS/DESIGN: The Run Clever trial is a randomized trial with a 24-week...
Running vacuum versus the $\\Lambda$CDM
Gómez-Valent, Adrià; Pérez, Javier de Cruz
2016-01-01
It is well-known that a constant $\Lambda$-term is a traditional building block of the concordance $\Lambda$CDM model. We show that this assumption is not necessarily the optimal one from the phenomenological point of view. The class of running vacuum models, with a possible running of the gravitational coupling G, is capable of fitting the overall cosmological data SNIa+BAO+H(z)+LSS+BBN+CMB better than the $\Lambda$CDM, namely at a level of $\sim 3\sigma$, with the Akaike and Bayesian information criteria supporting a strong level of statistical evidence for this fact. Here we report on the results of such an analysis.
Abort Gap Cleaning for LHC Run 2
Energy Technology Data Exchange (ETDEWEB)
Uythoven, Jan [CERN; Boccardi, Andrea [CERN; Bravin, Enrico [CERN; Goddard, Brennan [CERN; Hemelsoet, Georges-Henry [CERN; Höfle, Wolfgang [CERN; Jacquet, Delphine [CERN; Kain, Verena [CERN; Mazzoni, Stefano [CERN; Meddahi, Malika [CERN; Valuch, Daniel [CERN; Gianfelice-Wendt, Eliana [Fermilab
2014-07-01
To minimize the beam losses at the moment of an LHC beam dump, the 3 μs-long abort gap should contain as few particles as possible. Its population can be minimised by abort gap cleaning using the LHC transverse damper system. The LHC Run 1 experience is briefly recalled; changes foreseen for the LHC Run 2 are presented. They include improvements in the observation of the abort gap population and the mechanism to decide if cleaning is required, changes to the hardware of the transverse dampers to reduce the detrimental effect on the luminosity lifetime and proposed changes to the applied cleaning algorithms.
Chaotic inflation with curvaton induced running
DEFF Research Database (Denmark)
Sloth, Martin Snoager
2014-01-01
While dust contamination now appears as a likely explanation of the apparent tension between the recent BICEP2 data and the Planck data, we will here explore the consequences of a large running in the spectral index, as suggested by the BICEP2 collaboration, as an alternative explanation of the apparent tension, one which would be in conflict with the prediction of the simplest model of chaotic inflation. The large-field chaotic model is sensitive to UV physics, and the nontrivial running of the spectral index suggested by the BICEP2 collaboration could therefore, if true, be telling us some...
Abort Gap Cleaning for LHC Run 2
Uythoven, J; Bravin, E; Goddard, B; Hemelsoet, GH; Höfle, W; Jacquet, D; Kain, V; Mazzoni, S; Meddahi, M; Valuch, D
2015-01-01
To minimise the beam losses at the moment of an LHC beam dump, the 3 μs-long abort gap should contain as few particles as possible. Its population can be minimised by abort gap cleaning using the LHC transverse damper system. The LHC Run 1 experience is briefly recalled; changes foreseen for the LHC Run 2 are presented. They include improvements in the observation of the abort gap population and the mechanism to decide if cleaning is required, changes to the hardware of the transverse dampers to reduce the detrimental effect on the luminosity lifetime and proposed changes to the applied cleaning algorithms.
Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules
Lester, William A; Reynolds, PJ
1994-01-01
This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n
Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia
Energy Technology Data Exchange (ETDEWEB)
Granero Cabanero, D.
2015-07-01
The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, which makes small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation will review mainly the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition, we will briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, shielding calculations, or obtaining dose distributions around applicators. (Author)
On the use of stochastic approximation Monte Carlo for Monte Carlo integration
Liang, Faming
2009-03-01
The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed in the literature as a dynamic optimization algorithm. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling on some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Morillon, B.
1996-12-31
With most of the traditional and contemporary techniques, it is still impossible to solve the transport equation if one takes into account a fully detailed geometry and studies precisely the interactions between particles and matter. Only the Monte Carlo method offers such a possibility. However, with significant attenuation, the natural simulation remains inefficient: it becomes necessary to use biasing techniques, where the solution of the adjoint transport equation is essential. The Monte Carlo code Tripoli has been using such techniques successfully for a long time with different approximate adjoint solutions: these methods require the user to find suitable parameters. If these parameters are not optimal or nearly optimal, the biased simulations may yield small figures of merit. This paper presents a description of the most important biasing techniques of the Monte Carlo code Tripoli; we then show how to calculate the importance function for general geometry with multigroup cases. We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we estimate the importance function through the collision probabilities method and evaluate its possibilities by means of a Monte Carlo calculation. We compare different biased simulations with the importance function calculated by collision probabilities for one-group and multigroup problems. We have run simulations with the new biasing method for one-group transport problems with isotropic scattering and for multigroup problems with anisotropic scattering. The results show that for one-group, homogeneous-geometry transport problems the method is nearly optimal without the splitting and Russian roulette technique, but for multigroup, heterogeneous X-Y geometry problems the figures of merit are higher if we add splitting and Russian roulette.
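The splitting and Russian roulette moves mentioned at the end can be sketched as a single weight-control routine. The thresholds and names below are illustrative; this is not Tripoli code.

```python
import math
import random

# Weight control for biased Monte Carlo transport: split heavy particles,
# play Russian roulette with light ones. Both moves preserve the expected
# total weight, which is what keeps the biased estimator unbiased.
random.seed(2)
W_MIN, W_MAX = 0.25, 2.0              # illustrative weight window

def roulette_or_split(weight):
    """Return the list of surviving particle weights."""
    if weight > W_MAX:                # split into n lighter copies
        n = math.ceil(weight / W_MAX)
        return [weight / n] * n
    if weight < W_MIN:                # Russian roulette: survive with p = w/W_MIN
        return [W_MIN] if random.random() < weight / W_MIN else []
    return [weight]

# Unbiasedness check: the mean surviving weight approaches the input weight.
trials = 200000
total = sum(sum(roulette_or_split(0.1)) for _ in range(trials)) / trials
print(round(total, 3))
```

Splitting deterministically conserves weight, while roulette conserves it only in expectation, trading a small variance increase for not tracking negligible particles.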
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2003-01-01
A novel bi-supported bimetallic catalyst, PVP-PdCl2-SnCl4/MontK10-PEG400, used for the dehalogenation of insoluble aromatic halides in aqueous systems, has shown high dechlorination activity and selectivity without any organic solvent or phase-transfer catalyst. The conversion of aromatic chlorides can reach 100%. The catalyst is easy to prepare and has good reusability.
Monte Carlo simulation of the γγ → τ+τ− process in e+e− collisions
Filipovic, Jelena
2017-01-01
Monte Carlo events for the γγ → τ+τ− process in e+e− collisions were generated with SuperChic, and the cross-sections at different centre-of-mass energies of the FCC-ee were obtained. Tau decays were then performed using the Pythia8 generator. Events were stored in ROOT trees and prepared for future analysis.
Oxygen delivery does not limit peak running speed during incremental downhill running to exhaustion.
Liefeldt, G; Noakes, T D; Dennis, S C
1992-01-01
Oxygen consumption (VO2), ventilation (VI), respiratory exchange ratio (R), stride frequency and blood lactate concentrations were measured continuously in nine trained athletes during two continuous incremental treadmill runs to exhaustion on gradients of either 0° or −3°. Compared to the run at 0° gradient, the athletes reached significantly higher maximal treadmill velocities but significantly lower VO2, VI, R and peak blood lactate concentrations (P < 0.001) during downhill running. These lower VO2 and blood lactate concentrations at exhaustion indicated that factors other than oxygen delivery limited maximal performance during the downhill run. In contrast, stride frequencies were similar at each treadmill velocity; the higher maximal speed during the downhill run was achieved with a significantly longer stride length (P < 0.001); maximal stride frequency was the same between tests. Equivalent maximal stride frequencies suggested that factors determining the rate of lower limb stride recovery may have limited maximal running speed during downhill running and, possibly, also during horizontal running.
Short-run and long-run dynamics of farm land allocation
DEFF Research Database (Denmark)
Arnberg, Søren; Hansen, Lars Gårn
2012-01-01
that include acreage, output, and variable input utilization at the crop level. Results indicate that there are substantial differences between the short-run and long-run land allocation behaviour of Danish farmers and that there are substantial differences in the time lags associated with different crops...
Weekly running volume and risk of running-related injuries among marathon runners
DEFF Research Database (Denmark)
Rasmussen, Christina Haugaard; Nielsen, Rasmus Østergaard; Juul, Martin Serup
2013-01-01
PURPOSE/BACKGROUND: The purpose of this study was to investigate if the risk of injury declines with increasing weekly running volume before a marathon race.
Sex differences in running mechanics and patellofemoral joint kinetics following an exhaustive run.
Willson, John D; Loss, Justin R; Willy, Richard W; Meardon, Stacey A
2015-11-26
Patellofemoral joint pain (PFP) is a common running-related injury that is more prevalent in females and thought to be associated with altered running mechanics. Changes in running mechanics have been observed following an exhaustive run but have not been analyzed relative to the sex bias for PFP. The purpose of this study was to test if females demonstrate unique changes in running mechanics associated with PFP following an exhaustive run. For this study, 18 females and 17 males ran to volitional exhaustion. Peak PFJ contact force and stress, PFJ contact force and stress loading rates, hip adduction excursion, and hip and knee joint frontal plane angular impulse were analyzed between females and males using separate 2-factor ANOVAs (2 (male/female) × 2 (before/after exhaustion)). We observed similar changes in running mechanics among males and females over the course of the exhaustive run, including greater peak PFJ contact force loading rate (5%, P=.01) and PFJ stress loading rate (5%). Changes in running mechanics due to exhaustion do not appear to contribute to the sex bias for PFP.
Comparison of fractions of inactive modules between Run1 and Run2
Motohashi, Kazuki; The ATLAS collaboration
2015-01-01
The fraction of inactive modules for each component of the ATLAS pixel detector at the end of Run 1 and the beginning of Run 2 is shown. A similar plot using the results of functionality tests during LS1 can be found in ATL-INDET-SLIDE-2014-388.
Split-phase motor running as capacitor starts motor and as capacitor run motor
Directory of Open Access Journals (Sweden)
Yahaya Asizehi ENESI
2016-07-01
Full Text Available In this paper, the input parameters of a single-phase split-phase induction motor are taken to investigate and study the output performance characteristics of capacitor start and capacitor run induction motors. The values of these input parameters are used in the design characteristics of the capacitor run and capacitor start motor, with each motor connected to a rated or standard capacitor in series with the auxiliary (starting) winding for normal operating conditions. The magnitudes of the capacitors that develop maximum torque in the capacitor start motor and capacitor run motor are investigated and determined by simulation. Each of these capacitors is connected to the auxiliary winding of the split-phase motor, thereby transforming it into a capacitor start or capacitor run motor. The starting current and starting torque of the split-phase motor (SPM), capacitor run motor (CRM) and capacitor start motor (CSM) are compared for their suitability in operational performance and applications.
Towards scalable parallelism in Monte Carlo particle transport codes using remote memory access
Energy Technology Data Exchange (ETDEWEB)
Romano, Paul K [Los Alamos National Laboratory; Brown, Forrest B [Los Alamos National Laboratory; Forget, Benoit [MIT
2010-01-01
One forthcoming challenge in the area of high-performance computing is having the ability to run large-scale problems while coping with less memory per compute node. In this work, we investigate a novel data decomposition method that would allow Monte Carlo transport calculations to be performed on systems with limited memory per compute node. In this method, each compute node remotely retrieves a small set of geometry and cross-section data as needed and remotely accumulates local tallies when crossing the boundary of the local spatial domain. Initial results demonstrate that while the method does allow large problems to be run in a memory-limited environment, achieving scalability may be difficult due to inefficiencies in the current implementation of RMA (remote memory access) operations.
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of the GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: proton multiplicity, total visible energy, most energetic proton momentum, and the $\pi^+$ two-dimensional energy vs. cosine distribution.
Monte Carlo Tools for Jet Quenching
Zapp, Korinna
2011-01-01
A thorough understanding of jet quenching on the basis of multi-particle final states and jet observables requires new theoretical tools. This talk summarises the status and prospects of the theoretical description of jet quenching in terms of Monte Carlo generators.
An Introduction to Monte Carlo Methods
Raeside, D. E.
1974-01-01
Reviews the principles of Monte Carlo calculation and random number generation in an attempt to introduce the direct and the rejection method of sampling techniques as well as the variance-reduction procedures. Indicates that the increasing availability of computers makes it possible for a wider audience to learn about these powerful methods. (CC)
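As a minimal sketch of the rejection method of sampling described above (my own illustration, not code from the cited work; the target density and envelope bound are assumptions), one can accept uniform proposals with probability proportional to the target density:

```python
import random

def rejection_sample(pdf, pdf_max, lo, hi, n, seed=0):
    """Draw n samples from an (unnormalised) pdf on [lo, hi] by rejection,
    assuming pdf_max bounds the pdf on that interval."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)        # propose uniformly on the support
        u = rng.uniform(0.0, pdf_max)  # uniform height under the envelope
        if u <= pdf(x):                # accept with probability pdf(x)/pdf_max
            out.append(x)
    return out

# Hypothetical example: sample from p(x) proportional to x^2 on [0, 1];
# the exact mean of this distribution is 3/4.
samples = rejection_sample(lambda x: x * x, 1.0, 0.0, 1.0, 20000)
```

The acceptance test is the whole idea: points under the curve are kept, points above it are thrown away, so kept x-values follow the target density.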
Variance Reduction Techniques in Monte Carlo Methods
Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.
2010-01-01
Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the intr
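A hedged illustration of one classic VRT, antithetic variates (my own sketch, not from the cited work; the integrand and sample sizes are arbitrary choices), estimating the integral of e^x over [0, 1]:

```python
import math
import random
import statistics

def mc_plain(f, n, rng):
    # Crude Monte Carlo estimate of the integral of f over [0, 1].
    return sum(f(rng.random()) for _ in range(n)) / n

def mc_antithetic(f, n, rng):
    # Pair each uniform u with 1 - u; for a monotone integrand the pair is
    # negatively correlated, which lowers the estimator's variance.
    pairs = n // 2
    total = sum(0.5 * (f(u) + f(1.0 - u))
                for u in (rng.random() for _ in range(pairs)))
    return total / pairs

# Repeat both estimators over independent replications and compare spread.
plain = [mc_plain(math.exp, 1000, random.Random(s)) for s in range(100)]
anti = [mc_antithetic(math.exp, 1000, random.Random(s)) for s in range(100)]
var_plain = statistics.pvariance(plain)
var_anti = statistics.pvariance(anti)
```

Both estimators target e − 1 with the same number of function evaluations per replication; the antithetic version should show a markedly smaller spread.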
Monte Carlo methods beyond detailed balance
Schram, Raoul D.; Barkema, Gerard T.
2015-01-01
Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying
An analysis of Monte Carlo tree search
CSIR Research Space (South Africa)
James, S
2017-02-01
Full Text Available Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. Despite the vast amount of research into MCTS, the effect of modifications on the algorithm, as well as the manner...
Monte Carlo Simulation of Counting Experiments.
Ogden, Philip M.
A computer program to perform a Monte Carlo simulation of counting experiments was written. The program was based on a mathematical derivation which started with counts in a time interval. The time interval was subdivided to form a binomial distribution with no two counts in the same subinterval. Then the number of subintervals was extended to…
The effect of footwear on running performance and running economy in distance runners.
Fuller, Joel T; Bellenger, Clint R; Thewlis, Dominic; Tsiros, Margarita D; Buckley, Jonathan D
2015-03-01
The effect of footwear on running economy has been investigated in numerous studies. However, no systematic review and meta-analysis has synthesised the available literature, and the effect of footwear on running performance is not known. The aim of this systematic review and meta-analysis was to investigate the effect of footwear on running performance and running economy in distance runners, by reviewing controlled trials that compare different footwear conditions or compare footwear with barefoot. The Web of Science, Scopus, MEDLINE, CENTRAL (Cochrane Central Register of Controlled Trials), EMBASE, AMED (Allied and Complementary Medicine), CINAHL and SPORTDiscus databases were searched from inception up until April 2014. Included articles reported on controlled trials that examined the effects of footwear or footwear characteristics (including shoe mass, cushioning, motion control, longitudinal bending stiffness, midsole viscoelasticity, drop height and comfort) on running performance or running economy and were published in a peer-reviewed journal. Of the 1,044 records retrieved, 19 studies were included in the systematic review and 14 studies were included in the meta-analysis. No studies were identified that reported effects on running performance. Individual studies reported significant but trivial beneficial effects on running economy for comfortable and stiff-soled shoes, for cushioned shoes [standardised mean difference (SMD) = 0.37], for training in minimalist shoes (SMD = 0.79), and for light shoes and barefoot running compared with heavy shoes. Certain models of footwear and footwear characteristics can improve running economy. Future research in footwear performance should include measures of running performance.
Numerical Modelling of Wave Run-Up
DEFF Research Database (Denmark)
Ramirez, Jorge Robert Rodriguez; Frigaard, Peter; Andersen, Thomas Lykke;
2011-01-01
Wave loads are important in problems related to offshore structures, such as wave run-up and slamming. The computation of such wave problems is carried out by CFD models. This paper presents one model, NS3, which solves the 3D Navier-Stokes equations and uses the Volume of Fluid (VOF) method to treat the free...
The Beautiful Physics of LHC Run 2
AUTHOR|(CDS)2108556
2015-01-01
Run 2 of the LHC offers some beautiful prospects for new physics, including flavour physics as well as more detailed studies of the Higgs boson and searches for new physics beyond the Standard Model (BSM). One of the possibilities for BSM physics is supersymmetry, and flavour physics plays various important roles in constraining supersymmetric models.
EMBL rescue package keeps bioinformatics centre running
Abott, A
1999-01-01
The threat to the EBI arising from the EC refusal to fund its running costs seems to have been temporarily lifted. At a meeting in EMBL, Heidelberg, delegates agreed in principle to make up the shortfall of 5 million euros. A final decision will be taken at a special meeting of the EMBL council in March (1 page).
BEAM SCRUBBING FOR RHIC POLARIZED PROTON RUN.
Energy Technology Data Exchange (ETDEWEB)
Zhang, S.Y.; Fischer, W.; Huang, H.; Roser, T.
2004-07-05
One of the intensity-limiting factors of the RHIC polarized proton beam is the electron-cloud-induced pressure rise. A beam scrubbing study shows that, with a reasonable period of running a high-intensity 112-bunch proton beam, the pressure rise can be reduced, allowing higher beam intensity.
Common Running Overuse Injuries and Prevention
Directory of Open Access Journals (Sweden)
Žiga Kozinc
2017-09-01
Full Text Available Runners are particularly prone to developing overuse injuries. The most common running-related injuries include medial tibial stress syndrome, Achilles tendinopathy, plantar fasciitis, patellar tendinopathy, iliotibial band syndrome, tibial stress fractures, and patellofemoral pain syndrome. Two of the most significant risk factors appear to be injury history and weekly distance. Several trials have successfully identified biomechanical risk factors for specific injuries, with increased ground reaction forces, excessive foot pronation, hip internal rotation and hip adduction during stance phase being mentioned most often. However, evidence on interventions for lowering injury risk is limited, especially regarding exercise-based interventions. Biofeedback training for lowering ground reaction forces is one of the few methods proven to be effective. It seems that the best way to approach running injury prevention is through individualized treatment. Each athlete should be assessed separately and screened for risk factors, which should then be addressed with specific exercises. This review provides an overview of the most common running-related injuries, with a particular focus on risk factors, and emphasizes the problems encountered in preventing running-related injuries.
Book Review: HTML5: Up and Running
Directory of Open Access Journals (Sweden)
Mark Cyzyk
2011-04-01
Full Text Available Mark Pilgrim's HTML5: Up and Running was one of the first books published on the subject. If you’re looking for a really good, well-written, entertaining, concise overview of what’s going on right this very minute with HTML5 technologies and techniques, this is a good book to have.
Considerations in Running a Foreign Language University
Institute of Scientific and Technical Information of China (English)
陈乃芳
2005-01-01
To run a foreign language university well, four important things should be given priority: 1) pay constant attention to teacher education; 2) make sure the staff keep abreast of the latest teaching beliefs; 3) back up teaching with high-quality research; 4) do a good job in cultural and humanities education.
Event alignment, warping between running speeds
DEFF Research Database (Denmark)
Pontoppidan, Niels Henrik; Douglas, Ryan
2004-01-01
marine conditions (different load settings on the propeller curve) was in the range from 60 to 120 rotations per minute; furthermore the running speed was stable within periods of fixed load. Electronically controlled engines can change the angular timing of certain events, such as fuel injection...
The CDF Run II Disk Inventory Manager
Institute of Scientific and Technical Information of China (English)
Paul Hubbard; Stephan Lammel
2001-01-01
The Collider Detector at Fermilab (CDF) experiment records and analyses proton-antiproton interactions at a centre-of-mass energy of 2 TeV. Run II of the Fermilab Tevatron started in April of this year; the duration of the run is expected to be over two years. One of the main data handling strategies of CDF for Run II is to hide all tape access from the user and to facilitate sharing of data, and thus disk space. A disk inventory manager was designed and developed over the past years to keep track of the data on disk, to coordinate user access to the data, and to stage data back from tape to disk as needed. The CDF Run II disk inventory manager consists of a server process, user and administrator command line interfaces, and a library with the routines of the client API. Data are managed in filesets, which are groups of one or more files. The system keeps track of user access to the filesets and attempts to keep frequently accessed data on disk. Data that are not on disk are automatically staged back from tape as needed. For CDF the main staging method is based on the mt-tools package, as tapes are written according to the ANSI standard.
All Orthogonal Arrays with 18 Runs
Schoen, E.D.
2009-01-01
All combinatorially inequivalent orthogonal arrays with 18 runs and eight or fewer factors are generated. Their potential as practical experimental designs is evaluated by a classification using generalized word-length patterns of the original arrays and those of their projections into fewer factors.
Palm cooling does not improve running performance.
Scheadler, C M; Saunders, N W; Hanson, N J; Devor, S T
2013-08-01
The aim of this study was to test the efficacy of the BEX Runner palm cooling device during a combination of exercise and environmental heat stress. Twelve subjects completed two randomly ordered time-to-exhaustion runs at 75% VO2max, 30 °C, and 50% relative humidity, with and without palm cooling. Time-to-exhaustion runs started once the warm-up had elicited a core temperature of 37.5 °C. Heart rate, Rating of Perceived Exertion, Feeling Scale, and core temperature were recorded at 2-min intervals during each run. Time to exhaustion was longer in control than treatment (46.7±31.1 vs. 41.3±26.3 min, respectively), and the rate of rise of core temperature was not different between control and treatment (0.047 vs. 0.048 °C·min-1, respectively). The use of the BEX Runner palm cooling device during a run in hot conditions did not eliminate or even attenuate the rise in core temperature. Exercise time in hot conditions did not increase with the use of the palm cooling device, and time to exhaustion was reduced. © Georg Thieme Verlag KG Stuttgart · New York.
Chaotic inflation with curvaton induced running
Sloth, Martin S
2014-01-01
The apparent tension between the recent BICEP2 data and the Planck data might be removed by allowing for a large running in the spectral index, as suggested by the BICEP2 collaboration, but in disagreement with the prediction of the simplest model of chaotic inflation. The large-field chaotic model is sensitive to UV physics, and the non-trivial running of the spectral index hinted at by the BICEP2 data could therefore be telling us some additional new information about the UV completion of inflation. However, before we can draw such strong conclusions with confidence, we might first have to carefully exclude the alternatives. Assuming monomial chaotic inflation is the right theory of inflation, we therefore explore the possibility that the running could be due to some other, less UV-sensitive degree of freedom. As an example, we ask if it is possible that the curvature perturbation spectrum has a contribution from a curvaton, which makes up for the large running in the spectrum. We find that this effect could mas...
Jet performance in Run 2 at ATLAS
Kunigo, Takuto; The ATLAS collaboration
2016-01-01
Slides for the talk "Jet performance in Run 2" at BOOST 2016. In this talk, the jet energy calibration sequence (including in-situ calibrations at $\sqrt{s} = 13$ TeV), jet energy scale and resolution uncertainties, and the jet calibration plan for 2016 will be presented.
Asperity deformation during running-in
DEFF Research Database (Denmark)
Jakobsen, Jørgen; Sivebæk, Ion Marius
2011-01-01
Asperities loaded in pure rolling against a hard, smooth surface will often be deformed at the first contact event and will thereby experience high normal stress, presumably of a magnitude near the Vickers hardness of the softer material. Continued running-in can be imagined to develop into lower...
Asperity deformation during running-in
DEFF Research Database (Denmark)
Jakobsen, Jørgen; Sivebæk, Ion Marius
2010-01-01
Asperities loaded in pure rolling against a hard, smooth surface will often be deformed at the first contact event and will thereby experience high normal stress, presumably of a magnitude near the Vickers hardness of the softer material. Continued running-in can be imagined to develop into lower...
Directory of Open Access Journals (Sweden)
Thamnoon Rasmeemasmuang
2014-03-01
Full Text Available On occasions, sandbag revetments are temporarily applied to armour sandy beaches against erosion. Nevertheless, an empirical formula to determine the wave run-up height on sandbag slopes has not been available heretofore. In this study a wave run-up formula which considers the roughness of slope surfaces is proposed for the case of sandbag slopes. A series of laboratory experiments on wave run-up on smooth slopes and sandbag slopes was conducted in a regular-wave flume, leading to the finding of empirical parameters for the formula. The proposed empirical formula is applicable to wave steepness ranging from 0.01 to 0.14 and to the thickness of placed sandbags relative to the wave height ranging from 0.17 to 3.0. The study shows that the wave run-up height computed by the formula for the sandbag slopes is 26-40% lower than that computed by the formula for the smooth slopes.
A luminosity model of RHIC gold runs
Energy Technology Data Exchange (ETDEWEB)
Zhang, S.Y.
2011-11-01
In this note, we present a luminosity model for RHIC gold runs. The model is applied to the physics fills in the 2007 run without cooling, and with longitudinal cooling applied to one beam only. Having obtained good agreement, the model is used to project a fill with longitudinal cooling applied to both beams. Further development and possible applications of the model are discussed. To maximize the integrated luminosity, higher beam intensity, smaller longitudinal and transverse emittance, and smaller β* are usually the directions to work on. In the past 10 years, the RHIC gold runs have demonstrated a path toward this goal. Most recently, a successful commissioning of bunched-beam stochastic cooling, both longitudinal and transverse, has offered a chance of further RHIC luminosity improvement. With so many factors involved, a luminosity model would be useful to identify and project gains in machine development. In this article, a preliminary model is proposed. In Section 2, several secondary factors, which are not yet included in the model, are identified based on the RHIC operating conditions and experience in recent runs. In Section 3, the RHIC beam store parameters used in the model are listed and validated. In Section 4, the factors included in the model are discussed, and the luminosity model is presented. In Section 5, typical RHIC gold fills without cooling, and with partial cooling, are used for comparison with the model. Then a projection of fills with more cooling is shown. In Section 6, further development of the model is discussed.
Daytime running lights : its safety evidence revisited.
Koornstra, M.J.
1993-01-01
Retrospective in-depth accident studies from several countries confirm that human perception errors are the main causal factor in road accidents. The share of accident types which are relevant for the effect of daytime running lights (DRL), such as overtaking and crossing accidents, in the total of
The Run-2 ATLAS Trigger System
Ruiz-Martinez, Aranzazu; The ATLAS collaboration
2016-01-01
The ATLAS trigger successfully collected collision data during the first run of the LHC between 2009-2013 at different centre-of-mass energies between 900 GeV and 8 TeV. The trigger system consists of a hardware Level-1 trigger and a software-based high level trigger (HLT) that reduce the event rate from the design bunch-crossing rate of 40 MHz to an average recording rate of a few hundred Hz. In Run-2, the LHC will operate at centre-of-mass energies of 13 and 14 TeV and higher luminosity, resulting in roughly five times higher trigger rates. A brief review of the ATLAS trigger system upgrades that were implemented between Run-1 and Run-2, allowing it to cope with the increased trigger rates while maintaining or even improving the efficiency to select physics processes of interest, will be given. This includes changes to the Level-1 calorimeter and muon trigger systems, the introduction of a new Level-1 topological trigger module, and the merging of the previously two-level HLT system into a single event filter farm. A ...
Validity of Self-Reported Running Distance.
Dideriksen, Mette; Soegaard, Cristina; Nielsen, Rasmus O
2016-06-01
It is unclear whether there is a difference between subjective evaluation and objective global positioning systems (GPS) measurement of running distance. The purpose of this study was to investigate if such difference exists. A total of 100 participants (51% men; median age, 41.5; body mass, 78.1 kg ±13.8 SD) completed a run of free choice, then subjectively reported the distance in kilometer (km). This information was subsequently compared with the distance derived from a nondifferential GPS watch using paired t-tests and Bland-Altman's 95% limits of agreement. No significant difference was found between the mean paired differences between subjective evaluations and GPS measurements (1.86%, 95% confidence interval = -1.53%; 5.25%, p = 0.96). The Bland-Altman 95% limits of agreement revealed considerable variation (lower limit = -28% and upper limit = 40%). Such variation exceeds the clinical error range of 10%. In conclusion, the mean running distance (km) is similar between self-reporting and GPS measurements. However, researchers should consider using GPS measurements in favor of subjective reporting of running distance because of considerable variation on an individual level.
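The Bland-Altman 95% limits-of-agreement calculation used above can be sketched as follows (my own illustration with made-up paired distances, not data or code from the cited study):

```python
import statistics

def bland_altman(a, b):
    """Mean paired difference (bias) and Bland-Altman 95% limits of
    agreement, computed as bias +/- 1.96 * SD of the paired differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical self-reported vs. GPS-measured run distances in km.
reported = [10.0, 12.0, 11.0, 13.0]
gps = [9.0, 13.0, 10.0, 14.0]
bias, limits = bland_altman(reported, gps)
```

As in the study, the bias can be near zero while the limits of agreement remain wide, which is exactly the pattern that motivates preferring GPS measurement on an individual level.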
KINETIC CONSEQUENCES OF CONSTRAINING RUNNING BEHAVIOR
Directory of Open Access Journals (Sweden)
John A. Mercer
2005-06-01
Full Text Available It is known that impact forces increase with running velocity as well as when stride length increases. Since stride length naturally changes with changes in submaximal running velocity, it was not clear which factor, running velocity or stride length, played the critical role in determining impact characteristics. The aim of the study was to investigate whether or not stride length influences the relationship between running velocity and impact characteristics. Eight volunteers (mass = 72.4 ± 8.9 kg; height = 1.7 ± 0.1 m; age = 25 ± 3.4 years) completed two running conditions: preferred stride length (PSL) and stride length constrained at 2.5 m (SL2.5). During each condition, participants ran at a variety of speeds with the intent that the range of speeds would be similar between conditions. During PSL, participants were given no instructions regarding stride length. During SL2.5, participants were required to strike targets placed on the floor that resulted in a stride length of 2.5 m. Ground reaction forces were recorded (1080 Hz) as well as leg and head accelerations (uni-axial accelerometers). Impact force and impact attenuation (calculated as the ratio of head and leg impact accelerations) were recorded for each running trial. Scatter plots were generated plotting each parameter against running velocity. Lines of best fit were calculated, with the slopes recorded for analysis. The slopes were compared between conditions using paired t-tests. Data from two subjects were dropped from the analysis since their velocity ranges were not similar between conditions, resulting in the analysis of six subjects. The slope of the impact force vs. velocity relationship was different between conditions (PSL: 0.178 ± 0.16 BW/m·s-1; SL2.5: -0.003 ± 0.14 BW/m·s-1; p < 0.05). The slope of the impact attenuation vs. velocity relationship was different between conditions (PSL: 5.12 ± 2.88 %/m·s-1; SL2.5: 1.39 ± 1.51 %/m·s-1; p < 0.05). Stride length was an important factor
Duration specific Running performance in Elite Gaelic Football.
Malone, Shane; Solan, Barry; Hughes, Brian; Collins, Kieran
2017-04-25
The aim of the current investigation was to determine the position- and duration-specific running performance of elite Gaelic football players through the use of a moving average method. Global positioning system data (4-Hz, VX Sport, New Zealand) were collected from thirty-five (n = 35) elite Gaelic football players across a two-season period. A total of 32 competitive matches were analysed, with 300 full match-play data samples obtained for the final analysis. Players were categorised based on positional groups: full-back, half-back, midfield, half-forward and full-forward. The velocity-time curve was analysed for each position using a rolling average method, where maximal values were calculated for ten different time durations (1-10 min) using total distance (m·min), high-speed (m·min) and sprint distance (m·min) across each match. There were large differences between the 1 and 2 min rolling averages and all other rolling average durations. Smaller differences were observed for rolling averages of a greater duration. Midfielders covered significantly more relative total, high-speed and sprint distance than other positions across all time periods, and running performance in Gaelic football fluctuates across match-play. These data provide further knowledge of the running requirements of Gaelic football competition, and this information can be used to aid coaches and practitioners in adequately preparing athletes for the most demanding periods of play.
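The rolling-average method described above, finding the maximal moving average of a per-minute running series for a given window, can be sketched as follows (my own minimal illustration; the series values are hypothetical, not data from the study):

```python
def peak_rolling_average(per_minute, window):
    """Maximal moving average of a per-minute series (e.g. relative
    distance in m per min) over a window of the given length in minutes."""
    if window > len(per_minute):
        raise ValueError("window longer than series")
    return max(sum(per_minute[i:i + window]) / window
               for i in range(len(per_minute) - window + 1))

# Hypothetical per-minute distances (m) for a short spell of play.
series = [100.0, 200.0, 150.0, 50.0, 120.0]
peak_2min = peak_rolling_average(series, 2)  # worst-case 2-min demand
```

Computing this peak for each window length from 1 to 10 minutes yields the duration-specific demand curve that the study compares across positions.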
LHC Report: Tests of new LHC running modes
Verena Kain for the LHC team
2012-01-01
On 13 September, the LHC collided lead ions with protons for the first time. This outstanding achievement was key preparation for the planned 2013 operation in this mode. Outside of two special physics runs, the LHC has continued productive proton-proton luminosity operation. Celebrating proton-ion collisions. The first week of September added another 1 fb-1 of integrated luminosity to ATLAS’s and CMS’s proton-proton data set. It was a week of good and steady production mixed with the usual collection of minor equipment faults. The peak performance was slightly degraded at the start of the week but thanks to the work of the teams in the LHC injectors the beam brightness – and thus the LHC peak performance – were restored to previous levels by the weekend. The LHC then switched to new running modes and spectacularly proved its potential as a multi-purpose machine. This is due in large part to the LHC equipment and controls, which have been designed wi...
Monte Carlo radiation transport in external beam radiotherapy
Çeçen, Yiğit
2013-01-01
The use of Monte Carlo in radiation transport is an effective way to predict absorbed dose distributions. Monte Carlo modeling has contributed to a better understanding of photon and electron transport by radiotherapy physicists. The aim of this review is to introduce Monte Carlo as a powerful radiation transport tool. In this review, photon and electron transport algorithms for Monte Carlo techniques are investigated and a clinical linear accelerator model is studied for external beam radiot...
Bayesian adaptive Markov chain Monte Carlo estimation of genetic parameters.
Mathew, B; Bauer, A M; Koistinen, P; Reetz, T C; Léon, J; Sillanpää, M J
2012-10-01
Accurate and fast estimation of genetic parameters that underlie quantitative traits using mixed linear models with additive and dominance effects is of great importance in both natural and breeding populations. Here, we propose a new fast adaptive Markov chain Monte Carlo (MCMC) sampling algorithm for the estimation of genetic parameters in the linear mixed model with several random effects. In the learning phase of our algorithm, we use the hybrid Gibbs sampler to learn the covariance structure of the variance components. In the second phase of the algorithm, we use this covariance structure to formulate an effective proposal distribution for a Metropolis-Hastings algorithm, which uses a likelihood function in which the random effects have been integrated out. Compared with the hybrid Gibbs sampler, the new algorithm had better mixing properties and was approximately twice as fast to run. Our new algorithm was able to detect different modes in the posterior distribution. In addition, the posterior mode estimates from the adaptive MCMC method were close to the REML (residual maximum likelihood) estimates. Moreover, our exponential prior for inverse variance components was vague and enabled the estimated mode of the posterior variance to be practically zero, which was in agreement with the support from the likelihood (in the case of no dominance). The method performance is illustrated using simulated data sets with replicates and field data in barley.
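For orientation only, here is a generic random-walk Metropolis-Hastings sketch, far simpler than the two-phase adaptive sampler described above, and with a target density and step size that are purely my assumptions:

```python
import math
import random

def metropolis(logpdf, x0, step, n, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    accept with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, lp = x0, logpdf(x0)
    chain = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lp_prop = logpdf(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy target: standard normal, via its log density up to a constant.
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 1.0, 30000)
post = chain[5000:]  # discard burn-in samples
```

The adaptive algorithm in the paper improves on exactly this kind of sampler by learning the proposal covariance during a first phase, which is what gives it the better mixing reported above.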
Monte Carlo Glauber wounded nucleon model with meson cloud
Zakharov, B G
2016-01-01
We study the effect of the nucleon meson cloud on predictions of the Monte Carlo Glauber wounded nucleon model for $AA$, $pA$, and $pp$ collisions. From the analysis of the data on the charged multiplicity density in $AA$ collisions we find that the meson-baryon Fock component reduces the required fraction of binary collisions by a factor of $\sim 2$ for Au+Au collisions at $\sqrt{s}=0.2$ TeV and $\sim 1.5$ for Pb+Pb collisions at $\sqrt{s}=2.76$ TeV. For central $AA$ collisions the meson cloud can increase the multiplicity density by $\sim 16-18\%$. We give predictions for the midrapidity charged multiplicity density in Pb+Pb collisions at $\sqrt{s}=5.02$ TeV for the future LHC run 2. We find that the meson cloud has a weak effect on the centrality dependence of the ellipticity $\epsilon_2$ in $AA$ collisions. For collisions of the deformed uranium nuclei at $\sqrt{s}=0.2$ TeV we find that the meson cloud may improve somewhat agreement with the data on the dependence of the elliptic flow on the charged multi...
A novel running mechanic's class changes kinematics but not running economy.
Craighead, Daniel H; Lehecka, Nick; King, Deborah L
2014-11-01
A novel method of running technique instruction, Midstance to Midstance Running (MMR), was studied to determine how MMR affected the kinematics and running economy (RE) of recreational runners. An experimental pre-post randomized groups design was used. Participants (n = 18) were recreational runners who ran at least 3 days a week and 5 km per run. All testing was performed on a treadmill at 2.8 m·s(-1). The intervention group (n = 9) completed 8 weeks of instruction in MMR; the control group (n = 9) continued running without instruction. The MMR group showed significant decreases in stride length (SL) (p = 0.02) and maximum knee flexion velocity in stance (p = 0.01), and a significant increase in stride rate (SR) (p = 0.02) after 8 weeks. No significant changes were found in heart rate, rating of perceived exertion, or RE. Midstance to Midstance Running was effective in changing SR and SL, but was not effective in changing other kinematic variables such as foot contact position and maximum knee flexion during swing. Midstance to Midstance Running did not affect RE. Evidence suggests that MMR may be an appropriate instructional method for recreational runners trying to decrease SL and increase SR.
Etxebarria, Naroa; Hunt, Julie; Ingham, Steve; Ferguson, Richard
2014-01-01
Triathlon running is affected by prior cycling, and power output during triathlon cycling is variable in nature. We compared constant and triathlon-specific variable power cycling and their effect on subsequent submaximal running physiology. Nine well-trained male triathletes (age 24.6 ± 4.6 years, [Formula: see text] 4.5 ± 0.4 L · min(-1); mean ± SD) performed a submaximal incremental run test under three conditions: no prior exercise, and after a 1 h cycling trial at 65% of maximal aerobic power with either a constant or a variable power profile. The variable power protocol involved multiple 10-90 s intermittent efforts at 40-140% of maximal aerobic power. During cycling, pulmonary ventilation (22%, ± 14%; mean; ± 90% confidence limits), blood lactate (179%, ± 48%) and rating of perceived exertion (7.3%, ± 10.2%) were all substantially higher during variable than during constant power cycling. At the start of the run, blood lactate was 64%, ± 61% higher after variable compared to constant power cycling, which decreased running velocity at the 4 mM lactate threshold by 0.6, ± 0.9 km · h(-1). Physiological responses to incremental running are negatively affected by prior cycling and, to a greater extent, by variable compared to even-paced cycling. Testing and training of triathletes should account for the higher physiological cost of triathlon-specific cycling and its effect on subsequent running.
Dorn, Tim W; Schache, Anthony G; Pandy, Marcus G
2012-06-01
Humans run faster by increasing a combination of stride length and stride frequency. In slow and medium-paced running, stride length is increased by exerting larger support forces during ground contact, whereas in fast running and sprinting, stride frequency is increased by swinging the legs more rapidly through the air. Many studies have investigated the mechanics of human running, yet little is known about how the individual leg muscles accelerate the joints and centre of mass during this task. The aim of this study was to describe and explain the synergistic actions of the individual leg muscles over a wide range of running speeds, from slow running to maximal sprinting. Experimental gait data from nine subjects were combined with a detailed computer model of the musculoskeletal system to determine the forces developed by the leg muscles at different running speeds. For speeds up to 7 m s(-1), the ankle plantarflexors, soleus and gastrocnemius, contributed most significantly to vertical support forces and hence increases in stride length. At speeds greater than 7 m s(-1), these muscles shortened at relatively high velocities and had less time to generate the forces needed for support. Thus, above 7 m s(-1), the strategy used to increase running speed shifted to the goal of increasing stride frequency. The hip muscles, primarily the iliopsoas, gluteus maximus and hamstrings, achieved this goal by accelerating the hip and knee joints more vigorously during swing. These findings provide insight into the strategies used by the leg muscles to maximise running performance and have implications for the design of athletic training programs.
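The two levers described above combine multiplicatively: forward speed is simply stride length times stride frequency. A trivial sketch of this arithmetic (illustrative numbers, not from the study):

```python
def running_speed(stride_length_m, stride_rate_hz):
    """Speed in m/s as the product of stride length (m) and
    stride frequency (strides/s). Illustrative arithmetic only."""
    return stride_length_m * stride_rate_hz

# Below ~7 m/s the speed gain comes mostly from longer strides
# (larger support forces); above ~7 m/s it comes from faster strides
# (more vigorous hip action during swing).
speed = running_speed(2.0, 3.5)  # → 7.0 m/s
```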
Muscle injury after low-intensity downhill running reduces running economy.
Baumann, Cory W; Green, Michael S; Doyle, J Andrew; Rupp, Jeffrey C; Ingalls, Christopher P; Corona, Benjamin T
2014-05-01
Contraction-induced muscle injury may reduce running economy (RE) by altering motor unit recruitment, lowering contraction economy, and disturbing running mechanics, any of which may have a deleterious effect on endurance performance. The purpose of this study was to determine if RE is reduced 2 days after performing injurious, low-intensity exercise in 11 healthy active men (27.5 ± 5.7 years; 50.05 ± 1.67 VO2peak). Running economy was determined at treadmill speeds eliciting 65 and 75% of the individual's peak rate of oxygen uptake (VO2peak) 1 day before and 2 days after injury induction. Lower extremity muscle injury was induced with a 30-minute downhill treadmill run (6 × 5 minutes runs, 2 minutes rest, -12% grade, and 12.9 km·h(-1)) that elicited 55% VO2peak. Maximal quadriceps isometric torque was reduced immediately and 2 days after the downhill run by 18 and 10%, and a moderate degree of muscle soreness was present. Two days after the injury, steady-state VO2 and metabolic work (VO2 L·km(-1)) were significantly greater (4-6%) during the 65% VO2peak run. Additionally, postinjury VCO2, VE and rating of perceived exertion were greater at 65% but not at 75% VO2peak, whereas whole blood-lactate concentrations did not change pre-injury to postinjury at either intensity. In conclusion, low-intensity downhill running reduces RE at 65% but not 75% VO2peak. The results of this study and other studies indicate the magnitude to which RE is altered after downhill running is dependent on the severity of the injury and intensity of the RE test.
The ATLAS Tau Trigger Performance during LHC Run 1 and Prospects for Run 2
Sakurai, Yuki
2014-01-01
Triggering on hadronic tau decays is essential for a wide variety of analyses of interesting physics processes at ATLAS. The ATLAS tau trigger combines information from the tracking detectors and calorimeters to identify the signature of hadronically decaying tau leptons. In Run 2 operation expected to start in 2015, the trigger strategies will become more important than ever before. In this paper, the tau trigger performance during Run 1 is summarized and also an overview of the developments of Run 2 tau trigger strategy is presented.
Run scenarios for the linear collider
Energy Technology Data Exchange (ETDEWEB)
M. Battaglia et al.
2002-12-23
We have examined how a Linear Collider program of 1000 fb{sup -1} could be constructed in the case that a very rich program of new physics is accessible at {radical}s {le} 500 GeV. We have examined possible run plans that would allow the measurement of the parameters of a 120 GeV Higgs boson, the top quark, and could give information on the sparticle masses in SUSY scenarios in which many states are accessible. We find that the construction of the run plan (the specific energies for collider operation, the mix of initial state electron polarization states, and the use of special e{sup -}e{sup -} runs) will depend quite sensitively on the specifics of the supersymmetry model, as the decay channels open to particular sparticles vary drastically and discontinuously as the underlying SUSY model parameters are varied. We have explored this dependence somewhat by considering two rather closely related SUSY model points. We have called for operation at a high energy to study kinematic end points, followed by runs in the vicinity of several two body production thresholds once their location is determined by the end point studies. For our benchmarks, the end point runs are capable of disentangling most sparticle states through the use of specific final states and beam polarizations. The estimated sparticle mass precisions, combined from end point and scan data, are given in Table VIII and the corresponding estimates for the mSUGRA parameters are in Table IX. The precision for the Higgs boson mass, width, cross-sections, branching ratios and couplings are given in Table X. The errors on the top quark mass and width are expected to be dominated by the systematic limits imposed by QCD non-perturbative effects. The run plan devotes at least two thirds of the accumulated luminosity near the maximum LC energy, so that the program would be sensitive to unexpected new phenomena at high mass scales. We conclude that with a 1 ab{sup -1} program, expected to take the first 6-7 years
DESIGN IMPROVEMENT OF THE LOCOMOTIVE RUNNING GEARS
Directory of Open Access Journals (Sweden)
S. V. Myamlin
2013-09-01
Purpose. To determine the dynamic qualities of mainline freight locomotives characterizing safe motion in tangent and curved track sections at all operational speeds, a whole set of studies is needed, including selection of the design scheme, development of the corresponding mathematical model of the locomotive's spatial fluctuations, construction of the computer calculation program, and theoretical followed by experimental studies of the new designs, comparing the results with existing designs. One of the necessary conditions for the qualitative improvement of traction rolling stock is to define the parameters of its running gears. Among the issues related to this problem, an important place is occupied by the task of determining the locomotive's dynamic properties at the design stage, taking into account the technical solutions selected for the running gear design. Methodology. The mathematical modeling studies are carried out by numerical integration of the dynamic loading for the mainline locomotive using the software package «Dynamics of Rail Vehicles» («DYNRAIL»). Findings. The research on improving locomotive running gear design shows that the creation of a modern locomotive requires engineers and scientists to realize scientific and technical solutions: enhancing design speed while improving traction, braking and dynamic qualities; providing a simple and reliable design, especially of the running gear; reducing the costs of maintenance and repair; low initial cost and operating costs over the whole service life; high traction force when starting, as close as possible to the limit of adhesion; the ability to work in multiple-traction mode; and sufficient design speed. Practical Value. The generalization of theoretical, scientific and methodological, experimental studies aimed
Efficient heterogeneous execution of Monte Carlo shielding calculations on a Beowulf cluster.
Dewar, David; Hulse, Paul; Cooper, Andrew; Smith, Nigel
2005-01-01
Recent work has been done in using a high-performance 'Beowulf' cluster computer system for the efficient distribution of Monte Carlo shielding calculations. This has enabled the rapid solution of complex shielding problems at low cost and with greater modularity and scalability than traditional platforms. The work has shown that a simple approach to distributing the workload is as efficient as using more traditional techniques such as PVM (Parallel Virtual Machine). In addition, when used in an operational setting this technique is fairer with the use of resources than traditional methods, in that it does not tie up a single computing resource but instead shares the capacity with other tasks. These developments in computing technology have enabled shielding problems to be solved that would have taken an unacceptably long time to run on traditional platforms. This paper discusses the BNFL Beowulf cluster and a number of tests that have recently been run to demonstrate the efficiency of the asynchronous technique in running the MCBEND program. The BNFL Beowulf currently consists of 84 standard PCs running RedHat Linux. Current performance of the machine has been estimated to be between 40 and 100 Gflop s(-1). When the whole system is employed on one problem up to four million particles can be tracked per second. There are plans to review its size in line with future business needs.
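The "simple approach" to distributing independent Monte Carlo batches can be illustrated with a toy asynchronous farm. This is a hypothetical sketch, not the BNFL code: Python threads stand in for cluster nodes, and a dart-throwing pi estimate stands in for an MCBEND batch; the point is that results are consumed as they complete, rather than through PVM-style coordination.

```python
import random
from concurrent.futures import ThreadPoolExecutor, as_completed

def mc_batch(n, seed):
    """Hypothetical stand-in for one independent shielding batch:
    here, dart-throwing to estimate pi; the seed decouples batches."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits, n

def run_batches(n_batches=8, n_per_batch=50_000, workers=4):
    """Asynchronous farm-out: each result is consumed as soon as its
    worker finishes, so a slow batch never blocks the others -- the
    essence of the simple scheme found competitive with PVM."""
    hits = total = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(mc_batch, n_per_batch, s)
                   for s in range(n_batches)]
        for f in as_completed(futures):
            h, n = f.result()
            hits += h
            total += n
    return 4.0 * hits / total
```

In a real cluster the workers would be separate machines and the batches full transport calculations, but the scheduling idea is the same: independent work units plus asynchronous collection.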
28 CFR 544.34 - Inmate running events.
2010-07-01
28 CFR Title 28, Judicial Administration; Inmate Recreation Programs, § 544.34 Inmate running events: "Running events will ordinarily not... available for all inmate running events..."
Run-up distributions of waves breaking on sloping walls
Battjes, J.A.
1969-01-01
Distributions of run-up are calculated by assigning to each individual wave in an irregular wave train a run-up value according to Hunt's formula. The use of this formula permits a normalization of the run-up in such a way that the run-up distributions are independent of slope angle, mean wave
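The per-wave assignment described above can be sketched in code. This is a minimal illustration, assuming Hunt's formula in its common form R = ξH, with surf-similarity parameter ξ = tan α / √(H/L0) and deep-water wavelength L0 = gT²/2π; the paper's exact normalization may differ.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def hunt_runup(H, T, slope):
    """Run-up of a single breaking wave on a smooth slope via Hunt's
    formula R = H * xi, where xi = tan(alpha) / sqrt(H / L0) and
    L0 = g*T^2 / (2*pi) is the deep-water wavelength. Sketch only."""
    L0 = G * T * T / (2.0 * math.pi)
    xi = slope / math.sqrt(H / L0)
    return H * xi

# Assign a run-up to each wave of an irregular train (H in m, T in s);
# the run-up distribution then follows from the wave-height distribution.
waves = [(1.0, 6.0), (1.5, 7.0), (0.8, 5.5)]
runups = [hunt_runup(H, T, 0.2) for H, T in waves]
# e.g. a 1 m, 6 s wave on a 1:5 slope gives R of about 1.5 m
```

Note that R scales linearly with the slope here, which is what allows the normalization by slope angle mentioned in the abstract.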
CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations
Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei
2014-12-01
We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
Adaptive Multi-GPU Exchange Monte Carlo for the 3D Random Field Ising Model
Navarro, C A; Deng, Youjin
2015-01-01
The study of disordered spin systems through Monte Carlo simulations has proven to be a hard task due to the adverse energy landscape present in the low-temperature regime, making it difficult for the simulation to escape from a local minimum. Replica-based algorithms such as Exchange Monte Carlo (also known as parallel tempering) are effective at overcoming this problem, reaching equilibrium on disordered spin systems such as Spin Glass or Random Field models by exchanging information between replicas at neighboring temperatures. In this work we present a multi-GPU Exchange Monte Carlo method designed for the simulation of the 3D Random Field Model. The implementation is based on a two-level parallelization scheme that allows the method to scale its performance in the presence of faster GPUs as well as multiple GPUs. In addition, we modified the original algorithm by adapting the set of temperatures according to the exchange rate observed from short trial runs, leading to an increased exchange rate...
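The exchange step at the heart of parallel tempering can be sketched as follows. This is the generic Metropolis swap criterion, not the authors' GPU implementation, and the adaptive temperature tuning is omitted:

```python
import math
import random

def exchange_accept(beta_i, beta_j, E_i, E_j, rng=random):
    """Metropolis acceptance for swapping the configurations of two
    replicas at inverse temperatures beta_i, beta_j with current
    energies E_i, E_j:
        p = min(1, exp((beta_i - beta_j) * (E_i - E_j))).
    Returns True when the swap is accepted."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < math.exp(delta)
```

A swap between the hotter replica (smaller beta) and a colder one is always accepted when it moves the lower energy to the colder temperature; otherwise it is accepted with exponentially small probability, which is why adapting the temperature spacing to keep observed swap rates reasonable (as the authors do from short trial runs) matters in practice.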
CMS computing operations during run 1
Adelman, J; Artieda, J; Bagliese, G; Ballestero, D; Bansal, S; Bauerdick, L; Behrenhof, W; Belforte, S; Bloom, K; Blumenfeld, B; Blyweert, S; Bonacorsi, D; Brew, C; Contreras, L; Cristofori, A; Cury, S; da Silva Gomes, D; Dolores Saiz Santos, M; Dost, J; Dykstra, D; Fajardo Hernandez, E; Fanzango, F; Fisk, I; Flix, J; Georges, A; Giffels, M; Gomez-Ceballos, G; Gowdy, S; Gutsche, O; Holzman, B; Janssen, X; Kaselis, R; Kcira, D; Kim, B; Klein, D; Klute, M; Kress, T; Kreuzer, P; Lahiff, A; Larson, K; Letts, J; Levin, A; Linacre, J; Linares, J; Liu, S; Luyckx, S; Maes, M; Magini, N; Malta, A; Marra Da Silva, J; Mccartin, J; McCrea, A; Mohapatra, A; Molina, J; Mortensen, T; Padhi, S; Paus, C; Piperov, S; Ralph; Sartirana, A; Sciaba, A; Sfiligoi, I; Spinoso, V; Tadel, M; Traldi, S; Wissing, C; Wuerthwein, F; Yang, M; Zielinski, M; Zvada, M
2014-01-01
During the first run, CMS collected and processed more than 10B data events and simulated more than 15B events. Up to 100k processor cores were used simultaneously and 100PB of storage was managed. Each month petabytes of data were moved and hundreds of users accessed data samples. In this document we discuss the operational experience from this first run. We present the workflows and data flows that were executed, and we discuss the tools and services developed, and the operations and shift models used to sustain the system. Many techniques were followed from the original computing planning, but some were reactions to difficulties and opportunities. We also address the lessons learned from an operational perspective, and how this is shaping our thoughts for 2015.
Hildreth, M; Lange, D J; Kortelainen, M J
2015-01-01
During the LHC shutdown between Run 1 and Run 2, intensive development was carried out to improve the performance of the CMS simulation. For physics improvements, a migration from Geant4 9.4p03 to Geant4 10.0p02 was performed. CPU performance was improved by the introduction of the Russian roulette method inside the CMS calorimeters, optimization of the CMS simulation sub-libraries, and usage of a static build of the simulation executable. As a result of these efforts, the CMS simulation has been sped up by about a factor of two. In this work we describe the updates to the different software components of the CMS simulation. The development of a multi-threaded (MT) simulation approach for CMS will also be discussed.
The CDF Run IIb silicon detector
Energy Technology Data Exchange (ETDEWEB)
Aoki, M.; Bacchetta, N.; Behari, S.; Benjamin, D.; Bisello, D.; Bolla, G.; Bortoletto, D.; Burghard, A.; Busetto, G.; Cabrera, S.; Canepa, A.; Castro, A.; Cardoso, G.; Chertok, M.; Ciobanu, C.; Derylo, G.; Fang, I.; Flaugher, B. E-mail: brenna@fnal.gov; Freeman, J.; Galtieri, L.; Galyardt, J.; Garcia-Sciveres, M.; Giurgiu, G.; Gorelov, I.; Haber, C.; Hara, K.; Hoeferkamp, M.; Holbrook, B.; Hrycyk, M.; Junk, T.; Kim, S.; Kobayashi, K.; Krieger, B.; Kruse, M.; Lander, R.; Lu, R.-S.; Lukens, P.; Malferrari, L.; Manea, C.; Margotti, A.; Maksimovic, P.; Merkel, P.; Moccia, S.; Nakano, I.; Naoumov, D.; Novak, J.; Okusawa, T.; Orlov, Y.; Pancaldi, G.; Pantano, D.; Pavlicek, V.; Pellett, D.; Seidel, S.; Semeria, F.; Takei, Y.; Tanaka, R.; Wang, Z.; Watje, P.; Weber, M.; Wester, W.; Wilkes, T.; Yamamoto, K.; Yao, W.; Zimmermann, S.; Zucchelli, S.; Zucchini, A
2004-02-01
Fermilab plans to deliver 5-15 fb{sup -1} of integrated luminosity to the CDF and D0 experiments. The current inner silicon detectors at CDF (SVXIIa and L00) will not tolerate the radiation dose associated with high-luminosity running and will need to be replaced. A new readout chip (SVX4) has been designed in radiation-hard 0.25 {mu}m, CMOS technology. Single-sided sensors are arranged in a compact structure, called a stave, with integrated readout and cooling systems. This paper describes the general design of the Run IIb system, testing results of prototype electrical components (staves), and prototype silicon sensor performance before and after irradiation.
The Millennium Run Observatory: First Light
Overzier, R; Angulo, R E; Bertin, E; Blaizot, J; Henriques, B M B; Marleau, G -D; White, S D M
2012-01-01
Simulations of galaxy evolution aim to capture our current understanding as well as to make predictions for testing by future experiments. Simulations and observations are often compared in an indirect fashion: physical quantities are estimated from the data and compared to models. However, many applications can benefit from a more direct approach, where the observing process is also simulated and the models are seen fully from the observer's perspective. To facilitate this, we have developed the Millennium Run Observatory (MRObs), a theoretical virtual observatory which uses virtual telescopes to `observe' semi-analytic galaxy formation models based on the suite of Millennium Run dark matter simulations. The MRObs produces data that can be processed and analyzed using the standard software packages developed for real observations. At present, we produce images in forty filters from the rest-frame UV to IR for two stellar population synthesis models, three different models of IGM absorption, and two cosmologi...
Instrumental Variables in the Long Run
DEFF Research Database (Denmark)
Casey, Gregory; Klemp, Marc Patrick Brag
2017-01-01
In the study of long-run economic growth, it is common to use historical or geographical variables as instruments for contemporary endogenous regressors. We study the interpretation of these conventional instrumental variable (IV) regressions in a general, yet simple, framework. Our aim is to estimate the long-run causal effect of changes in the endogenous explanatory variable. We find that conventional IV regressions generally cannot recover this parameter of interest. To estimate this parameter, therefore, we develop an augmented IV estimator that combines the conventional regression with a separate regression estimating the degree of persistence in the endogenous regressor. Importantly, our estimator can overcome a particular violation of the exclusion restriction that can arise when there is a time gap between the instrument and the endogenous explanatory variable. We apply our results...
Measuring the running top-quark mass
Energy Technology Data Exchange (ETDEWEB)
Langenfeld, U.; Moch, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Uwer, P. [Berlin Univ. (Germany). Inst. fuer Physik
2009-06-15
We present the first direct determination of the running top-quark mass based on the total cross section of top-quark pair-production as measured at the Tevatron. Our theory prediction for the cross section includes various next-to-next-to-leading order QCD contributions, in particular all logarithmically enhanced terms near threshold, the Coulomb corrections at two loops and all explicitly scale dependent terms at NNLO accuracy. The result allows for an exact and independent variation of the renormalization and factorization scales. For Tevatron and LHC we study its dependence on all scales, on the parton luminosity and on the top-quark mass using both the conventional pole mass definition as well as the running mass in the MS scheme. We extract for the top-quark an MS mass of m({mu}=m) =160.0{sup +3.3}{sub -3.2} GeV. (orig.)
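For orientation only, the scale dependence of an MS-bar mass can be sketched at leading order. The paper itself works at NNLO with threshold and Coulomb contributions, so this one-loop evolution (anomalous dimension γ0 = 8, β0 = 11 − 2nf/3, and a one-loop coupling anchored at αs(MZ) = 0.118) is a rough illustration of "running mass", not the authors' machinery:

```python
import math

def alpha_s(mu, mu0=91.19, a0=0.118, nf=5):
    """One-loop strong coupling evolved from alpha_s(MZ) = 0.118.
    Leading order only; illustrative."""
    b0 = 11.0 - 2.0 * nf / 3.0
    return a0 / (1.0 + a0 * b0 / (2.0 * math.pi) * math.log(mu / mu0))

def run_mass(m1, mu1, mu2, nf=5):
    """One-loop MS-bar mass evolution,
        m(mu2) = m(mu1) * (alpha_s(mu2)/alpha_s(mu1))**(g0 / (2*b0)),
    with g0 = 8 and b0 = 11 - 2*nf/3 (the nf choice is part of the
    sketch; a proper treatment handles flavor thresholds)."""
    b0 = 11.0 - 2.0 * nf / 3.0
    return m1 * (alpha_s(mu2, nf=nf) / alpha_s(mu1, nf=nf)) ** (8.0 / (2.0 * b0))

# Running an MS-bar mass of 160 GeV up in scale decreases it, since
# alpha_s falls with mu and the exponent is positive.
m_high = run_mass(160.0, 160.0, 1000.0)
```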
Energy Technology Data Exchange (ETDEWEB)
O' Brien, M. J.; Brantley, P. S.
2015-01-20
In order to run Monte Carlo particle transport calculations on new supercomputers with hundreds of thousands or millions of processors, care must be taken to implement scalable algorithms. This means that the algorithms must continue to perform well as the processor count increases. In this paper, we examine the scalability of: (1) globally resolving the particle locations on the correct processor, (2) deciding that particle streaming communication has finished, and (3) efficiently coupling neighbor domains together with different replication levels. We have run domain-decomposed Monte Carlo particle transport on up to 2^{21} = 2,097,152 MPI processes on the IBM BG/Q Sequoia supercomputer and observed scalable results that agree with our theoretical predictions. These calculations were carefully constructed to have the same amount of work on every processor, i.e. the calculation is already load balanced. We also examine load-imbalanced calculations where each domain's replication level is proportional to its particle workload. In this case we show how to efficiently couple together adjacent domains to maintain within-workgroup load balance and minimize memory usage.
Ergogenic effect of music during running performance
Van Dyck, Edith; Leman, Marc
2016-01-01
In running competitions portable music players and headphones are often banned. In some cases, runners have been disqualified after using such devices during competition. In this paper, it is discussed whether, aside from possible safety reasons, such competition regulations make sense and whether music can have an ergogenic effect on performance. Although a definitive conclusion on the regulation matter is not of our concern here, we review evidence of the fact that music is capable of enhan...
FPU-Supported Running Error Analysis
T. Zahradnický; R. Lórencz
2010-01-01
A-posteriori forward rounding error analyses tend to give sharper error estimates than a-priori ones, as they use actual data quantities. One such a-posteriori analysis – running error analysis – uses expressions consisting of two parts: one generates the error and the other propagates input errors to the output. This paper suggests replacing the error-generating term with an FPU-extracted rounding error estimate, which produces a sharper error bound.
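The two-part idea can be sketched for the simplest case, recursive summation, where a Higham-style running error bound accumulates the actual intermediate magnitudes as the computation proceeds. This is a generic illustration, not the paper's FPU-extraction variant:

```python
import sys

def sum_with_running_error(xs):
    """Recursive summation with a running a-posteriori error bound.
    After each addition we accumulate |partial sum|; to first order,
    u * mu then bounds the total rounding error, where u is the unit
    roundoff. The bound uses the actual intermediate quantities rather
    than a-priori worst cases, so it is typically much sharper."""
    u = sys.float_info.epsilon / 2.0  # unit roundoff for double precision
    s, mu = 0.0, 0.0
    for x in xs:
        s += x          # the operation whose rounding error we track
        mu += abs(s)    # error generated by this addition, to first order
    return s, u * mu
```

The FPU-supported variant of the paper would replace the `abs(s)` term (the error-generating part) with the rounding error actually extracted from the floating-point unit, tightening the bound further.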
1987 DOE review: First collider run operation
Energy Technology Data Exchange (ETDEWEB)
Childress, S.; Crawford, J.; Dugan, G.; Edwards, H.; Finley, D.A.; Fowler, W.B.; Harrison, M.; Holmes, S.; Makara, J.N.; Malamud, E.
1987-05-01
This review covers the operations of the first run of the 1.8 TeV superconducting super collider. The papers enclosed cover: PBAR source status, fixed target operation, Tevatron cryogenic reliability and capacity upgrade, Tevatron energy upgrade progress and plans, status of the D0 low-beta insertion, 1.8 K and 4.7 K refrigeration for low-beta quadrupoles, progress and plans for the LINAC and booster, and near term and long term performance improvements.
Xiamen Runs Faster with Marathon Competition
Institute of Scientific and Technical Information of China (English)
MaZhijuan; YeShaojin
2005-01-01
On March 26, 15,920 contestants were running on Xiamen's Island Loop Road, dubbed the world's most beautiful race lane. Along the 42.195-kilometer route, some 300 thousand spectators shouted and applauded for the runners. That was the Third Xiamen International Marathon Competition, called by local people "a big festival". There has never been a sports event that made the city so enthusiastic.
The anatomy and biomechanics of running.
Nicola, Terry L; Jewison, David J
2012-04-01
To understand the normal series of biomechanical events of running, a comparative assessment against walking is helpful. The closed kinetic chain through the lower extremities, control of the lumbopelvic mechanism, and overall symmetry of movement have been described well enough that deviations from normal movement can now be associated with specific overuse injuries experienced by runners. This information, in combination with a history of errors in the runner's training program, will lead to a more comprehensive treatment and prevention plan for related injuries.
Footwear and running cardio-respiratory responses.
Rubin, D A; Butler, R J; Beckman, B; Hackney, A C
2009-05-01
This study compared cardio-respiratory responses during running wearing a motion control shoe (MC) or a cushioning shoe (CU) in a cross-over single-blinded design. Fourteen runners (10F/4M, age=27.3+/-5.1 years, body mass=64.1+/-12.2 kg, height=167.8+/-7.5 cm, VO(2)max=52.3+/-8.8 ml/kg/min) completed a 40-min run at approximately 65% VO(2)max under both shoe conditions. Oxygen uptake (mL/kg/min; L/min), minute ventilation (L/min), respiratory exchange ratio, and heart rate were measured at minutes 8-10, 18-20, 28-30 and 38-40 of exercise. Rating of perceived exertion was obtained at minutes 10, 20, 30 and 40. Two (footwear) by four (time) repeated-measures ANOVAs showed no differences between footwear conditions in overall oxygen consumption (MC=36.8+/-1.5 vs. CU=35.3+/-1.4 mL/kg/min, p=0.143), minute ventilation (MC=50.4+/-4 vs. CU=48.5+/-3.8, p=0.147), respiratory exchange ratio (MC=0.90+/-0.01 vs. CU=0.89+/-0.01, p=0.331), heart rate (MC=159+/-3 vs. CU=160+/-3, p=0.926), or rating of perceived exertion. The design of motion control footwear does not appear to affect cardio-respiratory or perceived exertion responses during submaximal running. The findings are specific to the shoes tested. Nonetheless, the outcomes suggest that footwear selection to reduce certain overuse injuries does not increase the work of running.
Proposal for a running coupling JIMWLK equation
Lappi, T
2014-01-01
In the CGC framework the initial stages of a heavy ion collision at high energy are described as "glasma" field configurations. The initial condition for these evolving fields depends, in the CGC effective theory, on a probability distribution for color charges. The energy dependence of this distribution can be calculated from the JIMWLK renormalization group equation. We discuss recent work on a practical implementation of the running coupling constant in the Langevin method of solving the JIMWLK equation.
The Aerodynamic Signature of Running Spiders
Jérôme Casas; Thomas Steinmann; Olivier Dangles
2008-01-01
Many predators display two foraging modes, an ambush strategy and a cruising mode. These foraging strategies have been classically studied in energetic, biomechanical and ecological terms, without considering the role of signals produced by predators and perceived by prey. Wolf spiders are a typical example; they hunt in leaf litter either using an ambush strategy or by moving at high speed, taking over unwary prey. Air flow upstream of running spiders is a source of i...
Running with a powered knee and ankle prosthesis.
Shultz, Amanda H; Lawson, Brian E; Goldfarb, Michael
2015-05-01
This paper presents a running control architecture for a powered knee and ankle prosthesis that enables a transfemoral amputee to run with a biomechanically appropriate running gait and to intentionally transition between a walking and running gait. The control architecture consists firstly of a coordination level controller, which provides gait biomechanics representative of healthy running, and secondly of a gait selection controller that enables the user to intentionally transition between a running and walking gait. The running control architecture was implemented on a transfemoral prosthesis with powered knee and ankle joints, and the efficacy of the controller was assessed in a series of running trials with a transfemoral amputee subject. Specifically, treadmill trials were conducted to assess the extent to which the coordination controller provided a biomechanically appropriate running gait. Separate trials were conducted to assess the ability of the user to consistently and reliably transition between walking and running gaits.
CMS Strip Detector: Operational Experience and Run1 to Run2 Transition
Butz, Erik Manuel
2014-01-01
The CMS silicon strip tracker is the largest silicon detector ever built. It has an active area of 200 m² of silicon segmented into almost 10 million readout channels. We describe some operational aspects of the system during its first years of operation in LHC Run 1. During Long Shutdown 1 of the LHC, an extensive work program was carried out on the strip tracker services in order to facilitate operation of the system at sub-zero temperatures in LHC Run 2 and beyond. We describe these efforts and motivate the choice of Run 2 operating temperature. Finally, a brief outlook on the operation of the system in the upcoming Run 2 is given.
Running vacuum cosmological models: linear scalar perturbations
Perico, E. L. D.; Tamayo, D. A.
2017-08-01
In cosmology, phenomenologically motivated expressions for running vacuum are commonly parameterized as linear functions, typically denoted by $\Lambda(H^2)$ or $\Lambda(R)$. Such models assume a vacuum equation of state $\bar{P}_\Lambda = -\bar{\rho}_\Lambda$, relating the background vacuum pressure $\bar{P}_\Lambda$ to its mean energy density $\bar{\rho}_\Lambda \equiv \Lambda/8\pi G$. This equation of state suggests that the vacuum dynamics is due to an interaction with the matter content of the universe. Most approaches studying the observational impact of these models only consider the interaction between the vacuum and the transient dominant matter component of the universe. We extend such models by assuming that the running vacuum is the sum of independent contributions, $\bar{\rho}_\Lambda = \sum_i \bar{\rho}_{\Lambda i}$, where each vacuum component $\Lambda_i$ is associated with, and interacts with, one of the matter components at both the background and perturbation levels. We derive the evolution equations for the linear scalar vacuum and matter perturbations in those two scenarios, and identify the running vacuum imprints on the cosmic microwave background anisotropies as well as on the matter power spectrum. In the $\Lambda(H^2)$ scenario the vacuum is coupled with every matter component, whereas the $\Lambda(R)$ description only leads to a coupling between vacuum and non-relativistic matter, producing different effects on the matter power spectrum.
The Run-Up of Subduction Zones
Riquelme, S.; Bravo, F. J.; Fuentes, M.; Matias, M.; Medina, M.
2016-12-01
Large earthquakes in subduction zones are liable to produce tsunamis that can cause destruction and fatalities. Run-up is a geophysical parameter that quantifies damage and indicates whether critical facilities or populations are exposed. Here we use the coupling for certain subduction regions, measured by different techniques (potency and GPS observations), to define areas where large earthquakes can occur. Taking the Slab 1.0 model from the United States Geological Survey (USGS), we can define the geometry of each area, including its tsunamigenic potential. Using stochastic earthquake sources for each area at its maximum tsunamigenic potential, we calculate the numerical and analytical run-up for each case. We then perform a statistical analysis and calculate the envelope for both methods. Furthermore, we build a risk index using the slope closest to the shore in a piecewise-linear approach (the "last slope" criterion) and the outputs from tsunami modeling. Results show that some areas are prone to produce higher run-up than others, based on the size of the earthquake, geometrical constraints of the source, the tectonic setting, and the coastal last slope. Based on these results, zones with a low risk index can define escape routes or secure coastal areas for tsunami early warning and for urban planning purposes when detailed data are available.
Constructing predictive models of human running.
Maus, Horst-Moritz; Revzen, Shai; Guckenheimer, John; Ludwig, Christian; Reger, Johann; Seyfarth, Andre
2015-02-06
Running is an essential mode of human locomotion, during which ballistic aerial phases alternate with phases when a single foot contacts the ground. The spring-loaded inverted pendulum (SLIP) provides a starting point for modelling running, and generates ground reaction forces that resemble those of the centre of mass (CoM) of a human runner. Here, we show that while SLIP reproduces within-step kinematics of the CoM in three dimensions, it fails to reproduce stability and predict future motions. We construct SLIP control models using data-driven Floquet analysis, and show how these models may be used to obtain predictive models of human running with six additional states comprising the position and velocity of the swing-leg ankle. Our methods are general, and may be applied to any rhythmic physical system. We provide an approach for identifying an event-driven linear controller that approximates an observed stabilization strategy, and for producing a reduced-state model which closely recovers the observed dynamics. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
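The SLIP stance phase described above can be sketched in a few lines of numerical integration: a point mass on a massless spring whose foot stays fixed during stance. All parameters and the touchdown state below are illustrative choices, not values fitted to the study's data:

```python
import numpy as np

# Minimal stance-phase integration of the spring-loaded inverted pendulum
# (SLIP). Mass, stiffness, rest length, and touchdown state are
# illustrative, not from the paper.
m, k, L0, g = 80.0, 20000.0, 1.0, 9.81   # mass (kg), stiffness (N/m), rest length (m)
pos = np.array([-0.15, 0.985])           # CoM position relative to the foot (m)
vel = np.array([2.0, -0.8])              # touchdown velocity (m/s)
dt = 1e-4
grf = []                                  # vertical ground reaction force trace
for _ in range(50_000):                   # hard cap; stance ends far sooner
    L = np.linalg.norm(pos)
    if L >= L0 and vel[1] > 0:            # leg back at rest length, moving up
        break                             # -> take-off
    f_spring = k * (L0 - L) * pos / L     # spring force directed along the leg
    acc = f_spring / m + np.array([0.0, -g])
    vel = vel + acc * dt                  # semi-implicit Euler step
    pos = pos + vel * dt
    grf.append(f_spring[1])
peak_grf = max(grf)                       # single hump, well above body weight
```

The vertical force trace rises and falls in a single hump, the CoM force pattern SLIP is known to reproduce; the stability and step-to-step prediction that the paper addresses require the additional swing-leg states and Floquet analysis it describes.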
Towards a measurement of the spectral runnings
Muñoz, Julian B; Raccanelli, Alvise; Kamionkowski, Marc; Silk, Joseph
2016-01-01
Single-field slow-roll inflation predicts a nearly scale-free power spectrum of perturbations, as observed at the scales accessible to current cosmological experiments. This spectrum is slightly red, showing a tilt $(1-n_s)\sim 0.04$. A direct consequence of this tilt is the nonvanishing runnings $\alpha_s=\mathrm{d} n_s/\mathrm{d}\log k$ and $\beta_s=\mathrm{d}\alpha_s/\mathrm{d}\log k$, which in the minimal inflationary scenario should reach absolute values of $10^{-3}$ and $10^{-5}$, respectively. In this work we calculate how well future surveys can measure these two runnings. We consider a Stage-4 (S4) CMB experiment and show that it will be able to detect significant deviations from the inflationary prediction for $\alpha_s$, although not for $\beta_s$. Adding to the S4 CMB experiment the information from a WFIRST-like, a DESI-like, or a SKA-like galaxy survey improves the sensitivity to the runnings by $\sim$5\%, 15\%, and 25\%, respectively. A spectroscopic survey with a billion objects, such as SKA2, will...
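For reference, the tilt and the two runnings defined above are the coefficients of the standard Taylor expansion of the primordial power spectrum in $\log k$ about a pivot scale $k_*$ (the pivot is an assumed convention, not specified in the abstract):

```latex
\ln \mathcal{P}_\zeta(k) = \ln A_s + (n_s - 1)\ln\frac{k}{k_*}
  + \frac{\alpha_s}{2}\ln^2\frac{k}{k_*}
  + \frac{\beta_s}{6}\ln^3\frac{k}{k_*},
\qquad
\alpha_s = \frac{\mathrm{d} n_s}{\mathrm{d}\ln k},\quad
\beta_s  = \frac{\mathrm{d} \alpha_s}{\mathrm{d}\ln k}.
```

In slow-roll models each successive coefficient is suppressed by further powers of the slow-roll parameters, which is why $|\alpha_s|\sim 10^{-3}$ and $|\beta_s|\sim 10^{-5}$ in the minimal scenario.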
The energetics of ultra-endurance running.
Lazzer, Stefano; Salvadego, Desy; Rejc, Enrico; Buglione, Antonio; Antonutto, Guglielmo; di Prampero, Pietro Enrico
2012-05-01
Our objective was to determine the effects of long-lasting endurance events on the energy cost of running (C(r)), and the role of maximal oxygen uptake (VO(2max)), its fractional utilisation (F) and C(r) in determining performance. Ten healthy runners (age range 26-59 years) participated in an ultra-endurance competition consisting of three running laps of 22, 48 and 20 km on three consecutive days in the north-east of Italy. Anthropometric characteristics and VO(2max), by a graded exercise test on a treadmill, were determined 5 days before and 5 days after the competition. In addition, C(r) was determined on a treadmill before and after each running lap. Heart rate (HR) was recorded throughout the three laps. Results revealed that the mean C(r) of the individual laps did not increase significantly with lap number (P = 0.200), thus ruling out any chronic lap effect. Even so, at the end of lap 3, C(r) was 18.0% higher than its pre-competition value. These results indicate that: (1) an increase of C(r-mean) during the competition leads to a marked worsening of performance, and (2) the three variables F, VO(2max) and C(r-mean), combined as described above, explain 87% of the total variance in competition time.
The aerodynamic signature of running spiders.
Directory of Open Access Journals (Sweden)
Jérôme Casas
Many predators display two foraging modes, an ambush strategy and a cruising mode. These foraging strategies have been classically studied in energetic, biomechanical and ecological terms, without considering the role of signals produced by predators and perceived by prey. Wolf spiders are a typical example; they hunt in leaf litter either using an ambush strategy or by moving at high speed, taking over unwary prey. Air flow upstream of running spiders is a source of information for escaping prey, such as crickets and cockroaches. However, air displacement by running arthropods has not been previously examined. Here we show, using digital particle image velocimetry, that running spiders are highly conspicuous aerodynamically, due to substantial air displacement detectable up to several centimetres in front of them. This study explains the bimodal distribution of spiders' foraging modes in terms of sensory ecology and is consistent with the escape distances and speeds of cricket prey. These findings may be relevant to the large and diverse array of arthropod prey-predator interactions in leaf litter.
The aerodynamic signature of running spiders.
Casas, Jérôme; Steinmann, Thomas; Dangles, Olivier
2008-05-07
Many predators display two foraging modes, an ambush strategy and a cruising mode. These foraging strategies have been classically studied in energetic, biomechanical and ecological terms, without considering the role of signals produced by predators and perceived by prey. Wolf spiders are a typical example; they hunt in leaf litter either using an ambush strategy or by moving at high speed, taking over unwary prey. Air flow upstream of running spiders is a source of information for escaping prey, such as crickets and cockroaches. However, air displacement by running arthropods has not been previously examined. Here we show, using digital particle image velocimetry, that running spiders are highly conspicuous aerodynamically, due to substantial air displacement detectable up to several centimetres in front of them. This study explains the bimodal distribution of spiders' foraging modes in terms of sensory ecology and is consistent with the escape distances and speeds of cricket prey. These findings may be relevant to the large and diverse array of arthropod prey-predator interactions in leaf litter.
Split-phase motor running as capacitor starts motor and as capacitor run motor
2016-01-01
In this paper, the input parameters of a single-phase split-phase induction motor are taken to investigate and study the output performance characteristics of capacitor-start and capacitor-run induction motors. The values of these input parameters are used in the design characteristics of the capacitor-run and capacitor-start motor, with each motor connected to a rated or standard capacitor in series with the auxiliary (starting) winding for normal operating conditions. The ma...
Prophylactic ankle taping: influence on treadmill-running kinematics and running economy.
Paulson, Sally; Braun, William A
2014-02-01
Prophylactic ankle taping (PAT) is commonly used in sport. Prophylactic ankle taping may restrict ankle motion, which would affect the kinetic chain and alter gait. The purpose of this study was to examine the effects of PAT on lower extremity (LE) kinematics and running economy during treadmill running. Twelve recreational runners (9 women, 3 men; M ± SD age = 31.33 ± 8.04 years, height = 1.67 ± 0.81 m, mass = 61.84 ± 9.38 kg) completed two 20-minute running sessions (PAT and no tape: control [CON]) at a self-selected pace. Before each run, reflective markers were placed along the right side of the body. Sagittal plane kinematic data (60 Hz) were captured 4 times, and expired gases were measured for 2 minutes after each video capture during both trials. Stride frequency, stride length, and LE kinematic variables at initial contact and end contact (EC) were calculated. Cardiorespiratory variables and heart rate were also measured. Running economy was normalized to oxygen uptake per unit body mass per kilometer (milliliter per kilogram per kilometer) as running speeds varied. At EC, the PAT hip angle significantly decreased (p = 0.01) by 3.82°, whereas CON decreased by 0.85°. The range of motion tended to decrease over the 20-minute run (p = 0.08). Heart rate significantly increased over time (6.7%) but was not different between conditions. Prophylactic ankle taping did not significantly affect the physiological measures associated with the metabolic cost of treadmill running or the other kinematic variables. These findings suggest that the hip angle continued to decrease during the PAT condition at push-off in recreational runners without impacting the metabolic cost of transport.
Peak treadmill running velocity during the VO2 max test predicts running performance.
Noakes, T D; Myburgh, K H; Schall, R
1990-01-01
Twenty specialist marathon runners and 23 specialist ultra-marathon runners underwent maximal exercise testing to determine the relative value of maximum oxygen consumption (VO2max), peak treadmill running velocity, running velocity at the lactate turnpoint, VO2 at 16 km h-1, % VO2max at 16 km h-1, and running time in other races, for predicting performance in races of 10-90 km. Race time at 10 or 21.1 km was the best predictor of performance at 42.2 km in specialist marathon runners and at 42.2 and 90 km in specialist ultra-marathon runners (r = 0.91-0.97). Peak treadmill running velocity was the best laboratory-measured predictor of performance (r = -0.88 to -0.94) at all distances in ultra-marathon specialists and at all distances except 42.2 km in marathon specialists. Other predictive variables were running velocity at the lactate turnpoint (r = -0.80 to -0.92); % VO2max at 16 km h-1 (r = 0.76-0.90) and VO2max (r = 0.55-0.86). Peak blood lactate concentrations (r = 0.68-0.71) and VO2 at 16 km h-1 (r = 0.10-0.61) were less good predictors. These data indicate: (i) that in groups of trained long distance runners, the physiological factors that determine success in races of 10-90 km are the same; thus there may not be variables that predict success uniquely in either 10 km, marathon or ultra-marathon runners, and (ii) that peak treadmill running velocity is at least as good a predictor of running performance as is the lactate turnpoint. Factors that determine the peak treadmill running velocity are not known but are not likely to be related to maximum rates of muscle oxygen utilization.
Oil shale project run summary for small retort Run S-10
Energy Technology Data Exchange (ETDEWEB)
Ackerman, F.J.; Sandholtz, W.A.; Raley, J.H.; Laswell, B.H. (eds.)
1978-06-01
A combustion run using sidewall heaters to control heat loss and computer control to set heater power was conducted to study the effectiveness of the heater control system, to compare results with a one-dimensional retort model when radial heat loss is not significant, and to determine the effects of recycling off-gas to the retort (by comparison with future runs). It is concluded that adequate simulation of in-situ processing in laboratory retorts requires control of heat losses. (JRD)
Warm-up with a weighted vest improves running performance via leg stiffness and running economy.
Barnes, K R; Hopkins, W G; McGuigan, M R; Kilding, A E
2015-01-01
To determine the effects of "strides" with a weighted-vest during a warm-up on endurance performance and its potential neuromuscular and metabolic mediators. A bout of resistance exercise can enhance subsequent high-intensity performance, but little is known about such priming exercise for endurance performance. A crossover with 5-7 days between an experimental and control trial was performed by 11 well-trained distance runners. Each trial was preceded by a warm-up consisting of a 10-min self-paced jog, a 5-min submaximal run to determine running economy, and six 10-s strides with or without a weighted-vest (20% of body mass). After a 10-min recovery period, runners performed a series of jumps to determine leg stiffness and other neuromuscular characteristics, another 5-min submaximal run, and an incremental treadmill test to determine peak running speed. Clinical and non-clinical forms of magnitude-based inference were used to assess outcomes. Correlations and linear regression were used to assess relationships between performance and underlying measures. The weighted-vest condition resulted in a very-large enhancement of peak running speed (2.9%; 90% confidence limits ±0.8%), a moderate increase in leg stiffness (20.4%; ±4.2%) and a large improvement in running economy (6.0%; ±1.6%); there were also small-moderate clear reductions in cardiorespiratory measures. Relationships between change scores showed that changes in leg stiffness could explain all the improvements in performance and economy. Strides with a weighted-vest have a priming effect on leg stiffness and running economy. It is postulated the associated major effect on peak treadmill running speed will translate into enhancement of competitive endurance performance. Copyright © 2013 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Hybrid Monte Carlo with Chaotic Mixing
Kadakia, Nirag
2016-01-01
We propose a hybrid Monte Carlo (HMC) technique applicable to high-dimensional multivariate normal distributions that effectively samples along chaotic trajectories. The method is predicated on the freedom of choice of the HMC momentum distribution, and due to its mixing properties, exhibits sample-to-sample autocorrelations that decay far faster than those in the traditional hybrid Monte Carlo algorithm. We test the methods on distributions of varying correlation structure, finding that the proposed technique produces superior covariance estimates, is less reliant on step-size tuning, and can even function with sparse or no momentum re-sampling. The method presented here is promising for more general distributions, such as those that arise in Bayesian learning of artificial neural networks and in the state and parameter estimation of dynamical systems.
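For context, the baseline the chaotic-mixing variant builds on is standard HMC with Gaussian momenta and a leapfrog integrator. The sketch below targets a 2-D standard normal with illustrative step size and trajectory length; it is the conventional algorithm, not the chaotic-momentum method of the paper:

```python
import numpy as np

# Standard hybrid/Hamiltonian Monte Carlo on a 2-D standard normal.
# Illustrative sketch of the baseline algorithm; the paper's method
# replaces the Gaussian momentum distribution to induce chaotic mixing.
rng = np.random.default_rng(1)

def U(q):                       # potential: -log density of N(0, I)
    return 0.5 * q @ q

def grad_U(q):
    return q

def hmc_step(q, eps=0.2, n_leap=10):
    p = rng.normal(size=q.shape)                 # resample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new -= 0.5 * eps * grad_U(q_new)           # leapfrog: half momentum step
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)           # final half momentum step
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    return q_new if rng.random() < np.exp(-dH) else q   # Metropolis accept

q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q)
samples = np.array(samples)
# marginal means near 0 and variances near 1 for N(0, I)
```

The sample-to-sample autocorrelation of chains like this one is exactly what the proposed chaotic momentum distribution is designed to reduce.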
Monte Carlo study of real time dynamics
Alexandru, Andrei; Bedaque, Paulo F; Vartak, Sohan; Warrington, Neill C
2016-01-01
Monte Carlo studies involving real time dynamics are severely restricted by the sign problem that emerges from the highly oscillatory phase of the path integral. In this letter, we present a new method to compute real time quantities on the lattice using the Schwinger-Keldysh formalism via Monte Carlo simulations. The key idea is to deform the path integration domain to a complex manifold where the phase oscillations are mild and the sign problem is manageable. We use the previously introduced "contraction algorithm" to create a Markov chain on this alternative manifold. We substantiate our approach by analyzing the quantum mechanical anharmonic oscillator. Our results are in agreement with the exact ones obtained by diagonalization of the Hamiltonian. The method we introduce is generic and in principle applicable to quantum field theory, albeit very slow. We discuss some possible improvements that should speed up the algorithm.
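The severity of the sign problem can be seen in a toy average (this is an illustration of the problem itself, not of the paper's contour-deformation method): for $x \sim N(0,1)$ the oscillatory average $\langle e^{i\lambda x}\rangle = e^{-\lambda^2/2}$ decays rapidly with $\lambda$, while the Monte Carlo noise on the phase does not, so the relative error explodes:

```python
import numpy as np

# Toy sign problem: the signal <exp(i*lam*x)> decays like exp(-lam^2/2),
# but the sampling noise on the unit-modulus phase stays O(1/sqrt(N)),
# so the relative error grows exponentially with lam.
rng = np.random.default_rng(0)
x = rng.normal(size=200_000)
for lam in (1.0, 3.0, 6.0):
    phase = np.exp(1j * lam * x)
    est, exact = phase.mean(), np.exp(-lam**2 / 2)
    # ratio of statistical noise on the mean to the true (tiny) signal
    rel_err_scale = phase.std() / (np.sqrt(x.size) * exact)
```

At `lam = 6` the exact answer is ~1.5e-8 while the noise floor is ~2e-3: the estimate is pure noise. Deforming the integration contour, as in the paper, tames exactly this oscillation.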
Multilevel sequential Monte-Carlo samplers
Jasra, Ajay
2016-01-05
Multilevel Monte-Carlo methods provide a powerful technique for reducing the computational cost of estimating expectations to a given accuracy. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
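The telescoping-sum idea behind multilevel estimators can be illustrated on a toy SDE. The sketch below estimates E[X_T] for geometric Brownian motion using Euler schemes whose levels are coupled through shared Brownian increments; all parameters and per-level sample counts are illustrative, with no adaptive level selection:

```python
import numpy as np

# Multilevel Monte Carlo sketch: E[X_T] for dX = mu*X dt + sig*X dW,
# Euler step T/2^l on level l, fine/coarse pairs coupled by sharing
# Brownian increments. Exact answer: X0*exp(mu*T).
rng = np.random.default_rng(2)
mu, sig, X0, T = 0.05, 0.2, 1.0, 1.0

def euler_pair(l, n_paths):
    """Terminal values on level l (fine) and level l-1 (coarse)."""
    nf = 2 ** l
    dt = T / nf
    dW = rng.normal(scale=np.sqrt(dt), size=(n_paths, nf))
    Xf = np.full(n_paths, X0)
    for i in range(nf):
        Xf = Xf * (1 + mu * dt + sig * dW[:, i])
    if l == 0:
        return Xf, np.zeros(n_paths)
    Xc = np.full(n_paths, X0)
    dWc = dW[:, 0::2] + dW[:, 1::2]     # coarse increments from fine ones
    for i in range(nf // 2):
        Xc = Xc * (1 + mu * 2 * dt + sig * dWc[:, i])
    return Xf, Xc

# telescoping sum: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}];
# many cheap coarse samples, few expensive fine ones
levels = [(0, 200_000), (1, 50_000), (2, 12_500), (3, 4_000)]
est = 0.0
for l, n in levels:
    f, c = euler_pair(l, n)
    est += (f - c).mean()
exact = X0 * np.exp(mu * T)
```

Because the coupled difference `f - c` has small variance, the correction levels need far fewer samples than the base level — the cost saving the abstract refers to, and the coupling that SMC resamplers supply in the multilevel SMC setting.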
Monte Carlo Simulation for Particle Detectors
Pia, Maria Grazia
2012-01-01
Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...
An enhanced Monte Carlo outlier detection method.
Zhang, Liangxiao; Li, Peiwu; Mao, Jin; Ma, Fei; Ding, Xiaoxia; Zhang, Qi
2015-09-30
Outlier detection is crucial in building a highly predictive model. In this study, we proposed an enhanced Monte Carlo outlier detection method by establishing cross-prediction models based on determinate normal samples and analyzing the distribution of prediction errors individually for dubious samples. One simulated and three real datasets were used to illustrate and validate the performance of our method, and the results indicated that this method outperformed Monte Carlo outlier detection in outlier diagnosis. After these outliers were removed, the root mean square error of prediction for validation by Kovats retention indices decreased from 3.195 to 1.655, and the average cross-validation prediction error decreased from 2.0341 to 1.2780. This method helps establish a good model by eliminating outliers. © 2015 Wiley Periodicals, Inc.
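The core Monte Carlo outlier-detection idea — repeated random train/test splits, then inspection of each sample's out-of-split prediction errors — can be sketched as follows. The data, model (plain least squares), and flagging threshold are synthetic and illustrative, not the enhanced procedure of the paper:

```python
import numpy as np

# Monte Carlo outlier diagnosis sketch: many random splits, collect each
# sample's out-of-split absolute prediction errors, flag samples whose
# mean error is far from the rest. Synthetic data; threshold illustrative.
rng = np.random.default_rng(3)
n, p = 60, 3
X = rng.normal(size=(n, p))
beta = np.array([2.0, -1.0, 0.5])
y = X @ beta + 0.1 * rng.normal(size=n)
y[5] += 3.0                           # implant one gross outlier at index 5

errs = [[] for _ in range(n)]
for _ in range(500):
    idx = rng.permutation(n)
    train, test = idx[:40], idx[40:]
    coef, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
    for i, e in zip(test, np.abs(y[test] - X[test] @ coef)):
        errs[i].append(e)

mean_err = np.array([np.mean(e) for e in errs])
flagged = np.where(mean_err > mean_err.mean() + 3 * mean_err.std())[0]
```

The paper's enhancement goes further: cross-prediction models are built from clearly normal samples only, and the full error distribution (not just its mean) is examined for each dubious sample.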
Composite biasing in Monte Carlo radiative transfer
Baes, Maarten; Lunttila, Tuomas; Bianchi, Simone; Camps, Peter; Juvela, Mika; Kuiper, Rolf
2016-01-01
Biasing or importance sampling is a powerful technique in Monte Carlo radiative transfer, and can be applied in different forms to increase the accuracy and efficiency of simulations. One of the drawbacks of the use of biasing is the potential introduction of large weight factors. We discuss a general strategy, composite biasing, to suppress the appearance of large weight factors. We use this composite biasing approach for two different problems faced by current state-of-the-art Monte Carlo radiative transfer codes: the generation of photon packages from multiple components, and the penetration of radiation through high optical depth barriers. In both cases, the implementation of the relevant algorithms is trivial and does not interfere with any other optimisation techniques. Through simple test models, we demonstrate the general applicability, accuracy and efficiency of the composite biasing approach. In particular, for the penetration of high optical depths, the gain in efficiency is spectacular for the spe...
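A simple way to see how a composite (mixture) proposal suppresses large weight factors: sampling from m(x) = 0.5·p(x) + 0.5·q(x) bounds the weight w = p/m by 2, whereas a single biased proposal q can produce arbitrarily large weights. The target below (a Gaussian tail probability) and all parameters are illustrative, not a radiative-transfer calculation:

```python
import numpy as np
from math import erfc, sqrt

# Defensive/composite biasing sketch: estimate P(X > 4) for X ~ N(0,1)
# by sampling from the mixture m = 0.5*N(0,1) + 0.5*N(4,1). The weight
# w = p/m is bounded by 2 by construction.
rng = np.random.default_rng(4)
N = 100_000

def p_pdf(x):  return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
def q_pdf(x):  return np.exp(-(x - 4)**2 / 2) / np.sqrt(2 * np.pi)

comp = rng.random(N) < 0.5                      # pick mixture component
x = np.where(comp, rng.normal(0.0, 1.0, N), rng.normal(4.0, 1.0, N))
w = p_pdf(x) / (0.5 * p_pdf(x) + 0.5 * q_pdf(x))   # bounded by 2
est = np.mean(w * (x > 4))
exact = 0.5 * erfc(4 / sqrt(2))                 # exact tail probability
max_w = w.max()                                 # never exceeds 2
```

Keeping part of the mixture on the nominal distribution is what caps the weights; the paper applies the same principle to photon-package generation from multiple components and to high optical depth penetration.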
Multilevel Monte Carlo Approaches for Numerical Homogenization
Efendiev, Yalchin R.
2015-10-01
In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
Monte Carlo simulations on SIMD computer architectures
Energy Technology Data Exchange (ETDEWEB)
Burmester, C.P.; Gronsky, R. [Lawrence Berkeley Lab., CA (United States); Wille, L.T. [Florida Atlantic Univ., Boca Raton, FL (United States). Dept. of Physics
1992-03-01
Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
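The lattice-partitioning idea maps directly onto data-parallel array operations: colouring the lattice as a checkerboard lets all sites of one colour be updated simultaneously, since their neighbours all have the other colour. This numpy sketch is an illustration of that scheme (vectorized rather than on a real SIMD processor array; parameters are illustrative):

```python
import numpy as np

# Data-parallel Metropolis sweep for the 2-D Ising model using a
# checkerboard partition: same-colour sites have no shared bonds, so a
# whole colour can be updated in one vectorized step.
rng = np.random.default_rng(5)
L, beta = 32, 0.3                 # small lattice, above T_c (disordered)
spin = rng.choice([-1, 1], size=(L, L))
ii, jj = np.indices((L, L))
colour = (ii + jj) % 2

def sweep(spin):
    for c in (0, 1):
        nb = (np.roll(spin, 1, 0) + np.roll(spin, -1, 0)
              + np.roll(spin, 1, 1) + np.roll(spin, -1, 1))
        dE = 2 * spin * nb        # energy change if this spin flips
        flip = (colour == c) & (rng.random((L, L)) < np.exp(-beta * dE))
        spin[flip] *= -1
    return spin

for _ in range(200):
    spin = sweep(spin)
m = abs(spin.mean())              # small in the disordered phase
```

On a processor array the same decomposition assigns one lattice patch per processor, with communication only across patch boundaries.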
Measurements for improvement of running capacity. : Physiological and biomechanical evaluations
Gullstrand, Lennart
2009-01-01
Introduction: Running is included in a large number of sports and is one of the most well-investigated modes of locomotion in both physiology and biomechanics. This thesis focuses on how some new methods from both areas may be used to capture running capacity in middle-distance and distance running from laboratory and field recordings. Measurement of running economy is included and defined as oxygen uptake at a given submaximal velocity in a steady-state condition. Running economy...
Effects of a minimalist shoe on running economy and 5-km running performance.
Fuller, Joel T; Thewlis, Dominic; Tsiros, Margarita D; Brown, Nicholas A T; Buckley, Jonathan D
2016-09-01
The purpose of this study was to determine if minimalist shoes improve time trial performance of trained distance runners and if changes in running economy, shoe mass, stride length, stride rate and footfall pattern were related to any difference in performance. Twenty-six trained runners performed three 6-min sub-maximal treadmill runs at 11, 13 and 15 km·h(-1) in minimalist and conventional shoes while running economy, stride length, stride rate and footfall pattern were assessed. They then performed a 5-km time trial. In the minimalist shoe, runners completed the trial in less time (effect size 0.20 ± 0.12), were more economical during sub-maximal running (effect size 0.33 ± 0.14), decreased stride length (effect size 0.22 ± 0.10) and increased stride rate (effect size 0.22 ± 0.11). All but one runner ran with a rearfoot footfall in the minimalist shoe. Improvements in time trial performance were associated with improvements in running economy at 15 km·h(-1) (r = 0.58), with 79% of the improved economy accounted for by reduced shoe mass. These findings indicate that minimalist shoes can improve running economy and 5-km running performance.
Why forefoot striking in minimal shoes might positively change the course of running injuries
Directory of Open Access Journals (Sweden)
Irene S. Davis
2017-06-01
It is believed that human ancestors evolved the ability to run bipedally approximately 2 million years ago. This form of locomotion may have been important to our survival and likely has influenced the evolution of our body form. As our bodies have adapted to run, it seems unusual that up to 79% of modern day runners are injured annually. The etiology of these injuries is clearly multifactorial. However, one aspect of running that has significantly changed over the past 50 years is the footwear we use. Modern running shoes have become increasingly cushioned and supportive, and have changed the way we run. In particular, they have altered our footstrike pattern from a predominantly forefoot strike (FFS) landing to a predominantly rearfoot strike (RFS) landing. This change alters the way in which the body is loaded and may be contributing to the high rate of injuries runners experience while engaged in an activity for which they were adapted. In this paper, we will examine the benefits of barefoot running (typically an FFS pattern), and compare the lower extremity mechanics between FFS and RFS. The implications of these mechanical differences, in terms of injury, will be discussed. We will then provide evidence to support our contention that FFS provides an optimal mechanical environment for specific foot and ankle structures, such as the heel pad, the plantar fascia, and the Achilles tendon. The importance of footwear will then be addressed, highlighting its interaction with strike pattern on mechanics. This analysis will underscore why footwear matters when assessing mechanics. Finally, proper preparation and safe transition to an FFS pattern in minimal shoes will be emphasized. Through the discussion of the current literature, we will develop a justification for returning to running in the way for which we were adapted to reduce running-related injuries.
Inhomogeneous Monte Carlo simulations of dermoscopic spectroscopy
Gareau, Daniel S.; Li, Ting; Jacques, Steven; Krueger, James
2012-03-01
Clinical skin-lesion diagnosis uses dermoscopy: 10X epiluminescence microscopy. Skin appearance ranges from black to white with shades of blue, red, gray and orange. Color is an important diagnostic criterion for diseases including melanoma. Melanin and blood content and distribution impact the diffuse spectral remittance (300-1000 nm). Skin layers (immersion medium, stratum corneum, spinous epidermis, basal epidermis and dermis), as well as laterally asymmetric features (e.g. melanocytic invasion), were modeled in an inhomogeneous Monte Carlo model.
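The elementary step underlying such photon-transport models can be sketched for a single homogeneous slab: sample a free path from the total attenuation, then absorb or scatter. For brevity the sketch scatters isotropically and uses illustrative optical coefficients; real skin models use layered geometry and anisotropic (e.g. Henyey-Greenstein) phase functions:

```python
import numpy as np

# Bare-bones photon Monte Carlo in a homogeneous 1-D slab: exponential
# free paths from the total attenuation mu_t, absorption with probability
# mu_a/mu_t per interaction, isotropic scattering otherwise.
# Coefficients are illustrative, not tissue values from the paper.
rng = np.random.default_rng(6)
mu_a, mu_s, d = 0.5, 5.0, 1.0        # absorption, scattering (1/mm), slab (mm)
mu_t = mu_a + mu_s
n_photon = 20_000
reflected = transmitted = absorbed = 0
for _ in range(n_photon):
    z, uz = 0.0, 1.0                 # launch at surface, heading into slab
    while True:
        z += uz * (-np.log(1 - rng.random()) / mu_t)   # sampled free path
        if z < 0:
            reflected += 1; break
        if z > d:
            transmitted += 1; break
        if rng.random() < mu_a / mu_t:                 # absorption event
            absorbed += 1; break
        uz = 2 * rng.random() - 1                      # isotropic: new cos(theta)
```

Tallying where photons end up (reflected, transmitted, absorbed) as a function of wavelength-dependent coefficients is what produces the modeled diffuse spectral remittance.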
Handbook of Markov chain Monte Carlo
Brooks, Steve
2011-01-01
""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.
Accelerated Monte Carlo by Embedded Cluster Dynamics
Brower, R. C.; Gross, N. A.; Moriarty, K. J. M.
1991-07-01
We present an overview of the new methods for embedding Ising spins in continuous fields to achieve accelerated cluster Monte Carlo algorithms. The methods of Brower and Tamayo and of Wolff are summarized, and variations are suggested for the O(N) models based on multiple embedded Z2 spin components and/or correlated projections. Topological features are discussed for the XY model and numerical simulations are presented for d=2, d=3 and mean-field theory lattices.
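The building block being generalized here is the single-cluster (Wolff) update for the Ising model: grow a cluster of aligned spins through bonds activated with probability 1 − exp(−2β), then flip the whole cluster rejection-free. The sketch below uses illustrative parameters:

```python
import numpy as np

# Single-cluster Wolff update for the 2-D Ising model (J = 1, periodic
# boundaries). Embedding methods apply this same move to Z2 spins
# embedded in continuous O(N) fields. Parameters are illustrative.
rng = np.random.default_rng(7)
L, beta = 16, 0.6                 # below T_c: ordered phase, large clusters
spin = rng.choice([-1, 1], size=(L, L))
p_add = 1 - np.exp(-2 * beta)     # standard bond-activation probability

def wolff_update(spin):
    i, j = rng.integers(L, size=2)
    seed = spin[i, j]
    stack, cluster = [(i, j)], {(i, j)}
    while stack:                  # grow the cluster through activated bonds
        a, b = stack.pop()
        for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nbr = ((a + da) % L, (b + db) % L)
            if nbr not in cluster and spin[nbr] == seed and rng.random() < p_add:
                cluster.add(nbr)
                stack.append(nbr)
    for site in cluster:          # flip the whole cluster; no accept/reject
        spin[site] = -seed
    return len(cluster)

sizes = [wolff_update(spin) for _ in range(500)]
mean_size = float(np.mean(sizes))   # large in the ordered phase
```

Flipping clusters whose size tracks the correlation length is what defeats the critical slowing down that single-spin updates suffer near the transition.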
A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision
Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.
1998-01-01
We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters which are model inputs. The uncertainty represents a lower bound to the total model uncertainty, assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin hypercube sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the 46% calculated 1σ model uncertainty, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle. These discrepancies unambiguously indicate model formulation
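Latin hypercube sampling, used above to build the 419 input parameter sets, stratifies each parameter into n equal-probability bins and randomly permutes the bin assignments so that every one-dimensional projection of the design covers all n strata exactly once. A minimal sketch (on the unit hypercube; mapping to physical parameter distributions is a separate step):

```python
import numpy as np

# Minimal Latin hypercube sampler on [0, 1)^d: each column of the design
# visits every one of the n equal-width bins exactly once.
rng = np.random.default_rng(8)

def latin_hypercube(n, d):
    u = rng.random((n, d))                              # jitter within each bin
    perms = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (perms + u) / n                              # samples in [0, 1)^d

pts = latin_hypercube(419, 5)
counts = np.floor(pts * 419).astype(int)   # bin index hit by each sample
```

Compared with plain random sampling, this stratification gives much better coverage of each parameter's range for the same number of model runs, which matters when each run is a 26-year model integration.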
An introduction to Monte Carlo methods
Walter, J.-C.; Barkema, G. T.
2015-01-01
Monte Carlo simulations are methods for simulating statistical systems. The aim is to generate a representative ensemble of configurations to access thermodynamical quantities without the need to solve the system analytically or to perform an exact enumeration. The main principles of Monte Carlo simulations are ergodicity and detailed balance. The Ising model is a lattice spin system with nearest-neighbor interactions that is appropriate to illustrate different examples of Monte Carlo simulations. It displays a second-order phase transition between disordered (high temperature) and ordered (low temperature) phases, leading to different strategies of simulations. The Metropolis algorithm and the Glauber dynamics are efficient at high temperature. Close to the critical temperature, where the spins display long-range correlations, cluster algorithms are more efficient. We introduce the rejection-free (or continuous-time) algorithm and describe in detail an interesting alternative representation of the Ising model using graphs instead of spins with the so-called Worm algorithm. We conclude with a discussion of important dynamical effects such as thermalization and correlation time.
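As a concrete illustration of the single-spin-flip Metropolis dynamics described above, here is a minimal sketch for the 2D Ising model; the lattice size, inverse temperature, and sweep count are arbitrary choices for the example.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.3, sweeps=200, seed=1):
    """Metropolis sampling of the 2D Ising model (J = 1, no field)."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours (periodic boundaries).
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb        # energy cost of flipping spin (i, j)
        # Metropolis rule: accept downhill moves always, uphill moves
        # with probability exp(-beta * dE); this satisfies detailed balance.
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

m = abs(metropolis_ising().mean())       # magnetisation per spin
```

At beta = 0.3, above the critical temperature (beta_c ≈ 0.44), the magnetisation fluctuates around zero; lowering the temperature drives the system toward the ordered phase, where cluster algorithms become the better choice.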
The Robust Running Ape: Unraveling the Deep Underpinnings of Coordinated Human Running Proficiency
Directory of Open Access Journals (Sweden)
John Kiely
2017-06-01
Full Text Available In comparison to other mammals, humans are not especially strong, swift or supple. Nevertheless, despite these apparent physical limitations, we are among Nature's most superbly well-adapted endurance runners. Paradoxically, notwithstanding this evolutionarily bestowed proficiency, running-related injuries, and overuse syndromes in particular, are widely pervasive. The term 'coordination' is similarly ubiquitous within contemporary coaching, conditioning, and rehabilitation cultures. Various theoretical models of coordination exist within the academic literature. However, the specific neural and biological underpinnings of 'running coordination,' and the nature of their integration, remain poorly elaborated. Conventionally, running is considered a mundane, readily mastered coordination skill. This illusion of coordinative simplicity, however, is founded upon a platform of immense neural and biological complexity. This extensive complexity presents extreme organizational difficulties yet, simultaneously, provides a multiplicity of viable pathways through which the computational and mechanical burden of running can be proficiently dispersed amongst expanded networks of conditioned neural and peripheral tissue collaborators. Learning to adequately harness this available complexity, however, is a slowly emerging, practice-driven process, greatly facilitated by innate evolutionary organizing principles that serve to constrain otherwise overwhelming complexity to manageable proportions. As we accumulate running experience, persistent plastic remodeling customizes networked neural connectivity and biological tissue properties to best fit our unique neural and architectural idiosyncrasies and personal histories: thus neural and peripheral tissue plasticity embeds coordination habits. When, however, coordinative processes are compromised, under the integrated influence of fatigue and/or accumulative cycles of injury, overuse
Energy Technology Data Exchange (ETDEWEB)
Seitz, M.G.
1982-01-01
Reviewed in this statement are methods of preparing solutions to be used in laboratory experiments to examine technical issues related to the safe disposal of nuclear waste from power generation. Each approach currently used to prepare solutions has advantages and any one approach may be preferred over the others in particular situations, depending upon the goals of the experimental program. These advantages are highlighted herein for three approaches to solution preparation that are currently used most in studies of nuclear waste disposal. Discussion of the disadvantages of each approach is presented to help a user select a preparation method for his particular studies. Also presented in this statement are general observations regarding solution preparation. These observations are used as examples of the types of concerns that need to be addressed regarding solution preparation. As shown by these examples, prior to experimentation or chemical analyses, laboratory techniques based on scientific knowledge of solutions can be applied to solutions, often resulting in great improvement in the usefulness of results.
Las obras de acondicionamiento del Salto del Mont-Cenis
Directory of Open Access Journals (Sweden)
Vié, Georges
1969-06-01
Full Text Available The new dam at Mont Cenis has increased the water volume in the reservoir from 32 to 320 million m³, providing a potential energy of 650,000 million Mp × m. This article describes the work involved in the construction of this project. Among other features, the captured water has been led to the reservoir along a network of galleries of varying cross-section and 28 km total length. The pressure conduit, 18 km long, has been designed for a flow rate of 51 m³/s, its diameter varying between 4 and 5 m. The new power stations at Villarodin are fed by a pipe 3.6 km in length and 3 m in diameter. This conduit runs along an inclined trench and is isolated from the ground because of the gypsum and anhydrites contained in the soil. Its weight is 10,000 Mp, a European record. The power station comprises two vertical Pelton turbine units of 195 MVA each, with a yearly output of 485 million kWh.
A Statistical Perspective on Running with Prosthetic Lower-Limbs: An Advantage or Disadvantage?
Directory of Open Access Journals (Sweden)
Hossein Hassani
2014-11-01
Full Text Available Technological developments have led to the increased use of carbon fiber and prosthetic lower-limbs in running events at the Paralympic Games. This study aims to exploit a series of statistical techniques in order to address the vital question of whether utilizing prosthetic feet can affect an athlete's ability when running competitively at the Paralympic Games, by comparing both within and between different classifications. The study also considers the differences between running on biological limbs and on prosthetic lower-limbs from a mechanical point of view. The results from the men's 100 m, 200 m and 400 m at the 2012 London Paralympic Games have been the source of this investigation. The investigation provides statistical evidence to propose that the number of prosthetic limbs used and the structure of such limbs have a significant impact on the outcome of track events at the Paralympic Games.
The Effects of Backwards Running Training on Forward Running Economy in Trained Males.
Ordway, Jason D; Laubach, Lloyd L; Vanderburgh, Paul M; Jackson, Kurt J
2016-03-01
Backwards running (BR) results in greater cardiopulmonary response and muscle activity compared with forward running (FR). BR has traditionally been used in rehabilitation for disorders such as stroke and lower-extremity injuries, as well as in short bursts during various athletic events. The aim of this study was to measure the effects of sustained backwards running training on forward running economy in trained male athletes. Eight highly trained male runners (26.13 ± 6.11 years, 174.7 ± 6.4 cm, 68.4 ± 9.24 kg, 8.61 ± 3.21% body fat, 71.40 ± 7.31 ml·kg⁻¹·min⁻¹) trained with BR while harnessed on a treadmill at 161 m·min⁻¹ for 5 weeks, following a 5-week BR run-in period at a lower speed (134 m·min⁻¹). Subjects were tested at baseline, post-familiarization, and post-BR training for body composition, a ramped VO₂max test, and an economy test designed for trained male runners. Subjects improved forward running economy by 2.54% (1.19 ± 1.26 ml·kg⁻¹·min⁻¹, p = 0.032) at 215 m·min⁻¹. VO₂max, body mass, lean mass, fat mass, and % body fat did not change (p > 0.05). Five weeks of BR training improved FR economy in healthy, trained male runners without altering VO₂max or body composition. The improvements observed in this study suggest BR could be a beneficial form of training for an already economical population seeking to improve running economy.
Moore, Isabel S
2016-06-01
Running economy (RE) has a strong relationship with running performance, and modifiable running biomechanics are a determining factor of RE. The purposes of this review were to (1) examine the intrinsic and extrinsic modifiable biomechanical factors affecting RE; (2) assess training-induced changes in RE and running biomechanics; (3) evaluate whether an economical running technique can be recommended and; (4) discuss potential areas for future research. Based on current evidence, the intrinsic factors that appeared beneficial for RE were using a preferred stride length range, which allows for stride length deviations up to 3 % shorter than preferred stride length; lower vertical oscillation; greater leg stiffness; low lower limb moment of inertia; less leg extension at toe-off; larger stride angles; alignment of the ground reaction force and leg axis during propulsion; maintaining arm swing; low thigh antagonist-agonist muscular coactivation; and low activation of lower limb muscles during propulsion. Extrinsic factors associated with a better RE were a firm, compliant shoe-surface interaction and being barefoot or wearing lightweight shoes. Several other modifiable biomechanical factors presented inconsistent relationships with RE. Running biomechanics during ground contact appeared to play an important role, specifically those during propulsion. Therefore, this phase has the strongest direct links with RE. Recurring methodological problems exist within the literature, such as cross-comparisons, assessing variables in isolation, and acute to short-term interventions. Therefore, recommending a general economical running technique should be approached with caution. Future work should focus on interdisciplinary longitudinal investigations combining RE, kinematics, kinetics, and neuromuscular and anatomical aspects, as well as applying a synergistic approach to understanding the role of kinetics.
Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method
2002-01-01
This report condenses basic theories and advanced applications of neutron/gamma-ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat the historical progress of Monte Carlo methods, general issues of variance reduction techniques, and cross-section libraries used in continuous-energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of a fast critical assembly, core analyses of JMTR, simulation of a pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, and neutron/gamma-ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of the MCNP and MVP codes, and parallel processing of Monte Carlo calculations, respectively. Important references are attached at the end of this report.
Belo Monte hydropower project: actual studies; AHE Belo Monte: os estudos atuais
Energy Technology Data Exchange (ETDEWEB)
Figueira Netto, Carlos Alberto de Moya [CNEC Engenharia S.A., Sao Paulo, SP (Brazil); Rezende, Paulo Fernando Vieira Souto [Centrais Eletricas Brasileiras S.A. (ELETROBRAS), Rio de Janeiro, RJ (Brazil)
2008-07-01
This article presents the evolution of the studies of the Belo Monte Hydro Power Project (HPP), from the initial inventory studies of the Xingu River in 1979 to the current studies concluding the Technical, Economic and Environmental Feasibility Studies of the Belo Monte Hydro Power Project, as authorized by the Brazilian National Congress. The current studies characterize the Belo Monte HPP with an installed capacity of 11,181.3 MW (20 units of 550 MW in the main powerhouse and 7 units of 25.9 MW in the additional powerhouse), connected to the Brazilian Interconnected Power Grid, allowing it to generate 4,796 average MW of firm energy without depending on any flow-rate regularization of the upstream Xingu River, while flooding only 441 km², of which approximately 200 km² correspond to the normal annual wet-season flooding of the Xingu River. (author)
Díez, A; Largo, J; Solana, J R
2006-08-21
Computer simulations have been performed for fluids with a van der Waals potential, that is, hard spheres with attractive inverse-power tails, to determine the equation of state and the excess energy. In addition, the first- and second-order perturbative contributions to the energy and the zero- and first-order perturbative contributions to the compressibility factor have been determined from Monte Carlo simulations performed on the reference hard-sphere system. The aim was to test the reliability of this "exact" perturbation theory. It has been found that the results obtained from the Monte Carlo perturbation theory for these two thermodynamic properties agree well with the direct Monte Carlo simulations. Moreover, results from the Barker-Henderson [J. Chem. Phys. 47, 2856 (1967)] perturbation theory are in good agreement with those from the exact perturbation theory.
Takahashi, F; Endo, A
2007-01-01
A system utilising radiation transport codes has been developed to derive accurate dose distributions in a human body for radiological accidents. A suitable model is essential for such a numerical analysis. Therefore, two tools were developed to set up a 'problem-dependent' input file, defining a radiation source and an exposed person, to simulate the radiation transport in an accident with the Monte Carlo calculation codes MCNP and MCNPX. For both tools, the necessary resources are defined interactively on an ordinary personal computer. The tools prepare the human body and source models described in the input file format of the employed Monte Carlo codes. The tools were validated for dose assessment by comparison with a past criticality accident and a hypothesized exposure.
Prevalence of Injury in Ultra Trail Running
Directory of Open Access Journals (Sweden)
Malliaropoulos Nikolaos
2015-06-01
Full Text Available Purpose. The purpose of the study was to find the rate of musculoskeletal injuries in ultra-trail runners, investigate the most sensitive anatomical areas, and discover associated predictive factors to aid in the effective prevention and rapid rehabilitation of trail running injuries. Methods. Forty ultra-trail runners responded to an epidemiological questionnaire. Results. At least one running injury was reported by 90% of the sample, with a total of 135 injuries reported (111 overuse injuries; 24 appeared during competition). Lower back pain was the most common source of injury (42.5%). Running in the mountains (p = 0.0004) and following a personalized training schedule (p = 0.0995) were found to be protective factors. Runners involved in physical labor sustained more injuries (p = 0.058). Higher-level runners are associated with more injuries than lower-level cohorts (p = 0.067), with symptoms most commonly arising in the lower back (p = 0.091), hip joint (p = 0.083), and the plantar surface of the foot (p = 0.054). Experienced runners (> 6 years) are at greater risk of developing injuries (p = 0.001), especially in the lower back (p = 0.012), tibia (p = 0.049), and the plantar surface of the foot (p = 0.028). Double training sessions could cause hip joint injury (p = 0.060). Conclusions. In order to avoid injury, it is recommended to train mostly on mountain trails and to follow a training program designed by professionals.
Blood glutathione status following distance running.
Dufaux, B; Heine, O; Kothe, A; Prinz, U; Rost, R
1997-02-01
In 12 moderately trained subjects reduced glutathione (GSH) and oxidized glutathione (GSSG) as well as thiobarbituric acid reactive substances (TBARS) were measured in the blood before and during the first two hours and first two days after a 2.5-h run. The participants covered between 19 and 26 km (20.8 ± 2.5 km, mean ± SD). The running speed was between 53 and 82% of the speed at which blood lactate concentration reached 4 mmol/L lactate (67.9 ± 8.2%, mean ± SD) assessed during a previously performed treadmill test. Blood samples were collected 1 h before, immediately before, immediately after, 1 and 2 h after, as well as 1 and 2 days after the run. Immediately after exercise GSH was significantly decreased (p < 0.01) and GSSG significantly increased (p < 0.01). In all subjects the ratio of GSH to GSSG showed a marked decline to 18 ± 4% (mean ± SD) of the pre-exercise values (p < 0.01). One hour later the mean GSH and GSSG values returned to baseline. However, there were considerable inter-individual differences. In some subjects the GSH/GSSG ratio overshot the pre-exercise levels, in others the ratio remained low even two hours after exercise. Compared with the pre-exercise values TBARS concentrations did not change significantly at any time point after exercise. The findings suggest that after prolonged exercise in moderately trained subjects a critical shift in the blood glutathione redox status may be reached. The changes observed were generally short-lived, the duration of which may have depended on the relative importance of reactive oxygen species generation by the capillary endothelial cells and neutrophil and eosinophil granulocytes after the end of exercise.
Towards a measurement of the spectral runnings
Muñoz, Julian B.; Kovetz, Ely D.; Raccanelli, Alvise; Kamionkowski, Marc; Silk, Joseph
2017-05-01
Single-field slow-roll inflation predicts a nearly scale-free power spectrum of perturbations, as observed at the scales accessible to current cosmological experiments. This spectrum is slightly red, showing a tilt (1 − n_s) ≈ 0.04. Direct consequences of this tilt are the nonvanishing runnings α_s = dn_s/d log k and β_s = dα_s/d log k, which in the minimal inflationary scenario should reach absolute values of 10⁻³ and 10⁻⁵, respectively. In this work we calculate how well future surveys can measure these two runnings. We consider a Stage-4 (S4) CMB experiment and show that it will be able to detect significant deviations from the inflationary prediction for α_s, although not for β_s. Adding to the S4 CMB experiment the information from a WFIRST-like or a DESI-like survey improves the sensitivity to the runnings by ~20% and ~30%, respectively. A spectroscopic survey with a billion objects, such as the SKA, will add enough information to the S4 measurements to allow a detection of α_s = 10⁻³, required to probe the single-field slow-roll inflationary paradigm. We show that only a very futuristic interferometer targeting the dark ages will be capable of measuring the minimal inflationary prediction for β_s. The results of other probes, such as a stochastic background of gravitational waves observable by LIGO, the Ly-α forest, and spectral distortions, are shown for comparison. Finally, we study the claims that large values of β_s, if extrapolated to the smallest scales, can produce primordial black holes of tens of solar masses, which we show to be easily testable by the S4 CMB experiment.
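The definitions of the runnings quoted above can be checked symbolically. This is a minimal sketch using the standard Taylor expansion of the spectrum in ln(k/k_pivot); the fiducial values 0.96, 10⁻³ and 10⁻⁵ are the minimal-scenario numbers quoted in the abstract, and the amplitude is dropped since it does not enter the derivatives.

```python
import sympy as sp

# x = ln(k / k_pivot); expand ln P(k) to third order in x, so that
# n_s - 1 = d ln P / d log k, alpha_s = d n_s / d log k,
# beta_s = d alpha_s / d log k recover the expansion coefficients.
x = sp.symbols('x')
ns0 = sp.Rational(96, 100)       # fiducial tilt, 1 - n_s ~ 0.04
a0 = sp.Rational(1, 1000)        # alpha_s = 10^-3
b0 = sp.Rational(1, 100000)      # beta_s  = 10^-5
lnP = (ns0 - 1) * x + a0 * x**2 / 2 + b0 * x**3 / 6

ns = sp.diff(lnP, x) + 1         # scale-dependent tilt n_s(k)
alpha = sp.diff(ns, x)           # running alpha_s(k)
beta = sp.diff(alpha, x)         # running of the running beta_s

assert ns.subs(x, 0) == ns0
assert alpha.subs(x, 0) == a0
assert beta == b0                # constant at this order of the expansion
```

The assertions confirm that, at the pivot scale, each successive logarithmic derivative of the spectrum returns the next expansion coefficient, which is exactly the hierarchy the quoted surveys aim to measure.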
Contribution of trunk muscularity on sprint run.
Kubo, T; Hoshikawa, Y; Muramatsu, M; Iida, T; Komori, S; Shibukawa, K; Kanehisa, H
2011-03-01
This study aimed to investigate how trunk muscularity is related to sprint running performance. In 23 youth soccer players, cross-sectional images at the mid-level of each of L1-L2, L2-L3, L3-L4, L4-L5, and L5-S1 were obtained using magnetic resonance imaging to determine the cross-sectional areas (CSAs) of the rectus abdominis, oblique, psoas major, quadratus lumborum and erector spinae muscles. The times taken to sprint over 20 m were measured, and the mean velocity of running was calculated for each of the 2 distances (V(10 m) and V(20 m)) and for the distance from 10 m to 20 m (V(10-20 m)). The CSA values of the 5 slice levels for all muscles except the quadratus lumborum, and of the 3 slice levels (L1-L2, L2-L3 and L3-L4) for the quadratus lumborum, were averaged and expressed relative to the two-thirds power of body mass (CSA/BM^(2/3)). The CSA/BM^(2/3) values of the erector spinae and quadratus lumborum were selected as significant contributors to predict V(10 m) (R² = 0.450), V(20 m) (R² = 0.504) and V(10-20 m) (R² = 0.420). The current results indicate that the muscularity of the erector spinae and quadratus lumborum contributes to achieving high performance in sprint running over distances of less than 20 m.
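The CSA/BM^(2/3) normalisation above is a simple allometric scaling (area scales with the two-thirds power of mass). A quick sketch with made-up numbers, not values from the paper:

```python
# Illustrative normalisation of a muscle cross-sectional area (CSA)
# by body mass to the two-thirds power; both inputs are hypothetical.
csa_cm2 = 18.5        # mean erector spinae CSA in cm^2 (assumed)
body_mass_kg = 65.0   # player body mass in kg (assumed)

# Dividing by BM^(2/3) removes the expected body-size dependence,
# so players of different mass can be compared on muscularity alone.
csa_norm = csa_cm2 / body_mass_kg ** (2.0 / 3.0)
print(round(csa_norm, 3))
```

Values such as `csa_norm` for each muscle group are what enter the regression against the sprint velocities V(10 m), V(20 m), and V(10-20 m).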
Running on Empty? The Compensatory Reserve Index
2013-12-01
Moulton, Steven L.; Mulligan, Jane; Grudic, Greg Z.; Convertino, Victor A.
The Running Barbed Tie-over Dressing
Directory of Open Access Journals (Sweden)
Cormac W. Joyce, MB, BCh, MRCSI
2014-04-01
Full Text Available Summary: Barbed suture technology is becoming increasingly popular in plastic surgery and is now being used in body contouring surgery and facial rejuvenation. We describe the novel application of a barbed suture as a running tie-over dressing for skin grafts. The barbs act as anchors in the skin, so constant tensioning of the suture is not required. The bidirectional nature of the suture prevents any slippage, and the barbs even act as a grip on the underlying wool dressing. Furthermore, the method described is both quick and simple to learn and would be useful for the sole operator.
Giordano, Ferdinando
2016-01-01
After a period of maintenance, the LHC was restarted in 2015, delivering p-p collisions at a new center-of-mass energy of 13 TeV. This new achievement by the machine opened the phase space of many searches for physics beyond the standard model (BSM). In this talk a summary of the LHC searches for supersymmetry (SUSY) pursued by the ATLAS and CMS collaborations is presented, covering a broad range of models and scenarios. Even at this early stage, the new searches greatly extend the reach of the previous Run 1 analyses, limiting the phase space for natural SUSY to exist.
LHCb: The LHCb Silicon Tracker: Running experience
Saornil Gamarra, S
2012-01-01
The LHCb Silicon Tracker is part of the main tracking system of the LHCb detector at the LHC. It measures very precisely the particle trajectories coming from the interaction point in the region of high occupancies around the beam axis. After presenting our production and commissioning issues at TWEPP 2008, we report on our running experience. Focusing on electronic and hardware issues as well as operation and maintenance adversities, we describe the lessons learned and the pitfalls encountered after three years of successful operation.
Analysis of Biomechanical Factors in Bend Running
Directory of Open Access Journals (Sweden)
Bing Zhang
2013-03-01
Full Text Available Sprint running demands a comprehensive combination of technique and tactics under varying conditions. However, whether the allocation of lanes is fair to short-distance athletes running in different lanes has been a hot topic. This study analyzes the forces involved, the differences between lanes, and the influence of the bend from the perspective of sports biomechanics. The results indicate that the inner lanes have many disadvantages, the middle lanes are the best, and the outer lanes are inferior to the middle ones. This provides references for the training of short-distance events in biomechanics, psychology, etc.
DeJager, Nathan R.
2017-01-01
The data are input data files to run the forest simulation model Landis-II for Isle Royale National Park. Files include: a) Initial_Comm, which includes the location of each mapcode, b) Cohort_ages, which includes the ages for each tree species-cohort within each mapcode, c) Ecoregions, which consist of different regions of soils and climate, d) Ecoregion_codes, which define the ecoregions, and e) Species_Params, which link the potential establishment and growth rates for each species with each ecoregion.
ATLAS Run-2 status and performance
Pastore, Francesca; The ATLAS collaboration
2015-01-01
During the 2013/2014 shutdown of the LHC, the ATLAS detector was improved. A new silicon pixel detector layer has been installed, and the muon detector coverage has been improved substantially. In addition, nearly all other parts of the detector have been revised to adapt them to the higher pileup conditions or to make them more robust in general. This talk describes these improvements and how they affect the performance of physics objects. Initial results showing the detector performance as obtained from cosmic runs and/or initial beam data will also be shown.
Energy Technology Data Exchange (ETDEWEB)
Serikov, A.; Fischer, U.; Grosse, D.; Leichtle, D.; Majerle, M., E-mail: arkady.serikov@kit.edu [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany)
2011-07-01
The Monte Carlo (MC) method is the most suitable computational technique of radiation transport for shielding applications in fusion neutronics. This paper shares the results of the long-term experience of the fusion neutronics group at Karlsruhe Institute of Technology (KIT) in radiation shielding calculations with the MCNP5 code for the ITER fusion reactor, with emphasis on the use of several ITER project-driven computer programs developed at KIT. Two of them, McCad and R2S, have proved the most useful in radiation shielding analyses. The McCad computer graphical tool performs automatic conversion of MCNP models from the underlying CAD (CATIA) data files, while the R2S activation interface couples the MCNP radiation transport with FISPACT activation, allowing estimation of nuclear responses such as dose rate and nuclear heating after ITER reactor shutdown. The cell-based R2S scheme was applied in shutdown photon dose analysis for the design of the In-Vessel Viewing System (IVVS) and the Glow Discharge Cleaning (GDC) unit in ITER. The mesh-based R2S feature newly developed at KIT was successfully tested on shutdown dose rate calculations for the upper port in the Neutral Beam (NB) cell of ITER. The merits of the McCad graphical program have been broadly acknowledged by neutronics analysts, and its continuous improvement at KIT has made it stable and more convenient to run through its graphical user interface. Detailed 3D ITER neutronic modeling with the MCNP Monte Carlo method requires substantial computational resources, inevitably leading to parallel calculations on clusters. Performance assessments of MCNP5 parallel runs on the JUROPA/HPC-FF supercomputer cluster made it possible to find the optimal number of processors for ITER-type runs. (author)
CMS Collaboration
2016-01-01
Estimates of absorbed dose in the HCAL Endcap (HE) region as predicted by the FLUKA Monte Carlo code. Dose is calculated in an R-phi-Z grid overlaying the HE region, with a resolution of 1 cm in R, 1 mm in Z, and a single 360-degree bin in phi. This allows calculation of the absorbed dose within a single 4 mm thick scintillator layer without including other regions or materials. This note shows estimates of the cumulative dose in scintillator layers 1 and 7 during the 2012 run.